The Problem with “Check All That Apply” Survey Questions
The problem (so we’ve been told) with check-all-that-apply survey questions is that respondents do not process the full list of response options carefully and thoughtfully. Instead, they glance through, knowing they’re supposed to find at least a few that are relevant to them. They check the ones that jump out (often at the top of the list, or at the bottom), feel satisfied that they’ve done their job, and then move on to the next question.
Well, that is what we have always believed! A new article in the Journal of Survey Statistics and Methodology calls this traditional wisdom of survey research into question. The researcher (and author of the article) conducted a randomized laboratory experiment comparing check-all-that-apply (CATA) questions to a forced-choice (FC) format in which respondents had to say “yes” or “no” to each item in the list. Technology that tracks eye movements, head movements, and gaze was used to measure the cognitive effort and attention respondents invested in different parts of the survey questions.
Here is how the researcher summarized her findings:
Analyses of eye-tracking data showed that respondents invested more time and cognitive effort—measured by fixation time and fixation count—in the FC format than in the CATA format. However, a more detailed examination of cognitive engagement on the different elements of the question—such as the areas of the question text, the response options, and the answer boxes—allowed a more detailed investigation of processing. Neither for the area of the question text nor for the area of the pure response options were differences in processing time found. This indicates that there is no difference in reading or understanding the content of the question, nor in processing the list of response options.
Yet the researcher does find important measurement differences between the two formats. Consistent with previous research, “affirmative” responses are higher in yes/no (forced-choice) formats than in check-all-that-apply formats.
So what’s a market researcher to do? Should you avoid one format or the other? No. At Versta Research, we use both formats depending on the study. Each can be a good measurement technique for what we are trying to understand and report in our findings. Unsure which format is optimal for your specific needs? Give us a call and we’ll share our thoughts.
—Joe Hopper, Ph.D.