Dealing with Lazy Survey Respondents — Drop Them or Keep Them?
It might seem obvious that if your survey respondents are inattentive and not reading your survey questions, then you ought to delete them from your data. “Get rid of them!” we often say. “Give me replacements whom I can count on to focus on the questions and offer thoughtful answers!”
The problem, however, is that deleting inattentive respondents can introduce biases and undermine the validity of research findings. Why? Because inattentive respondents often represent a distinct category of people, and cutting them can make your sample less representative.
A 2022 research article in Public Opinion Quarterly, written by two political scientists, explored this conundrum with a unique study that measured survey inattentiveness and then compared what people say with what they actually do, using external voting records.
Here is a summary of what they found:
Respondents failing attention checks are more likely to misreport various factual information, although many of these inattentive respondents nonetheless provide responses in line with the information in the administrative records. An important variable that influences the impact of dropping inattentive respondents from analyses is whether or not attentiveness correlates with the construct of interest. For turnout histories in recent elections, which correlate with respondent attention, dropping inattentive respondents leads to an unrepresentative subsample and, hence, estimates with larger biases and variances. By contrast, for modes of voting in recent elections, which are largely uncorrelated with attention check passages, dropping inattentive respondents yields estimates with smaller biases that often outweigh the cost of larger variances.
What does this mean? It means that deleting inattentive respondents can sometimes make your data worse. And sometimes deleting them can make your data better. The problem, of course, is that you rarely know which.
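To see the intuition, here is a minimal simulation sketch in Python. Every rate in it — the share of attentive respondents, how often the inattentive misreport, and the underlying behavior rates — is invented for illustration and does not come from the study. It compares the bias of a simple survey estimate when inattentive respondents are kept versus dropped, first when the behavior being measured correlates with attentiveness and then when it does not.

import numpy as np

# Illustrative Monte Carlo sketch (not the study's method). All rates below
# -- attentiveness, misreporting, and the behavior being measured -- are
# invented assumptions for illustration.

rng = np.random.default_rng(42)
n = 2000        # respondents per simulated survey
n_sims = 1000   # number of simulated surveys

def simulate(outcome_correlated_with_attention):
    """Return (error if we keep everyone, error if we drop the inattentive)."""
    attentive = rng.random(n) < 0.8   # assume 80% pass attention checks
    if outcome_correlated_with_attention:
        # e.g., turnout: more common among attentive respondents
        true_behavior = rng.random(n) < np.where(attentive, 0.70, 0.40)
    else:
        # e.g., mode of voting: unrelated to attentiveness
        true_behavior = rng.random(n) < 0.60
    # Assume inattentive respondents misreport 30% of the time
    misreport = ~attentive & (rng.random(n) < 0.30)
    reported = np.where(misreport, ~true_behavior, true_behavior)
    truth = true_behavior.mean()
    return reported.mean() - truth, reported[attentive].mean() - truth

for correlated in (True, False):
    errors = np.array([simulate(correlated) for _ in range(n_sims)])
    keep_bias, drop_bias = errors.mean(axis=0)
    print(f"correlated with attention={correlated}: "
          f"bias keeping everyone={keep_bias:+.3f}, "
          f"bias after dropping={drop_bias:+.3f}")

With these invented rates, dropping the inattentive makes the estimate noticeably more biased when the behavior correlates with attention, and less biased when it does not — the same pattern the study describes. (The study also weighs variance, which dropping respondents always inflates by shrinking the sample; this sketch looks only at bias.)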
The takeaway for us at Versta Research is this: Remove inattentive respondents with an exceedingly light touch. It is normal and expected that people will misread questions, lose focus, get distracted, and answer inconsistently. Let your survey capture this normal level of inattentive error.
At the same time, however, aggressively remove fraudulent respondents. Fraudsters are mostly easy to spot with a careful, case-by-case review, and there is no upside to including fake answers in your data.
—Joe Hopper, Ph.D.
OTHER ARTICLES ON THIS TOPIC:
How to Trap Survey Trolls: Ask Them for a Story
How to Find and Eliminate Cheaters, Liars, and Trolls in Your Survey
Finding Fraud in Public Polls: Our AAPOR Presentation
Are They Cheating or Helping? New Research on Survey “Cheating” Raises Thorny Issue