Avoid This Bad Marketing Science Mistake
Just one month after we published our Quick Puzzle for Market Research Brains, which demonstrated the dangers of confirmation bias (SPOILER ALERT: if you have not tried the puzzle yourself, do it now before reading on!), I read a report from the National Science Foundation (NSF), released in May, that warned of the same thing. The report, Social, Behavioral, and Economic Sciences Perspectives on Robust and Reliable Science, describes exactly the phenomenon our Quick Puzzle demonstrates:
“Scientists may actively seek out and assign more weight to evidence that confirms their hypotheses and ignore or underweight evidence that could disconfirm their hypotheses. When the results of a study are not as expected, an investigator may be highly motivated to check over the data processing in search of accidental errors that can be corrected, whereas when expected results are obtained, such thorough scrutiny may be less likely, and errors may go undetected.” [bold added for emphasis]
How do you avoid being the sloppy scientist who looks mostly for confirming data? By adopting a skeptical approach and searching for disconfirming evidence everywhere. That includes looking for mistakes in the most boring place of all—data processing. For example:
- Check data labeling against the original questionnaires
- Check numeric coding for accuracy and for correct levels of measurement
- Check that missing data align exactly with intended skip patterns
- Check that null and missing values are accounted for correctly
- Check tabulations by comparing with descriptive stats from a second program
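Several of these checks can be automated rather than eyeballed. Below is a minimal sketch in Python, using only the standard library; the question IDs (`q1` as a Yes/No screener, `q2` as a 1–5 rating asked only when `q1` is "Yes") and the sample records are hypothetical, invented purely to illustrate a skip-pattern check, a coding-range check, and an independent recomputation of a statistic.

```python
# A minimal sketch of automated data-processing checks, assuming hypothetical
# question IDs: q1 is a Yes/No screener, q2 is a 1-5 rating that should be
# asked only when q1 == "Yes" (i.e., skipped, hence None, when q1 == "No").
from statistics import mean

records = [
    {"id": 1, "q1": "Yes", "q2": 4},
    {"id": 2, "q1": "No",  "q2": None},  # correctly skipped
    {"id": 3, "q1": "Yes", "q2": 7},     # out-of-range numeric code
    {"id": 4, "q1": "No",  "q2": 3},     # violates the intended skip pattern
]

def check_skip_pattern(rows):
    """Flag ids where q2 is answered despite q1 == 'No', or missing despite 'Yes'."""
    return [r["id"] for r in rows
            if (r["q1"] == "No") != (r["q2"] is None)]

def check_coding(rows, valid=range(1, 6)):
    """Flag ids whose non-missing q2 values fall outside the intended 1-5 codes."""
    return [r["id"] for r in rows
            if r["q2"] is not None and r["q2"] not in valid]

print("Skip-pattern violations:", check_skip_pattern(records))  # ids [4]
print("Out-of-range codes:", check_coding(records))             # ids [3]

# Cross-check a tabulated statistic with an independent computation,
# rather than trusting one program's output:
answered = [r["q2"] for r in records if r["q2"] is not None]
print("Mean of q2 (recomputed):", mean(answered))
```

The point of the last step is the NSF report's warning in miniature: recompute the same descriptive statistic through a second, independent path, and investigate any disagreement even when the first number looked plausible.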
Seems obvious, right? But many never do it. And, as the NSF report points out, you should be checking data and looking for mistakes even when everything seems right and conforms to your expectations. Make it your goal to find mistakes because “disconfirming evidence” is essential to finding insights instead of illusions.