Versta Research Newsletter

Dear Reader,

Recently we learned of a clever experiment that offers a fun way to test your analytical skills. I took the test without knowing what it was about (I got the right answer!) and in the weeks since, it has dramatically shaped my perspective on our work.

We have reproduced it here for you to try. I guarantee that A Quick Puzzle for Market Research Brains will niggle in your brain for weeks to come. And unless you’re already a seasoned philosopher of science, it will likely change the way you see research.

Other items of interest in this newsletter include our presentations at the 2015 LGBT Journalists Convention and the 2015 Corporate Researchers Conference, plus all the latest news about our research for Wells Fargo and Teva Pharmaceuticals.

As always, feel free to reach out with an inquiry or with questions you may have. We would be pleased to consult with you on your next research effort.

Happy autumn,

The Versta Team

A Quick Puzzle for Market Research Brains

An experiment published 55 years ago in The Quarterly Journal of Experimental Psychology proves, to my mind, that good researchers are a special breed. Here is the experiment. [1]

We have established a rule, and the three numbers below conform to it:

2, 4, 8 (Obeys the rule)

Your task is to discover the rule by entering additional sets of three numbers. After each entry, we will tell you whether your numbers conform to the rule. You may enter as many sets as you wish. When you think you know the rule, click Reveal the Answer.

You can test your numbers in the interactive version of this puzzle.

The Answer

Here’s the answer: The first number must be less than the second number, which must be less than the third number. In mathematical notation: a < b < c. How did you do?

Not many people get it right, because there are so many rules that can explain the sequence 2, 4, 8. Most people see a pattern, test it once or twice, see confirming evidence, and then state the rule. In 1960, when this experiment was first conducted, only one in five of the college students who participated correctly identified the rule.

But what differentiated those who got it right? They looked for disconfirming evidence. There are so many possible rules that you are unlikely to guess the correct one until you start eliminating some. To succeed you need to try and try again until you have several sets that violate the rule.
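
To make the elimination logic concrete, here is a minimal Python sketch of how disconfirming probes work. It is our own illustration, not part of the original experiment; the candidate rules are hypothetical guesses a solver might form after seeing 2, 4, 8.

```python
# Illustrative sketch: eliminating candidate rules for the 2, 4, 8 puzzle.

# The true (hidden) rule: the numbers must be strictly increasing.
true_rule = lambda a, b, c: a < b < c

# Candidate rules a solver might form after seeing 2, 4, 8.
candidates = {
    "each number doubles the last": lambda a, b, c: b == 2 * a and c == 2 * b,
    "all numbers are even":         lambda a, b, c: a % 2 == 0 and b % 2 == 0 and c % 2 == 0,
    "strictly increasing":          lambda a, b, c: a < b < c,
}

# Probes chosen to disconfirm, not confirm. A probe eliminates a candidate
# when the candidate's verdict disagrees with the true rule's verdict.
probes = [(2, 4, 8), (1, 2, 3), (2, 4, 7), (8, 4, 2)]

for probe in probes:
    answer = true_rule(*probe)
    eliminated = [name for name, rule in candidates.items()
                  if rule(*probe) != answer]
    print(probe, "Yes" if answer else "No", "-> eliminates:", eliminated or "nothing")
```

Note that the confirming probe (2, 4, 8) eliminates nothing, while (1, 2, 3) alone knocks out two wrong hypotheses. That is the whole trick.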

This is the essence of scientific thinking and it is crucial to the research process. We collect data, analyze it, look for patterns, and offer up explanations. But those explanations are likely to be wrong unless we have tested and eliminated alternative explanations.

The take-away from this exercise is that great researchers look for disconfirming evidence. Whether you arrived at the right answer in this exercise matters less than your thought process in getting to that answer. As the original authors of the experiment noted: “The point is not that most subjects failed to give the correct rule at their first announcement, but that they adopted a strategy which tended to preclude its attainment.”

So, if you did not test number sets until you had at least one set come up “No,” make this your mantra for the future: Look for disconfirming evidence.

How? When? Where? And what’s this got to do with the daily work of market research? Here are just three suggestions for places to seek disconfirming evidence that will make you a better researcher.

How to Find Disconfirming Evidence

1. Punish-test your questionnaires. You should never feel satisfied with one or two successful tests of a survey. No mistakes? Ha. Unlikely. Don’t believe it. For screening questions, test every combination of possible answers. The same goes for questions that drive branching logic. With numeric entry boxes, test for valid ranges and for logical dependencies. Test your back buttons; see whether data is replaced correctly when you move forward again. And use a Random Data Generator (RDG) to simulate a few thousand cases and surface patterns of responses you never thought of.

Ideally, you should subject your questionnaire to as much scrutiny and variability as it will encounter in real fieldwork. Then review your data with as much attention to detail as when you are cleaning and coding the final results. If others are testing for you, ask them to provide a full accounting of what they tested and which mistakes were corrected. If they can’t find disconfirming evidence before declaring that everything is right, ask them to keep testing.
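
To make the random-data idea concrete, here is a minimal Python sketch of the pattern. The toy screener, branch, and field names are hypothetical stand-ins, not any particular survey platform’s API; the point is simulating thousands of respondents and asserting the invariants your questionnaire logic is supposed to guarantee.

```python
import random

def simulate_respondent(rng):
    """Answer a toy survey the way a random data generator would."""
    record = {"age": rng.randint(10, 99)}
    record["qualified"] = record["age"] >= 18              # screener
    if record["qualified"]:
        record["owns_product"] = rng.choice([True, False])
        if record["owns_product"]:
            # Branch: satisfaction is asked only of owners, on a 1-10 scale.
            record["satisfaction"] = rng.randint(1, 10)
    return record

rng = random.Random(42)  # fixed seed so any failure is reproducible
for _ in range(5000):
    r = simulate_respondent(rng)
    # Invariants the real survey logic must enforce:
    assert r["qualified"] == (r["age"] >= 18), f"screener broken: {r}"
    if not r["qualified"]:
        assert "satisfaction" not in r, f"branch leaked past screener: {r}"
    if "satisfaction" in r:
        assert 1 <= r["satisfaction"] <= 10, f"value out of range: {r}"

print("5,000 simulated cases passed every logic check")
```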

2. Use control variables in your analysis. So you get to the final presentation and show that first-time attendees at the decennial fundraiser donated less money than repeat attendees. Your hypothesis? Repeat attendees are more committed, so they donate more. But wait: somebody asks about age. First-time attendees are probably younger than repeat attendees, and they probably earn less. Maybe that’s why first-time attendees donated less money.

Your hypothesis might be right, but it will be much stronger if you look for disconfirming evidence by controlling for age. Indeed, there are many alternative explanations for your findings. Good researchers brainstorm the alternatives and punish-test them against the data. If you can rule out the explanations that are wrong, your final answer is more likely to be right.
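
Here is a minimal sketch of that kind of check in Python with pandas and statsmodels. The file and column names (donors.csv, donation, repeat_attendee, age, income) are hypothetical stand-ins. If the repeat-attendee coefficient collapses toward zero once age and income enter the model, the “commitment” story is in trouble.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per attendee, with donation amount,
# a 0/1 repeat-attendee flag, age, and income.
df = pd.read_csv("donors.csv")

# Naive model: the pattern that shows up in the topline numbers.
naive = smf.ols("donation ~ repeat_attendee", data=df).fit()

# Punish-test the hypothesis: control for the rival explanations.
controlled = smf.ols("donation ~ repeat_attendee + age + income", data=df).fit()

print("Raw gap:          ", naive.params["repeat_attendee"])
print("Gap with controls:", controlled.params["repeat_attendee"])
# A coefficient that shrinks toward zero suggests age and income,
# not commitment, were driving the difference.
```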

3. Scrutinize your data collection and coding. Whenever empirical findings are super strong or super surprising, good researchers start with the more plausible assumption that they did something wrong. Maybe the sample was skewed, or the panel supplier messed up quotas. Maybe the questions were leading. Maybe the skip logic functioned incorrectly. Maybe the data were labeled and coded in reverse (it happens all the time!) so the findings are really the opposite.

Great researchers seek disconfirming evidence in all those places where humans (and the machines created by humans) make mistakes. So always review your programming. Confirm that on-screen images match how they are labeled. Review demographics to ensure they match your population. Cross-check tabulations and statistical modeling by running them in a different package (WinCross or R versus SPSS, for example). Punish-test the project, search for errors, and look for evidence that the surprise in your data is wrong.
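
As one small, hypothetical example of such a cross-check, the Python sketch below recomputes a key tabulation independently of the tab package and tests a direction-of-coding invariant: a satisfaction item that correlates negatively with a “would recommend” item almost certainly has reversed labels. The file and column names are illustrative stand-ins.

```python
import pandas as pd

# Hypothetical survey file with weight, satisfaction, and
# would_recommend columns; all names are illustrative stand-ins.
df = pd.read_csv("survey.csv")

# 1. Recompute a weighted mean independently, then compare it with
#    the number in the WinCross or SPSS table before believing either.
weighted_mean = (df["satisfaction"] * df["weight"]).sum() / df["weight"].sum()
print(f"Independent weighted mean satisfaction: {weighted_mean:.2f}")

# 2. Direction-of-coding check: satisfaction and likelihood to recommend
#    should move together. A strong negative correlation is a red flag
#    that one scale's labels were flipped during coding.
corr = df["satisfaction"].corr(df["would_recommend"])
if corr < 0:
    print(f"WARNING: correlation is {corr:.2f} -- check for reversed labels")
```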

A crucial tenet of scientific reasoning (and therefore of research!) is that no matter how much data and empirical evidence there is, one can never prove a theory to be true. One can, however, prove theories to be false. Therefore, a systematic, painstaking process of elimination must be as central to our work as theory building, strategizing, modeling, measuring, testing, and storytelling.

Coincidentally, as we drafted this newsletter for publication, the editor of the Journal of Marketing Research, a professor at The Wharton School of the University of Pennsylvania, prefaced the October 2015 issue of the journal with “A Field Guide to Publishing in an Era of Doubt.” His advice was this:

“You need to be able to show that you worked just as hard to try to find support for alternative theories as the one you favor.”

This is just as true for those of us sharing and presenting our research findings in corporate settings as it is in the academic world. Great researchers of every stripe seek disconfirming evidence at every stage of the research process. The ingenious little experiment we’ve replicated above is a potent reminder of that.

[1] Many thanks to the New York Times for calling our attention to this experiment and for the idea of simulating this experiment with a web-based interactive tool.

Stories from the Versta Blog

Here are several recent posts from the Versta Research Blog. Click on any headline to read more.

Gallup Gives Up as Phone Surveys Fail

Having failed miserably with its telephone-based election polling since 2010, Gallup will not conduct polling for the 2016 election. Yes, phone polling is dead.

Tricks for Getting Truthful Respondents

When money is involved and respondents may be tempted to lie, two recent studies show how to activate a truth-telling commitment among your survey respondents.

Here’s When Pie Charts Work

Usually we recommend avoiding pie charts. But in some cases they can be extremely effective and superior to other charts, as this example demonstrates.

Read Your Questionnaires Out Loud

A great way to improve your questionnaires is to read them out loud. It will help you find the right conversational tone that puts respondents at ease.

Finding the Story and Getting It Noticed

Here are two USA Today infographic snapshots from a survey we did with Wells Fargo, plus a preview of our upcoming session at the Corporate Researchers Conference.

Defending Your Online Samples in Court

If you ever need to defend online samples to doubting clients or colleagues, remind them that non-probability samples ARE accepted as scientific evidence in courtrooms.

Defending Your Statistics in Court

A good test of rigorous research is whether it can withstand scrutiny in a court of law. Here’s the statistical knowledge you need for your expertise to hold sway.

The Problem with Fancy Segmentation

Segmentation algorithms are amazing tools, but please beware of overemphasizing differences among segments if they are mostly similar in other important ways.

Get Rid of Those Survey Speeders

New research shows that online survey speeders add random noise to your data. But “randomly” bad data does not mean you shouldn’t clean them out. Get rid of them!

Gallup Pays $12M in Phone Survey Lawsuit

If you ever do phone surveys, take note: it is best to manually dial all phone numbers, or you risk the kind of class-action lawsuit that Gallup was just hit with.

Target Your Surveys with Google Stalker

You can now follow specific website visitors and target them for follow-up surveys using the same technology that Google uses to serve up targeted ads.

Surveys Stalk You Everywhere

Asking customers to evaluate every interaction with you is NOT customer centric, because it is irritating and focused entirely on your own need for “metrics.”

Versta Research in the News

Findings Presented at the 2015 LGBT Journalists Convention

Versta Research shared results from a recent survey of LGBT Americans about how marriage laws are changing the financial landscape for same-sex couples.

News Coverage for Survey on Money and Marriage

Versta Research’s survey for Wells Fargo in advance of the Supreme Court ruling on same-sex marriage was reported by CNBC, FA Magazine, Black Enterprise, and Curve.

Physicians and Pain Sufferers Open Up About Rx Abuse

Versta Research conducted a survey about prescription drug abuse for Teva Pharmaceuticals, the American Academy of Pain Management, and the U.S. Pain Foundation. The results are featured on the Pain Matters website with an infographic that highlights key findings.

Turning Data into Stories at the Corporate Researchers Conference

Versta Research teamed up with Wells Fargo to talk about new insights and a story-building methodology for research that helps drive business and win media coverage.

MORE VERSTA NEWSLETTERS