Versta Research Newsletter

Dear Reader,

The holy grail of market research (and academic research, too) is to bring together the unique strengths of both quantitative and qualitative methods. We rely on quantitative because numbers are such an efficient and elegant way to convey information. We crave qualitative because it offers depth, richness, and insight into the lived reality driving the numbers. In this newsletter we offer our perspective on how to bridge the quantitative-qualitative gap.

Other items of interest include a roundup of recent stories from the Versta Research Blog, which you will find below.

While you are here, also take a look at Versta’s recent publications, which run the gamut from tips about marketing and public relations to findings in an academic journal.

Need help putting numbers to your stories? Or telling a story with your data? Give us a call. We are happy to help.

Sincerely,

The Versta Team

Bridging the Quantitative-Qualitative Gap

“You may have heard the world is made up of atoms and molecules, but it’s really made up of stories. When you sit with an individual that’s been here, you can give quantitative data a qualitative overlay.”
– William Turner, 16th century British scientist and naturalist

In previous newsletters we have written about Turning Data into Stories, which achieves exactly what Turner describes. Research stories give quantitative data an overlay of meaning and context that are critical for research to be accurate, heard, and understood. If Turner were alive today, he might well say the same about qualitative data. Stories help synthesize data that might otherwise be an endless, meaningless transcript of what he said and then she said and then he said and so on. In our view, if you can then take that synthesis and quantify it, those stories will be powerful and compelling indeed.

This is the holy grail of research, whether academic or market research: to bring together the unique strengths of both quantitative and qualitative methods. It does not seem as though these two methods should (or need to) live in different worlds, and yet for most researchers they do. For clients on the business side (though not necessarily on the research side), this is a source of frustration: they fund both kinds of research and get valuable insights from each, but neither type draws upon the strengths of the other.

Achieving the Strengths of Each

What are those strengths? On one side, quantitative research helps us summarize information and trends in simple and compelling ways; it allows us to model relationships among many variables that are difficult to “see” without numeric models; and it offers amazingly accurate estimates of whole populations using sampling and statistical methods.
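
To put that last strength in concrete terms, here is a small illustrative sketch (in Python, with made-up sample sizes) of the textbook 95% margin-of-error calculation behind those population estimates. It is meant only to show why a well-drawn sample of about a thousand people can pin down a population figure to within roughly three percentage points.

    import math

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        """95% margin of error for a proportion p estimated from a simple random sample of size n."""
        return z * math.sqrt(p * (1 - p) / n)

    # Illustrative sample sizes: even a modest sample pins down a population estimate fairly tightly.
    for n in (100, 400, 1000):
        moe = margin_of_error(p=0.5, n=n)
        print(f"n = {n:>4}: 50% +/- {moe:.1%}")
    # n =  100: 50% +/- 9.8%
    # n =  400: 50% +/- 4.9%
    # n = 1000: 50% +/- 3.1%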

On the other side, qualitative research gives us deep insight into how and why people think, feel, and behave as they do. It helps us understand meanings and motivations, and does a much better job than quantitative research in fleshing out the context, history, and implications of what we are trying to understand.

How do you bring the two together? In our view, you bring them together by investing in people who are trained in both methods and who have deep experience in both. Moreover, it is not enough for these people to be merely familiar with both quantitative and qualitative approaches. They need to have done both types of research projects themselves: engaging in the work, succeeding at it, and communicating the results to clients.

On the qualitative side this means having done work involving deep interaction with people. Good social and market researchers need to develop a keen sense of what people are saying, feeling, thinking, and meaning when they give researchers “data.” They need to understand how everyday activities might be reflected in larger patterns of quantitative data. They need to see how aggregate numbers emerge from individuals in the world. These are skills honed through ethnographic work, in-depth interviews, and focus groups, and they call for empathy, active listening, curiosity, and probing.

On the quantitative side it means having done work in inferential statistics, numeric modeling, and hypothesis testing. Good social and market researchers need to become so familiar with numbers that they can scan data sets, frequencies, and tables and sniff out what doesn’t make sense. They need to understand the math behind the methods to appreciate when statistical procedures are variations on each other, which procedure is best, and how to engineer custom approaches when the data require it. They need to be “numbers people” who quickly develop a “feel” for their data, as surely as they are “people people” who develop a feel for culture.
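
As a hypothetical illustration of what sniffing out what doesn’t make sense can look like in practice, the short sketch below (our own invented records, on an assumed 1-to-5 rating scale) flags respondents whose answers fall outside the scale or who give the same answer to every question.

    # Hypothetical sanity check over raw survey records: flag out-of-range ratings
    # and "straight-liners" who give identical answers to every item.
    # The record layout and the 1-5 scale are illustrative assumptions.
    records = [
        {"id": 101, "ratings": [4, 5, 4, 3, 4]},
        {"id": 102, "ratings": [3, 3, 3, 3, 3]},   # straight-liner
        {"id": 103, "ratings": [5, 9, 4, 2, 1]},   # 9 falls outside the 1-5 scale
    ]

    def flag_suspect(record, low=1, high=5):
        issues = []
        if any(not (low <= r <= high) for r in record["ratings"]):
            issues.append("rating outside scale")
        if len(set(record["ratings"])) == 1:
            issues.append("identical answer to every item")
        return issues

    for rec in records:
        problems = flag_suspect(rec)
        if problems:
            print(f"respondent {rec['id']}: {', '.join(problems)}")
    # respondent 102: identical answer to every item
    # respondent 103: rating outside scale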

The False Promise of Technology

What about technology? Can new survey and analysis tools take the complicated math out of quantitative research? Can text analysis and media tools that analyze transcripts and tweets give us the richness of what real people are saying? Sort of. They surely make our work faster and easier. But behind these tools lies the same complexity of numbers and meaning, and the same messiness of reality that requires thoughtful expertise to make sense of. Technology cannot tell you how the numbers fit together into a meaningful story, or suggest clever ways to re-code data to examine an odd hypothesis, or assess whether your findings might reflect choices in survey design, randomization protocols, or errors compounded by weighting.
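
To give one concrete example of the judgment involved: weighting a sample to match the population also inflates the variance of its estimates, and Kish’s well-known approximation of the design effect shows by how much. The sketch below uses made-up weights purely for illustration; no tool will volunteer this trade-off for you.

    def kish_design_effect(weights):
        """Kish's approximation: deff = n * sum(w^2) / (sum(w))^2."""
        n = len(weights)
        return n * sum(w * w for w in weights) / sum(weights) ** 2

    # Made-up weights: half the sample up-weighted 3x to correct a skewed sample.
    weights = [1.0] * 500 + [3.0] * 500
    deff = kish_design_effect(weights)
    print(f"design effect: {deff:.2f}")                           # 1.25
    print(f"effective sample size: {1000 / deff:.0f} of 1000")    # 800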

Indeed, eleven years ago when working in the world of academic research, we tackled the quantitative-qualitative gap in one area of sociological research with a grant from the National Institutes of Health. NIH funded this work as a three-year professional training grant, not as a technology development grant or data collection grant. The missing piece was people who were fluent in both methods, who could live on either side of the divide, and who could bring quantitative insights into the everyday world and vice versa.

Some Practical Suggestions

Of course the people who bring together both quantitative and qualitative methods use a number of techniques to build multiple layers of data and insight into their work. Some are classic and some are newer. We will continue to offer ideas, reviews, and examples on these pages, but consider just a few of them:

  • Include good open-ended questions in surveys, with skillful wording and probing. Go beyond asking, “Why do you rate it very good?”
  • Consider real-time interviewer probing during online surveys, similar to instant chat for help while browsing a website.
  • In addition to deep textual analysis, code and tabulate important dimensions of in-depth interviews or focus group transcripts. For example, describe how many participants expressed negative views or particular themes, and how extreme those views were (see the sketch after this list).
  • Include a short survey as part of your focus group or ethnography, or do a short survey as a follow-up to quantify and measure important themes.
  • Consider a custom MROC (Market Research Online Community) which provides an ongoing opportunity to get rich, open-ended textual feedback and survey data from an engaged and active audience.
  • As part of your survey effort, build in a dozen in-depth interviews before the survey is designed or after the data are analyzed.
  • Consider taking an opposite approach to your initial idea – if you’re thinking focus groups, how about a conjoint study? We did this not long ago and management was delighted with the insight they got (a mathematical model of what people want, as part of a qualitative story they already knew).
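
As promised above, here is a minimal sketch of the coding-and-tabulating suggestion, with invented theme codes and participant IDs. Real qualitative coding is far more careful, but the tallying step can be this simple.

    from collections import Counter

    # Hypothetical theme codes assigned to each focus-group participant during analysis.
    coded_transcripts = {
        "P01": {"price_concern", "negative_service"},
        "P02": {"price_concern"},
        "P03": {"negative_service", "brand_loyalty"},
        "P04": {"brand_loyalty"},
    }

    # Tabulate: how many participants expressed each theme at least once.
    theme_counts = Counter(theme for themes in coded_transcripts.values() for theme in themes)
    for theme, count in theme_counts.most_common():
        share = count / len(coded_transcripts)
        print(f"{theme}: {count} of {len(coded_transcripts)} participants ({share:.0%})")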

We rely on quantitative because numbers are such an efficient and elegant way to convey information. We crave qualitative because it offers depth, richness, and insight into the lived reality driving the numbers. In a fragmented world of specialization and niche marketing, finding people with superior skills in both methods seems like a tall order, but it goes to the core of what good research means. Research is about asking questions, finding information and data, thinking about it, probing deeper, assimilating and synthesizing it, and then communicating answers to people who care. The key is to find smart people who love to immerse themselves in the process and who love math and the humanities and the social sciences, and who love helping people like you answer questions.

Stories from the Versta Blog

Here are several recent posts from the Versta Research Blog. Click on any headline to read more.

Cross Cultural Survey Guidelines

International market research requires attention to best practices for cross-cultural research design and analysis. New guidelines are available.

Can Tweeting Replace Polling?

Scientists have shown that measuring social media can sometimes align with data from consumer polls, suggesting new opportunities for market research and polling.

Trouble for Phone Surveys: Nobody Talks

Phone-based surveys face two challenges today: (1) The profusion of cell-only households and (2) The decline in people using phones to talk.

Click Here for Actionable Insights!

Applications like Survey Monkey or Zoomerang often confuse the tools of research and polling with the interpretation and outcomes of research.

Of Lust and Tracking Studies

A tracking study should not be a routine effort to deliver data and charts, but a test of whether a research firm can really add value and insight.

Writing Successful Omnibus Survey Questions

Writing omnibus survey questions can be tricky because you have just a few questions to get your nuggets of data. Here are four tips to help.

More Research on Phone vs. Online Surveys

A scientific study of computer vs. human interviewing shows better quality data in three key areas for self-administered computer surveys.

Don’t Do Research in Your Sleep

A colleague in market research once complained to me that he was bored by client satisfaction and loyalty research and he could do it in his sleep…

How Long Should a Survey Be?

Research consistently shows that surveys should be kept under 20 minutes, and that longer surveys lead to lower quality data.

Five Research Design Tips

Good research depends on excellent research design. Here are five design tips for putting together a superb market research project.

Practical Statistics vs. Theoretical Statistics

Are the statistics from online panels projectible? AAPOR says no, because there is no theory to explain how it works. But in practice, estimates from online panels often predict population values.

How Good Are Online Survey Panels?

This is a summary of conclusions and recommendations from AAPOR’s March 2010 report about the quality and uses of online survey panels.

Two Keys to Writing Great Research Reports

Research reports should be short and to the point, but also richly nuanced to capture the complexity of reality. How do you do both?

Recently Published

Here’s a look at Versta’s recent publications, which run the gamut from tips about marketing and public relations to findings in an academic journal.

The End of Marketing Middlemen?

Versta outlines what it will take for marketers to survive recent shifts driven by the internet. This article was published in the Spring 2010 issue of Interface magazine, from the Chicago chapter of the American Marketing Association.

Patient Survey about Ulcerative Colitis Published

Results from national surveys of patients with a variety of chronic conditions including ulcerative colitis, rheumatoid arthritis, asthma, and migraine headaches were published in the April 2010 issue of Digestive Diseases and Sciences.

Designing Surveys for PR Stories

Versta’s how-to article for PR professionals was published in the March 2010 edition of Public Relations Tactics, a publication of PRSA.

MORE VERSTA NEWSLETTERS