One of the biggest debates in market research and opinion polling right now is about the quality of online survey panels. It is a critically important topic because chances are high that you are relying on these panels for insight and information.
Versta Research has always been a strong proponent of using online panels because for most types of research they are robust, accurate, and provide numerous advantages over other data collection modes. A number of studies have recently affirmed the quality of online panels, which we review here. In a future newsletter we will present an alternative view.
Plus we offer links to several items of interest, including:
- Three ways to improve verbatim data
- How to get better data with better survey design
- The beauty of conjoint analysis
- The future of market research if “middlemen” are doomed
- Washington Post guidelines for reporting results from public opinion polls
- The bright future of those who are fluent in statistics
Research methods and best practices change more quickly than ever before. If you need help thinking about the best approach for your research needs, give us a call. We are happy to consult before you make any decisions about how and whether to proceed with a project.
The Versta Team
How Good Are Online Survey Panels?
Ten years ago most survey research was done by telephone. Now the majority of surveys are done over the Internet, and many of these surveys use respondents sourced through online panels.
What is an online panel? It is a group of people who have expressed a willingness to participate in surveys over the Internet. Instead of calling up random people on the telephone and asking if they would do a survey, we reach out to a random sample of pre-screened panelists by e-mail to ask if they would do a survey.
Besides clients and companies that maintain their own lists of customers and prospects who might be invited to do surveys, there are specialized online panel companies that recruit and maintain relationships with millions of survey volunteers. They are able to create samples that closely match the overall demographics of the U.S. as a whole, or of a specific population.
For many years, important questions have been raised about the quality of online panels. Are they any good? Can we get reliable and representative data from them? Do they work as well as phone surveys?
The Advertising Research Foundation has just released findings from an intensive, million-dollar study to answer these questions. They fielded a two-wave survey across 17 online panels. We summarize three key findings here.
FINDING 1: Online panelists represent a broad spectrum of the population.
In the past there have been concerns about panelists being a small group of professional survey takers who join as many panels as they can to take as many surveys as possible. You want to be sure that the people who answer your survey questions represent a broad population, not a small group of people who have nothing better to do.
The ARF research shows that panelists represent a broad cross section of the population, including all demographic groups and regions of the country. According to the study:
- The vast majority (84%) of survey panelists belong to a single panel
- Nearly all panelists are diligent, thoughtful, and careful about representing their opinions and behaviors, including those who belong to more than one panel
- There are five and a half million unique respondents available via panels for surveys in the U.S. This is comparable to the estimated number of people who are willing to complete phone surveys
Of course, if sample is sourced through more than one panel, it is possible (but highly unlikely) that a specific person could take a survey twice. Versta Research has validation procedures to ensure that we identify these cases and delete the duplicate responses.
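The mechanics of that de-duplication step can be sketched simply: normalize an identifying field, keep the first response per respondent, and drop later matches. The field names and sample records below are invented for illustration; real validation procedures rely on richer identity signals than a single field.

```python
# Hypothetical sketch of de-duplicating respondents sourced from
# multiple panels: keep the first response per respondent key.

def deduplicate(responses):
    """Return responses with later duplicates (by normalized email) removed."""
    seen = set()
    unique = []
    for resp in responses:
        key = resp["email"].strip().lower()  # normalize before matching
        if key not in seen:
            seen.add(key)
            unique.append(resp)
    return unique

data = [
    {"email": "Ann@example.com", "panel": "A", "answer": 5},
    {"email": "ann@example.com", "panel": "B", "answer": 4},  # same person via a second panel
    {"email": "bob@example.com", "panel": "A", "answer": 3},
]
clean = deduplicate(data)
# clean keeps Ann's first response (panel A) and Bob's response
```

In practice a first-in-wins rule like this is only one policy choice; a researcher might instead flag both records for review before deciding which to keep.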
FINDING 2: The vast majority of online panelists provide valid and reliable data.
The ARF study shows that online surveys produce high quality data, and that panelists give thoughtful, realistic, and true responses. In fact, the study showed that the single most important factor that hurts survey quality is entirely in the researcher’s control: survey length. This is not surprising, as research has shown for many years that data quality from both phone and online surveys declines considerably if surveys are too long.
The study also found that certain types of panelists give higher quality data than others, including older panelists (age 50 to 65), those who have positive attitudes toward surveys, those who take more surveys per month, and those who are on more than one panel.
It should not be surprising that having panelists who are engaged and who like taking surveys gives us better data, and it is valuable to have research that affirms this.
FINDING 3: Not all online panels are the same.
Online panels often specialize in recruiting and supplying specific types of respondents, such as IT professionals, physicians, or high-net-worth consumers. But even among comparable panels supplying the same types of respondents, the ARF study showed that panels can differ slightly on attitude and opinion dimensions, including intentions to act, reactions to concepts, and so on.
Carefully weighting the data to correct for imbalances in age, gender, education, income, and/or region helps, but does not eliminate differences. This finding highlights the need to work with panel companies that adopt aggressive best practices when it comes to panel management, and that work hard to recruit cross sections of the population. In the words of the ARF report, “The findings suggest strongly that panels are not interchangeable. Guidelines and transparency about sourcing is needed, [especially] when blending samples or [when] multiple panels are used to fulfill sample requirements.”
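The weighting mentioned above can be illustrated with the simplest version of the technique, post-stratification (cell) weighting: each respondent receives a weight equal to their demographic cell's population share divided by its sample share. The cells and population targets below are invented for this sketch; real studies weight on several dimensions at once, often via iterative raking.

```python
# Hypothetical sketch of post-stratification weighting: weight each
# respondent so that weighted cell shares match population targets.

from collections import Counter

def cell_weights(sample_cells, population_shares):
    """Return one weight per respondent: population share / sample share."""
    n = len(sample_cells)
    sample_share = {cell: count / n for cell, count in Counter(sample_cells).items()}
    return [population_shares[cell] / sample_share[cell] for cell in sample_cells]

# Illustrative sample: 75% under-50, but the target population is 60% under-50.
cells = ["under50", "under50", "under50", "50plus"]
targets = {"under50": 0.60, "50plus": 0.40}
weights = cell_weights(cells, targets)
# Under-50 respondents are weighted down (0.6/0.75 = 0.8);
# 50+ respondents are weighted up (0.4/0.25 = 1.6).
```

As the ARF finding notes, this kind of demographic correction narrows but does not eliminate differences between panels, since panels can also differ on attitudes that weighting variables do not capture.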
The Bottom Line.
The analysis of the ARF data is still underway, and no doubt there will be additional findings and much discussion about its implications. No research method is perfect, and nearly always you need a smart and experienced team making decisions about which online panel to use, how to manage the process, and how to work with the data once you get it. But so far the research provides good news for those who rely on Internet surveys for insight.
Do you want to be a survey panelist?
One of Versta Research’s partners for online survey respondents is Survey Sampling International, a company that has been in the business of survey research since before the Internet existed. You may volunteer to be a member of their panel at: www.surveyspot.com
Recent Items on the Versta Blog
Here are several recent posts from the Versta Research Blog. Click on any headline to read more.
One way to improve data quality is by designing surveys that speak to respondents with thanks and encouragement. Here are four suggestions for how to do this.
Conjoint analysis builds mathematical models of choices and preferences. It helps you optimize products, understand trade-offs, and identify consumer segments.
Online surveys may be replacing focus groups because focus groups in the past have been used for simple package or concept testing rather than group ideation.
Online surveys can provide rich data to open-ended questions. Here are three proven tips to writing and formatting such questions to maximize your data.
Market research firms that add significant value by delivering insight and understanding will survive the dramatic economic shifts being driven by the Internet.
Besides offering a biased question, a computerized telephone poll of Chicago residents about Walmart relied on a sampling method that likely resulted in poor data.
Reporting on Research (Both Serious and Silly) and More
Here are several recent articles on market research. Click on any headline to read the full article.
“Polls are proliferating, and so are problems with the way they’re conducted and reported,” reports the Washington Post, which has provided news staff with updated standards for what it will and will not publish when it comes to polls.
In 2007 Versta Research’s president led an effort to document misperceptions among smokers about the risks of smoking vs. the risks of quitting using nicotine replacement therapies. Four years later the data are still yielding new findings and being published in academic journals.
Researchers who are fluent in statistics are becoming increasingly valuable and “changing the image of the profession.”
The New York Times reports, “Marketers know that surprise giveaways go a long way. In a paper soon to be published in The Journal of Consumer Research, three researchers show that reactions vary widely across cultures to this kind of surprise.”
MORE VERSTA NEWSLETTERS