Versta Research Newsletter

Dear Reader,

During a presidential election year there is no escaping the flurry of public opinion polling and the intense scrutiny that surveys get from the media. But love it or hate it, there are excellent reasons to pay close attention to this year’s political polling.

Pollsters are grappling with the thorniest issues in measuring people’s attitudes and behaviors amidst the most rapid change in technology and communications that we’ve seen in the last thirty years. They face the same issues that all market research professionals struggle with day in and day out, but under the glare of spotlights, expert criticism, and the final validation that comes on Election Day.

Versta Research’s Fall 2012 Newsletter highlights Five Research Lessons from Election Season Polling that may give you a new perspective on political polls and offer some insights to help you in your own work.

Other items of interest in this newsletter include recent stories from the Versta Research Blog, as well as highlights from Versta Research’s recent surveys for the American Chronic Pain Association and Pfizer.

As always, when your next research project comes along, please do not hesitate to give us a call for our thoughts and a proposal. We will do our very best to earn your vote and your confidence.

Happy Voting!

The Versta Team

Five Research Lessons from Election Season Polling

Every four years, the American public becomes engrossed in some of the same topics that research professionals obsess about every day—things like sampling, declining response rates, margins of error, and questionnaire wording. People who would otherwise avoid discussions about statistics start wondering: Are the respondents in political polls really representative of everyone? How can you conduct a poll when nobody answers the phone? Are differences statistically significant, or are the candidates locked in a tie?
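
For readers who want the arithmetic behind that last question, here is a minimal sketch in Python, using hypothetical poll numbers, of how the familiar 95% margin of error is computed, and why a modest lead can still be a statistical tie:

    import math

    # Hypothetical poll of n = 1,000 likely voters:
    # Candidate A at 48%, Candidate B at 45%.
    n = 1000
    p_a, p_b = 0.48, 0.45

    # The commonly reported 95% margin of error for a single proportion,
    # computed at the worst case p = 0.5: 1.96 * sqrt(p * (1 - p) / n).
    moe = 1.96 * math.sqrt(0.5 * 0.5 / n)
    print(f"Margin of error: +/-{moe:.1%}")  # about +/-3.1%

    # The margin of error on the gap between the two candidates is larger,
    # because both shares come from the same sample and move together.
    moe_gap = 1.96 * math.sqrt((p_a + p_b - (p_a - p_b) ** 2) / n)
    print(f"Margin of error on the lead: +/-{moe_gap:.1%}")  # about +/-6.0%

    lead = p_a - p_b  # a 3-point lead
    print("Statistical tie" if lead < moe_gap else "Significant lead")

The three-point lead sits well inside the roughly six-point margin of error on the gap, which is why pollsters call a race like this a statistical tie even though the poll’s reported margin of error is only about three points.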

And it’s not just the general public that gets engrossed. We do, as well, because election season presents a perfect opportunity to learn from some of the brightest people in our field who are at the forefront of using scientific methods to answer important questions that people care a great deal about. As we noted in an article not too long ago (entitled “Your Margin of Error is Probably Wrong”):

“Even if you are not involved in political polling, it is worth paying attention to the methods and best practices of political pollsters. One reason is that few other areas of research offer a way to completely validate one’s methods. Pollsters are using sampling and survey methods to predict the behaviors of a much larger population. Then in just one day that population behaves, we get a near-perfect count of exactly how they behaved, and we know whether the methods worked.”

Here, then, are five important lessons from election season polling that we believe apply to all types of market research, and that profoundly affect the work we do at Versta Research:

1. Ask the Right Question (of the Right People)

More than any other type of research, political polling highlights the importance of asking the right question. Legitimate variations in question wording will affect the outcome in substantial ways. For example:

  • Should you ask respondents who they are most likely to vote for in November?
  • Should you ask respondents who they would vote for if the election were held tomorrow?
  • Should your question use Obama’s and Romney’s first names, as they will appear on ballots?
  • Should your question refer to each candidate’s party affiliation, as they will appear on ballots?

Likewise, who should your respondents be? All adults of voting age? All registered voters? Any adult who intends to vote? Registered voters who voted in past elections? And, of course, what questions will you use to identify these respondents?

The answers to all of these questions depend on why the survey is being done. There are many types of political polls, and multiple reasons for conducting them. Likewise, measuring consumer or voter attitudes is different from trying to predict behavior, which is different from trying to forecast market share. Your survey questions need to be designed accordingly.

2. Methods Matter

Beyond questionnaire design and sample selection, political pollsters rightly agonize over the full range of methodological details because each decision they make is likely to affect the accuracy, reliability, and usefulness of their findings. Nearly every new poll is subject to a barrage of methodological analysis and critique: Was the poll conducted online or by phone? Did they use live interviewers or automated dialing with interactive voice response? Was the sample selected through RDD (random digit dial) or targeted listings? What days of the week and at what times of day did they survey? Did they include mobile phones? How did they weight the data? And so on.

For a market research professional, all of this is a treasure trove of insight into the most important methodological challenges facing the research industry in a time of rapid technological and social change. It is a reminder that research methods matter, and that the challenges are not just about qualitative versus quantitative, or surveys versus focus groups. They are about design, fieldwork, respondent recruitment, data collection, data manipulation, statistical analysis, and interpretation as well. Each decision we make can be a potential source of instability and error that needs to be carefully managed.

3. It’s the Story that Counts

On the public relations front, election season polls prove that even “another” new poll about a topic that has been covered time and again can be important and newsworthy. Dozens of major polls are released every week. Most are trying to gauge the same thing. All of them are closely watched and widely reported. Why? Because the content is relevant, interest is high, and the stakes are huge. Moreover, each new poll matters, as it adds to or shifts the story about how voters are reacting, and how the campaigns need to respond.

If you are sponsoring a survey in support of a PR, marketing, or communications campaign, it is nice to find white space where you can study an issue and report polling data that nobody else has explored. But people will not care just because it is new. Conversely, many people care a great deal about the tried and true. How many surveys can there be, for example, about Americans not saving enough for retirement? In our experience, the topic is always relevant in new ways, and every variation on the last survey gets media play.

4. Vendors Differ

One striking feature of political polling is that not all polls agree, and careful analysis of multiple polls over time demonstrates what researchers call a “house effect” on results.

Even if research firms are asking exactly the same questions and trying to project their results to the same population, they make somewhat different decisions about design and analysis. As an example, Mark Blumenthal of the Huffington Post documents three non-partisan (and seemingly small and arcane) decisions about statistical weighting that lead the Gallup organization to show consistently stronger support for Republican candidates compared to other polling organizations like Pew. Is Gallup biased? No more so than any other firm. As the Gallup Poll’s editor-in-chief noted:

“There are reasons that would argue for and against taking both courses of action. . . . Our methodologists certainly take into account all the pros and cons of the various decisions involved in sample weighting. We are constantly reviewing our procedures and making the best, well-informed judgments on changes.”

Good research is replicable and rigorous, but the process itself is about finding answers to unknowns. As such, research organizations make different decisions about even the smallest aspects of their work, and those decisions invariably affect their findings. Ideally your research vendor has the smartest and most experienced people making those decisions, people who understand and can explain how their decisions may affect the results.
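
To see how a house effect can arise from weighting alone, consider this minimal sketch in Python. The numbers are invented, and party identification stands in as one illustrative weighting variable (not necessarily among the decisions Blumenthal documents):

    # Hypothetical raw results: share supporting the Republican candidate,
    # broken out by party identification. All numbers are invented.
    support = {"Republican": 0.92, "Democrat": 0.06, "Independent": 0.45}

    def topline(party_targets):
        """Re-weight each party group to its target share; return overall support."""
        return sum(party_targets[p] * support[p] for p in support)

    # Two defensible sets of weighting targets that different firms might adopt:
    house_a = {"Republican": 0.32, "Democrat": 0.36, "Independent": 0.32}
    house_b = {"Republican": 0.28, "Democrat": 0.36, "Independent": 0.36}

    print(f"House A topline: {topline(house_a):.1%}")  # 46.0%
    print(f"House B topline: {topline(house_b):.1%}")  # 44.1%

Neither set of targets is wrong, yet the same raw data yield toplines nearly two points apart, and a firm that applies its targets consistently will show that gap poll after poll.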

5. Best Practices Change

Political polling in 2012 is beset with practical and technical challenges unlike anything we have seen in research over the last thirty years. Phone surveys used to be the gold standard for public opinion polls, but it is increasingly difficult to conduct them with the same levels of rigor as in the past. Best practices for polling have been upended by technological changes, demographic shifts, and lifestyle changes.

How, for example, should a pollster handle an interview if a person is reached on a cell phone, but also owns a landline? The issue is complicated because owning multiple phones gives a person a higher probability of being sampled, and cell phone users are demographically different from the rest of the population. An article in the New York Times recently documented the issue, noting:

“There is no consensus on the right method for handling such polling. The ABC News/Washington Post polls, like the NBC News/Wall Street Journal poll, terminate calls if cellphone respondents say they also have land lines. The New York Times/CBS News Poll, like the Gallup Organization and Pew Research Center, does not. Instead, Times/CBS pollsters complete interviews with all willing cellphone respondents and ‘weight’ the views of those without land lines to make them reflect one-third of the survey’s results. Others remain skeptical that one-third is even the right target for ‘cell only’ voters.”
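
The mechanics of that weighting step are simple; the judgment call is the target itself. Here is a minimal sketch in Python, with hypothetical respondent counts, of the one-third rule described in the quote above:

    # Hypothetical completed interviews: 150 cell-only respondents and
    # 850 respondents who also have landlines.
    cell_only_n, dual_n = 150, 850
    total = cell_only_n + dual_n

    # The contested assumption: cell-only respondents should account for
    # one-third of the weighted results.
    target = 1 / 3

    w_cell_only = target * total / cell_only_n   # about 2.22
    w_dual = (1 - target) * total / dual_n       # about 0.78

    # Check: the weighted sample hits the one-third target exactly.
    assert abs(w_cell_only * cell_only_n / total - target) < 1e-9
    print(f"Cell-only weight: {w_cell_only:.2f}, dual-user weight: {w_dual:.2f}")

Each cell-only respondent ends up counting for more than two dual-service respondents, so the accuracy of the final numbers hinges entirely on whether one-third is the right target.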

Paul J. Lavrakas, president of the American Association for Public Opinion Research, summed it up perfectly when he said, “Anyone who claims there’s a best practice doesn’t know what they’re talking about. We as an industry don’t know.”

These problems and others bedevil all of market research. Just when we start settling into a set of best practices about sampling, call-back strategies, survey mode, or statistical weighting, the world changes and we’re experimenting again with new strategies to get the most valid and reliable data to answer our research questions.

A Professional Win No Matter Who Wins

No doubt election season polling is about voters, candidates, and campaigns. But there are also valuable lessons for researchers about design, analysis, inference, prediction, and reporting. So whether you love or loathe this season’s barrage of media and polls, we urge you to step back and focus on how election polling is being done. Read the fine print. Pay attention to what the top polling analysts are debating. Notice how streams of conflicting data are being critiqued and synthesized into meaningful stories. Come November 7, you will know a lot more than whether your candidate won or lost and whether the polls got it right. You will know about some of the most pressing issues and trends affecting market research and public opinion polling today. That is a clear win for you.

Stories from the Versta Blog

Here are several recent posts from the Versta Research Blog. Click on any headline to read more.

Dilbert Does Predictive Analytics

It may sound surprising, but sometimes a predictive model that offers 0% accuracy is better than a model with 50% accuracy. Dilbert illustrates with a coin toss.

Wait! Wait! Don’t Dump That Data!

To analyze data, you need variation. So before deleting “useless” data, consider whether it provides comparative leverage for deeper statistical analysis.

Conjoint Analysis Helps Apple Win $1B in Lawsuit

If you ever have trouble convincing executives about the value of research, share this: The $1B verdict was based on a study using conjoint analysis.

13 Threats to Survey Accuracy

Our industry is often obsessed with margins of error, but the margins we calculate account for only one source of potential survey error. Here are the other twelve.

Why You Should Avoid Pie Charts

Pie charts were invented in 1801 and are now a common way to display statistical data. But our brains do not process the information well. Avoid them if possible.

Doing Market Research with Social Media

Social media is the newest channel for research, and there are two types every market researcher should know: interactive and observational social media research.

High Response Rates May Hurt Your Survey

New research shows that efforts to boost survey response rates can actually decrease the accuracy of survey results. One reason may be respondent distraction.

Finding a Story in All Those Numbers

Math and stats need not be lots of boring numbers and tables. They can reveal cool patterns that suggest compelling stories, as this video demonstrates.

Response Rates Fall to New Low

New research shows that survey response rates are now extremely low, usually under 10%. But so far, this is not hurting survey accuracy. Here’s why.

Park Your Demographics at the End

A group of industry experts recently weighed in on current best practices and opinions (and the trade-offs involved) for placement of demographic items in surveys.

The Dumbest PR Survey on Earth

National Geographic just did a survey about aliens from outer space that damaged its credibility and diminished its brand. PR surveys should do the opposite.

Better Charts for MaxDiff Data

MaxDiff is a powerful survey technique that uses paired comparisons. Optimizing your presentation of data in a chart that tells a clear story is critical.

Versta Research Conducts Surveys for American Chronic Pain Association & Pfizer

Versta Research designed and fielded surveys for the American Chronic Pain Association and Pfizer about diabetic peripheral neuropathy, a painful condition that affects an estimated 26 million Americans with diabetes.

MORE VERSTA NEWSLETTERS