Here’s the big, beautiful summer issue of the Versta Research newsletter, with more content than ever before on a snazzy new website that also keeps a vintage newsletter look. It features 19 easy learning guides that will teach you everything from how to ask gender on surveys to optimizing the price of your new product. We call it the Big Summer “How-To” Issue with 19 Easy Guides, because ultimately the bulk of what market research professionals know about our work is learned from other researchers on the job, in the field, and behind our desks.
Other items of interest in this newsletter include:
- Paying Respondents Can Lower Your Survey Costs
- How Random Correlations Ruined a Gender Story
- Why You Should Feel Patriotic Pride in Census Data
- Two-Point Scales Have the Highest Reliability
- Do Not Send Blind Survey Invitations
- How’s Our Relationship? (Survey Invite Review #3)
- How Amazing Are We? (Survey Invite Review #2)
- How Bad Are We? (Survey Invite Review #1)
- Getting to Yes When Response Rates Plummet
- How to Measure Shopper Strategies
- Three Easy Lessons from a Bad New York Times Survey
- How to Fix Your Education Bias in Surveys
- More Evidence that Online Polls Work (Really Well)
We are also delighted to share with you:
… which showcases some of our recent work highlighted in Forbes, Time, and on CNBC.
As always, feel free to reach out with an inquiry or with questions you may have. We would be pleased to consult with you on your next research effort.
The Versta Team
Doing Better Market Research:
The Big Summer “How To” Issue with 19 Easy Guides
Reading this newsletter, you may have noticed a fancier, friendlier, and totally updated look to the Versta Research website. With so many feature articles, and so much professional content for marketers and researchers produced during the course of our work, we wanted a better way of showcasing it for our readers and clients to explore.
In the process of revisiting this content and bringing it over from our old website to our new one, we started looking more carefully at the readership and engagement data. And what did we discover? Readers like you want help doing your work and learning new techniques, just like we do. By far, the most engaging articles we have written are “How-To” articles that lay out tips, tricks, processes, and formulas for doing market research you won’t find anywhere else. Our how-to articles consistently get hundreds of page views every week.
That inspired us to make this, our summer newsletter, the “How To” issue for doing better market research. Take some time this summer to up your research game and learn something new. Here are 19 easy how-to guides we have written—some of them short and sweet, some of them richer and deeper—all of which can help you on your way to better market research.
HOW TO CALCULATE AN NPS MARGIN OF ERROR—It took several years and lots of companies jumping on board before NPS users started clamoring for a way to calculate margins of error for their NPS scores. And now, by far, this is one of our most frequently read articles. We have some modest ambivalence about NPS having become an end in itself (a metric fetish) rather than a means to a research end. But if you need and use NPS scores in your job, here is how you can tell whether differences between groups and changes over time are statistically significant.
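The full calculation is in the article linked above; as a sketch, here is the standard difference-of-proportions approach that is commonly used for NPS (our assumption about the method, not a quote from the article), since NPS is the share of promoters minus the share of detractors:

```python
import math

def nps_margin_of_error(promoters, detractors, n, z=1.96):
    """Approximate margin of error for an NPS score.

    NPS is the difference of two proportions from the same sample, so its
    variance is p + d - (p - d)^2, divided by the sample size n.
    """
    p = promoters / n   # share of promoters (typically scores 9-10)
    d = detractors / n  # share of detractors (typically scores 0-6)
    nps = p - d
    variance = p + d - (p - d) ** 2
    se = math.sqrt(variance / n)
    return nps, z * se  # score and margin of error at ~95% confidence

# Hypothetical example: 450 promoters, 150 detractors, n = 1,000
nps, moe = nps_margin_of_error(promoters=450, detractors=150, n=1000)
```

With these illustrative numbers the NPS is 0.30 (often reported as 30) with a margin of error of roughly ±4.4 points, so a 5-point change over time would be statistically meaningful while a 2-point change would not.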
HOW TO LABEL YOUR 10-POINT SCALE—Using a 10-point rating scale is highly intuitive for people in our culture, but we always need to specify what the scale is and what the points mean. It is best if you label the end points (the zero and the ten) using very strong language that discourages casual use of those endpoints. Label the midpoint (the five) as a neutral point. And then label every point on the scale with a number, but not necessarily with words. And guess what? It is not really a 10-point scale. It actually has 11 points, which is exactly what you want.
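As a quick illustration of that advice (the label wording below is hypothetical, not taken from the article), here is how the 11 points might be assembled, with words only at the endpoints and the neutral midpoint:

```python
def eleven_point_scale(low_label, high_label, mid_label="Neutral"):
    """Build a 0-10 scale: every point numbered, words only at 0, 5, and 10."""
    labels = {i: str(i) for i in range(11)}  # 11 points: 0 through 10
    labels[0] = f"0 - {low_label}"           # strong language at the low end
    labels[5] = f"5 - {mid_label}"           # neutral midpoint
    labels[10] = f"10 - {high_label}"        # strong language at the high end
    return labels

scale = eleven_point_scale("Not at all likely", "Extremely likely")
```

The point of the strong endpoint wording is to discourage respondents from defaulting to the extremes; the unlabeled interior numbers keep the scale feeling continuous.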
HOW TO MAKE A BEAUTIFUL MARKET SHARE CHART—The R statistical package is well-known for its powerful charting capabilities—far superior to any other software we know of. Plus it is free, and it provides a more powerful set of analytical tools than you will ever need, or than you will get from other statistical software no matter how much you pay. There is a downside, of course. R takes time to learn, and for most of us it is not easy to use unless you use it every day. But take a look at how we transformed this market share trend data from a mess into an intuitive thing of beauty.
HOW TO MAKE SPECTACULAR INFOGRAPHICS—Our advice on creating infographics for market research gets more attention than we ever imagined. In some ways that is puzzling, because in the four years since we developed our tips and tricks for a market research audience, we see few companies and research groups taking up the challenge of actually creating them. But there is clearly a hunger for better and more compelling data visualization, and this “how-to” guide gets at universal principles of brevity, storytelling, and excellent graphic design.
HOW TO ASK ABOUT AGE ON SURVEYS—This seems simple, but ascertaining a person’s age in a survey can be surprisingly tricky. It is so tricky, in fact, that top survey experts devote many hours of professional time debating the best approach for specific data needs. There are multiple ways you might ask, and lots of research documenting the sources of error with each. We outline the pros and cons of the four most common approaches: asking age, asking year born, asking specific date of birth, and asking which age group a person belongs to.
HOW TO ASK GENDER ON SURVEYS—This used to be easy, but not anymore. Our society now recognizes a broad spectrum of gender identities, and good research allows respondents to say who they are. We consulted two key sources in revamping how we ask gender: The Federal Interagency Working Group on Measuring Sexual Orientation and Gender Identity, and the Williams Institute at UCLA. Based on this we lay out two suggestions for how to ask gender, including the approach we typically use, which offers four response categories in a closed-ended question.
HOW TO DESIGN BETTER SURVEY INVITATIONS—In this article we turn to some basic principles of marketing to offer tips on how to “sell” a survey to the people you are hoping will participate. Give them good reasons to participate, and tell them how it benefits them. Offer a clear call to action that makes it easy to click in and start. Put crucial information “above the fold” and push cluttering details into a separate section. Lay out a design that is beautiful and that adapts to multiple devices. If you are sincere in reaching out to your customers, the difference between a great invitation and a bad one could make all the difference in your response rate.
HOW TO ESTIMATE THE LENGTH OF A SURVEY—This is another of our top-read articles, especially since it was cited in the Federal Register as providing authoritative guidance on survey length. Research professionals absolutely need to know how long it will take respondents to complete their surveys. Why? Because nearly all survey pricing, from sample cost, to programming, to analysis, is a function of survey length. Estimating the length of a survey is easy, quick, and remarkably accurate. Just follow this handy guide, which we use almost every day in our own work.
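The article above contains the actual guide; as a minimal sketch of how such an estimate can work, here is one simple reading-time-plus-answering-time heuristic. The constants below are our own illustrative assumptions, not Versta’s published values:

```python
def estimate_survey_minutes(word_count, num_questions,
                            words_per_minute=250, seconds_per_answer=5):
    """Rough survey-length estimate: time to read the words plus time to
    answer each question. Both rate constants are illustrative assumptions."""
    reading = word_count / words_per_minute              # minutes spent reading
    answering = num_questions * seconds_per_answer / 60  # minutes spent answering
    return reading + answering

# Hypothetical survey: 1,500 words of question text, 40 answerable items
minutes = estimate_survey_minutes(word_count=1500, num_questions=40)
```

Under these assumed rates the hypothetical survey comes to roughly nine to ten minutes, which matters because, as the article notes, nearly all survey pricing is a function of length.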
HOW TO MAKE A SURVEY ADA ACCESSIBLE—Even with amazing new technologies like voice assistants, AI, captioning, and alternative input devices, making a simple survey accessible for people with disabilities is devilishly difficult. No matter how fancy your current survey platform (and no matter what the people who own those platforms tell you) the vast majority of your surveys are probably not accessible enough to pass third-party audits. We went through the painful process of developing full accessibility, and share with you some best practices from what we learned.
HOW TO FIND MARKETING GOLD IN THE IVORY TOWER—What is one of the best, untapped resources for nuggets of insight that will give your research a sharper edge? Scientific, peer-reviewed publications from the world of academia. In this feature article we review and summarize the best outlets we recommend for new insights and practical suggestions to make your research measurably better. It covers material relevant to public polling for PR research, strategic research for marketing, new product development, and the psychology of consumer behavior and decision making.
HOW TO SELL YOUR BOSS ON RESEARCH—You believe in the value of research, and you know your organization needs it. But how do you get your manager and internal stakeholders to buy into it (and pay for it)? We share with you a technique you can use to get everyone less focused on research (which is merely a tool) and more focused on the questions that need to be answered. As you get toward the end of the process and share your results with the team, the value of research becomes clear as you lay out key questions, possible answers, and what action you might take based on each answer.
HOW TO PUBLISH YOUR PR RESEARCH—Beyond a big media splash, the pinnacle of successful PR research is seeing it published in rigorous peer-reviewed scientific journals. Yes, this can be done, and it guarantees an enduring impact that truly establishes you as a thought leader, not just an attention grabber. How do you do this? It takes careful attention to design and content at the front end, and more commitment to content development at the back end. This short how-to article lays out the three key ingredients you should focus on to make that happen.
HOW TO GET OUTSTANDING OPEN-END RESPONSES—It’s hard enough getting people to respond to your survey. So is it really possible to get them offering rich, detailed, personal, and insightful open-ends when they do? Definitely. With skillful survey design, respondents will open their hearts and tell you a full life story if you want. This how-to article outlines two secrets to prompting the very best open-end responses. Hint: Avoid asking too many questions about you and your company, because you are not all that interesting to them!
HOW TO BOOST RESPONSE RATES—One of the best things about reading Public Opinion Quarterly (see How to Find Marketing Gold in the Ivory Tower!) is that it teems with careful scientific experiments by academic researchers, many of which are focused on how to actually field survey research. What are the best ways to recruit? How should you deal with non-response and missing data? What is the impact of using non-RDD samples? In this article we summarize useful POQ findings about pre-notification strategies, incentives, and offering respondents multiple choices of how to participate.
HOW TO CONDUCT A TELEPHONE SURVEY FOR GOLD STANDARD RESEARCH—We wrote this brief how-to guide a full eight years ago, and you might think that phone surveys would be fully dead by now. Not yet, and maybe never. Online sampling at the local level is still extremely hard, and in most geographies it is impossible. So here are the things you will need to know if you ever conduct a telephone poll: sample sizes and sample frames, techniques for randomized phone numbers (you cannot take a published list and simply randomize), strategies for randomizing respondent selection within the household, weighting, and so on.
HOW TO FIX THE UP-MEANS-GOOD SURVEY BIAS—In our culture, “things above” tend to be perceived as better than “things below.” Top is better than bottom. Up is better than down. Over is better than under. This measurably affects how people answer surveys. So if two concepts are shown to respondents on screen, most will give better scores to the concept placed visually higher. This article lays out the recent scientific research about this type of bias (an important source of survey error) and offers two simple approaches to counteracting it.
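The article lays out two approaches; one widely used fix for any position bias (our sketch here, not necessarily one of the article’s two) is simply to randomize the on-screen order per respondent so that no concept is systematically shown higher than another:

```python
import random

def randomized_concept_order(concepts, seed=None):
    """Return a per-respondent random ordering of concepts, so that any
    up-means-good position bias averages out across the sample."""
    rng = random.Random(seed)  # pass a per-respondent seed for reproducibility
    order = list(concepts)     # copy so the master list is untouched
    rng.shuffle(order)
    return order

# Hypothetical concepts to test; each respondent sees a different order
order = randomized_concept_order(["Concept A", "Concept B", "Concept C"], seed=42)
```

Randomization does not remove the bias for any single respondent, but it distributes it evenly so that aggregate comparisons between concepts are fair.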
HOW TO KNOW IF A BRAND EXTENSION WILL SUCCEED—There are many variables that can affect whether a new brand extension or line of products will succeed. There is level of marketing support, distribution, and strength of the parent brand. But it turns out that the best predictor is something you can easily measure on a survey. Moreover, that best predictor is not whether people like the idea, or whether they would buy the new product. You simply need to ask a question about “fit” to gauge whether buyers think it makes sense within the scope of what your company already offers.
HOW TO DESIGN AN EXCELLENT CHART—Charts are extremely (and surprisingly) difficult to design well. Why? Because a good chart must always be constructed with a specific objective about the story you want to tell. Sorry, automated tools cannot do this, and never will. This feature article discusses when to use charts (and when to avoid them), how to find charts that will highlight the points of contrast you want to display, and it lays out a bevy of best practices around labeling, colors, dimensionality, white space, graphic design, grid lines, and so on.
HOW TO OPTIMIZE YOUR PRODUCT PRICING—This how-to feature video was created to demonstrate a step-by-step process once you have some basic survey data on pricing. First, estimate how many people will buy your product at a few different price points. You can use a basic survey for this. Then plot a simple demand curve. Believe it or not, PowerPoint can be a good tool to help you with this, and we show you how. Then use a statistical package (try R for free) to find the point along the curve that will optimize how much revenue you can generate. That point is your optimal price.
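As a minimal sketch of the arithmetic in those steps, using entirely hypothetical survey numbers and simply comparing the tested price points (the video describes fitting a continuous curve, which would interpolate between them):

```python
# Hypothetical demand data: share of survey respondents who said they
# would buy at each tested price point.
prices = [5, 10, 15, 20, 25]
share_buying = [0.80, 0.62, 0.45, 0.25, 0.10]

# Expected revenue per 100 potential buyers at each price point
revenues = [price * share * 100 for price, share in zip(prices, share_buying)]

# The revenue-optimal tested price is the one with the highest expected revenue
best_index = max(range(len(prices)), key=lambda i: revenues[i])
optimal_price = prices[best_index]
```

With these made-up numbers, the $15 price point wins even though far fewer people buy at $15 than at $5, which is exactly the counterintuitive insight a demand curve makes visible.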
Stories from the Versta Blog
Here are several recent posts from the Versta Research Blog. Click on any headline to read more.
Paying Respondents Can Lower Your Survey Costs
A new study from the Department of Energy on recruiting households to participate in survey research found that paying them more actually lowered the cost.
How Random Correlations Ruined a Gender Story
Random correlations are sometimes funny, but rarely informative. Sometimes they can even work against you, by suggesting the opposite of what you intend.
Why You Should Feel Patriotic Pride in Census Data
Here are three ways you should be using the amazing treasure trove of data available (all for free) from the U.S. Census Bureau to boost the rigor of your research.
Two-Point Scales Have the Highest Reliability
New research demonstrates that more points on survey response scales will decrease your measurement reliability. So always choose the simpler scale if you can.
Do Not Send Blind Survey Invitations
Sending out “blind” surveys is a bad idea even if surveys are not technically considered spam. And reputable email services will kick you out if you do it.
How’s Our Relationship? (Survey Invite Review #3)
This is the final example in our review of good and bad survey invitations. It is one of those rare survey invitations that made me want to participate!
How Amazing Are We? (Survey Invite Review #2)
Here is the second in our review of three example survey invitations. This one made me laugh (and groan), which is bad. Take a look and see if you agree.
How Bad Are We? (Survey Invite Review #1)
We review three examples of survey invitations: two that are bad and one that is excellent. Here’s the first, with an explanation of why it is mostly bad.
Getting to Yes When Response Rates Plummet
When data collection efforts hit a wall because nobody is willing to participate, consider these 12 strategies that can convert many refusals into respondents.
How to Measure Shopper Strategies
Here are survey questions adaptable to any topic to measure 5 different decision-making styles (or strategies) consumers use when deciding what to buy.
Three Easy Lessons from a Bad New York Times Survey
If you want to avoid doing a silly survey like this, do three things instead: Ask unambiguous questions, find representative samples, and use effective charts.
How to Fix Your Education Bias in Surveys
Recent election polling should have you thinking much more carefully about whether your research respondents truly match the population in terms of education. Probably they don’t.
More Evidence that Online Polls Work (Really Well)
A recently published analysis of polls conducted during the 2016 presidential election shows that online polls and RDD phone polls yield identical results.
Versta Research in the News
Lincoln Financial Leverages New Research by Versta
Lincoln Financial Group has released two in-depth whitepapers (The Long-Term Care Conversation Whitepaper and Women and Long-Term Care Whitepaper) and infographics (The Long-Term Care Conversation Infographic and Women and Long-Term Care Infographic) based on its survey of consumers and financial advisors about long-term care, conducted by Versta Research.
New Survey about Elder Needs for Wells Fargo
Versta Research conducted a new survey of elder Americans and adult children to explore financial needs in later years of life. The results have been featured in Forbes, Time, and on CNBC. Wells Fargo has also published a detailed whitepaper and lookbook based on the findings.
Versta Research Gives Data Visualization Tips at AMA Event in Chicago
The American Marketing Association in Chicago hosted an event that featured a how-to presentation by Joseph Hopper that highlighted Versta Research’s eight guiding principles in designing research reports and infographics for clients.
MORE VERSTA NEWSLETTERS