Versta Research Newsletter

Dear Reader,

Conducting surveys with your own customers is surprisingly difficult these days. Phone surveys are all but dead, and online surveys are tough because so few people respond anymore. So what’s a researcher, strategist, or marketing professional to do? How about conducting a paper-based mail survey?

It sounds crazy, but we just did it, and it succeeded beyond what we had hoped. We got a 25% response rate.

In this newsletter, our feature article, How to Beat Online Surveys with Old-Fashioned Paper, describes the conundrum we faced in trying to conduct a customer survey for a bank here in Chicago. We designed a brand-new (yet old-fashioned) approach, and this article shares all the details of what went into it and why it succeeded.

We are also delighted to share with you our latest Stories from the Versta Blog and Versta Research in the News, which showcase some of our recent work on a new Absence and Disability Readiness Index, as well as surveys for the Alzheimer’s Association as reported in Time and on NPR and CNN.

As always, feel free to reach out with an inquiry or any questions you may have. We would be pleased to consult with you on your next research effort.

Happy spring,

The Versta Team

How to Beat Online Surveys with Old-Fashioned Paper

Who would ever think that old-fashioned paper surveys, mailed through the U.S. post office, are a good way to conduct customer satisfaction research? Not us — until Versta Research faced a rather difficult need for research, and we could think of no better way. And to our happy surprise, it worked! Not only did it work, but it easily outperformed other typical modes of research these days. Our old-fashioned paper-based mail survey got a 25% response rate, with no additional attempts, outreach, or reminders.

Here is the story of that survey—why we did it on paper, the problems we solved, the steps we took to conduct it, and why it succeeded beyond what we had hoped. A seemingly simple customer satisfaction survey is now one of the most interesting and memorable research efforts of my career. We hope it offers valuable insights about how to design and execute surveys when you face situations you have never encountered before.

The Challenge

Our client was Devon Bank, a Chicago community bank located in an urban neighborhood that continually transforms as new waves of immigrants settle into the city, assimilate, and make way for new populations from other countries. Their retail customers speak over 30 native languages besides English. They come from countries in Eastern Europe, the Middle East, Asia, and Latin America—a true melting pot of cultures, languages, and religions from around the world.

The bank did not have good e-mail addresses or phone numbers for most customers, and even if it had, we worried about a research firm reaching out to them in today’s political climate. What the bank did have, of course, were postal addresses, which it uses every month to send account statements.

On top of that, Devon Bank is small. With only 4,000 customers in the specific market we wanted to survey, there was no opportunity to think about sampling or stratification. Research industry response rates are exceedingly low these days, in the neighborhood of one to three percent. Our goal for a minimum sample size is generally 300. Hitting that target would mean reaching out to every customer and achieving a 7.5% response rate (300 out of 4,000). Yikes.
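For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch in Python (the figures come straight from the numbers above; the calculation itself is ours):

    # Required response rate to reach our minimum sample from a fixed list
    customers = 4000      # every customer in the target market
    target_n = 300        # our usual minimum sample size
    print(f"{target_n / customers:.1%}")       # -> 7.5%

    # For comparison, typical industry response rates of 1% to 3% would
    # yield only 40 to 120 completed surveys from the same list
    print(customers * 0.01, customers * 0.03)  # -> 40.0 120.0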

To make that happen (if it could happen) we thought through every last detail of making the survey easy, attractive, and trustworthy. We ended up designing a research approach quite specific to this unique population and challenge.

The Solution

Paper Survey. With no e-mail addresses or phone numbers, our best option seemed to be a paper survey sent through the mail. We considered in-person surveys, or distributing surveys at bank branches, but the sporadic nature of in-person visits made that impractical. Regular U.S. mail from the bank would be expected, welcome, and probably opened, so that was our choice. We decided to make this mailing special by sending it separately from the monthly statements, along with a postage-paid return envelope. We hoped to grab attention with an excellent layout, a warm invitation, and a generous incentive.

Tested Design. The last time I designed and conducted a paper survey was back in graduate school, while working for the university’s office of policy and planning. That was a long time ago. But there are still people who know a lot about designing, rigorously testing, and refining excellent paper surveys: the U.S. Census Bureau. So we printed out a copy of the American Community Survey, and it became our blueprint for how to ask questions on paper. Then we gave it to our graphic design team and asked them to create the exact same look and feel for this unique mode of administration.

Sincere Invitation. The survey came with a letter directly from the bank president, which described why the bank was conducting the survey and how it would be valuable to customers (not to the bank). It also offered to pay respondents for their time. My favorite sentence was this: “An open dialogue between you and the Bank is an essential part of building and maintaining a strong relationship.” This approach is the opposite of what we often see in survey invitations, like the obnoxious one I received just last week: “Your Help Is Key to Our Success.”

Good Incentive. Customers were offered $5 for completing the survey, which the bank would deposit directly into their accounts. That seemed like an attractive amount for a non-affluent population and for a simple five-minute task. Looking back, perhaps we could have paid less (maybe $2) and still easily hit our target of n=300. But we also knew there were no second chances. Unlike e-mail, which is nearly instantaneous and allows for easy testing, a mailing gave us no way to recalibrate and adjust once we launched. We all agreed that paying a good incentive was a commitment worth making up front.

Super Short. Devon Bank had never surveyed its customers, and they wanted to know a lot. But filling out surveys is a burden, and with all the challenges we faced in overcoming resistance, and with no second chance, we argued for a very short survey. In the end, we asked 16 questions, laid out on two pages (the inside facing pages of a folded 17”x11” sheet), which took respondents roughly five minutes to answer. It was short enough to keep respondents engaged, yet we still got detailed data for a rich analysis of satisfaction, importance of services, age differences, banking with competitors, and much more.

Multiple Languages. Our survey documented 30 different languages used by the bank’s customers, and of course we heard back from only one-quarter of them. Translating into all of those languages was not financially feasible, nor did we know in advance exactly what all the languages might be. So bank staff estimated the top five for us to focus on: English, Russian, Arabic, Spanish, and Hindi. The invitation included with the survey offered prominent call-outs with text in each language explaining the survey (and the $5 incentive) and providing instructions on how to access the non-English versions.

Online Option. We had to offer an online version for one big reason: the survey was offered in five languages, and there was no way of knowing which version should be mailed to whom. Nor was it practical to mail all five versions, because the mailing needed to be clean and inviting. So we translated and programmed the full survey online, accessible directly from the bank’s website. Each mailed survey carried a unique 5-digit survey code for access, which we intentionally did not call a “PIN” in order to avoid any confusion with banking PINs.
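To illustrate the mechanics of those access codes (a hypothetical sketch only; the function and its details are our invention, not the system we actually built), generating a batch of unique codes in Python might look like this:

    import secrets

    def generate_access_codes(n, width=5):
        """Generate n unique numeric codes of a fixed width (hypothetical)."""
        codes = set()
        while len(codes) < n:
            # Random codes cannot be guessed from one another, and the set
            # guarantees no two mailed surveys share a code
            codes.add(str(secrets.randbelow(10 ** width)).zfill(width))
        return sorted(codes)

    survey_codes = generate_access_codes(4000)  # one code per mailed survey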

Versta Invisibility. Sometimes highlighting the involvement of an outside firm enhances the credibility of a survey; it can offer a reassuring promise that even negative feedback is welcome and useful. For this population, however, we expected sensitivity around a third party collecting data, so we decided that all communications should come from, and return to, the bank. The invitation and survey were on bank letterhead, and completed surveys went back to the bank in the postage-paid envelope. We did not promise respondents anonymity, but the bank agreed to let us manage the data and strip out personally identifiable information (PII) in the process.
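As a sketch of that last step (the file name and field names here are assumptions for illustration, not the bank’s actual data layout), stripping PII before analysis can be as simple as:

    import pandas as pd

    # Hypothetical: load the keyed-in responses, then drop identifying
    # fields so the analysis file contains survey answers only
    responses = pd.read_csv("responses.csv")             # assumed file name
    pii_columns = ["name", "address", "account_number"]  # assumed fields
    analysis_data = responses.drop(columns=pii_columns, errors="ignore")
    analysis_data.to_csv("analysis_data.csv", index=False)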

The Result

We were nervous for weeks after the mail drop. One or two surveys came back each day. We fretted, knowing that there was no way to remediate if our plan didn’t work—no easy way to send reminders, boost the incentive, reach out to more sample, or make more phone calls. For three weeks it seemed we might be looking at failure.

Then our contact at Devon Bank called: “We just got 500 surveys!” At first we did not believe it. As good researchers, we searched for disconfirming evidence by considering alternative explanations of what had happened. Perhaps the 500 surveys had been returned as bad addresses (but surely the list was good—the bank mails statements to it every month). Perhaps the printer had mailed back the overage and these surveys were blank (but we confirmed the printer had only a handful, and had not shipped them back).

Well, it was true: they really did get 500 surveys, and more than 500 additional surveys came in on top of those. Our old-fashioned paper-based mail survey got a 25% response rate, with no additional attempts, outreach, or reminders.

When I shared the final research report with the bank’s executive committee, the CFO jumped in quickly to ask why they got such a great and unexpected response rate. I answered by segueing into the results of the survey: they had strong relationships with customers; their customers really liked them; they had not poisoned the well by nagging with a survey after every transaction.

I should have taken some credit as well. We knew the challenges we faced. We thought through every possible approach and addressed every point of resistance we could anticipate. We brought up-to-date knowledge and expertise, and applied it to old-fashioned techniques in truly novel ways.

Yes, old-fashioned paper surveys are still a viable option. In some cases they may be the only option. If you know what you are doing, you can make them as successful as online surveys—and maybe even beat typical online response rates by 22 percentage points.

Stories from the Versta Blog

Here are several recent posts from the Versta Research Blog. Click on any headline to read more.

A Conference with Real Classes You Can Take

If you want truly substantive short courses to advance your knowledge of market research, you can choose from eight excellent options at an upcoming conference in May.

Secrets of PowerPoint for Beautiful Research Reports

This article describes some “hidden” PowerPoint tricks and shortcut keys we rely on all the time whenever we build research reports with charts and graphs.

Let Your Respondents Mess Up

Survey platforms now come with many technological “solutions” that can prohibit respondents from giving inconsistent data. Here’s why you should not use them.

Testing the Impact of Low Response Rates

A research study compared results from successive points in time until achieving a near-perfect census. Results based on low response rates were just as good.

The Quintessential Robot Survey

Here is a story about a survey that seemed to be designed by robots, for robots, and that was even administered by robots. The result (I hope) was failure.

Use These Dates to Define Millennials

If you are constantly looking up those darn birth years and age bands defining Gen Z, Millennials, Gen X, and Boomers, use this reference chart instead.

Better Marketing Insights from Farmers (Not Egg-Heads)

A central tenet of great market research is to “go get data at the source” because you will get better insights than clever data models built from afar.

100+ Data Sources for Market Research

The U.S. government has 128 agencies collecting valuable and rigorous data that businesses and researchers like you can use for free. Start here to find data!

Decrease Survey Error with Eye-Tracking Diagnostics

If you are about to invest a lot of money in a big survey that really matters, here is an excellent way to test whether it will make sense for respondents.

This DIY Tool Promises Statistically Significant Research

If somebody promises to deliver statistically significant sample sizes, research, or findings, run away fast. They do not know what they are talking about.

Versta Research Just Turned 10

Our 10th anniversary got us looking for great sources of data to explore the 10-year survival rates of new businesses. Here’s the story our data tell.

Versta Research in the News

New Index for HR Management Launched

Versta Research was commissioned by The Standard to develop its Absence and Disability Readiness Index, launched in March 2019, as reported in Business News Daily. A detailed look at the index and the research findings is available in a full whitepaper and infographic.

Research on Cognitive Assessments for the Alzheimer’s Association

Versta Research was hired to conduct surveys of physicians and seniors about their experiences with cognitive assessments for the 2019 Alzheimer’s Disease Facts and Figures report. Findings were reported by Time, the Associated Press, NPR, CNN, and other news outlets.

Narcolepsy Findings at Annual Neurology Conference

Results from our research with patients and physicians about the burdens of narcolepsy for Harmony Biosciences will be presented in May at the 71st annual meeting of the American Academy of Neurology.

Celebrating Our 10-Year Anniversary

Versta Research celebrated 10 years of Turning Data Into Stories™, highlighting its recent work in healthcare, consumer products, financial services, employee benefits, and information technology.
