How to Fix a McKinsey Survey: 10 Best Practices They Should Have Followed
Last week we showed you a survey that might convince you to think twice about hiring a super high-end consulting firm for your survey research and insights. Fancy business consultants may know a lot about business, but they know little about gathering good data to answer their business questions.
So this week, instead of focusing on how bad that survey was, we reviewed it with an eye toward what they should have done to collect valid and reliable data for their purposes. Here is the list we came up with:
- For rank-ordered questions, limit the list to seven items, at most. Unless the items are extremely simple, it is hard for people to sort and rank them. A survey should almost never involve difficult tasks.
- Simplify language down to the fewest words possible. Eliminate strings of explanatory phrases and multiple words separated with slashes.
- Instead of rank ordering, have respondents choose their top three (or four or five). The aggregate ranking that results will almost certainly be the same (a quick simulation after this list illustrates the point).
- Dump rank ordering unless it’s essential to the statistical analysis. I’ve never met anyone who uses rank-order statistical tests. Almost always they calculate and report simple means.
- Proofread survey text and content for typos, misspellings, and punctuation errors. This sounds obvious, but take another look at last week’s post. Was it outsourced to another country?
- Translate business-speak into simple terminology that respondents use and will understand. Consultants talk about “players” and “brand solutions,” but customers rarely do.
- Offer “don’t know” options (or build in skip paths) for things that respondents may have zero knowledge about.
- Do not ask respondents for data you already have. This survey asked for my name and which continent I am on, suggesting they know little about merging and appending data.
- Make instructions for each question super clear, and test those instructions on respondents in your target audience. Words like “rate,” “rank,” and “score” are often ambiguous.
- Avoid asking respondents to rate every brand on every dimension. The task is too tedious and repetitive, which encourages respondents to quit or satisfice.
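As promised above, here is a minimal sketch of the claim behind the rank-ordering bullets: that a simple “choose your top three” task and a full rank-ordering task tend to produce the same aggregate order. Everything in it is hypothetical and not drawn from the McKinsey survey or last week’s post; the item names, sample size, and noise level are made up purely for illustration.

```python
# Illustrative simulation (hypothetical data): compare the aggregate order of items
# produced by a full rank-ordering task with the order produced by a "top three" task.
import random

random.seed(7)
ITEMS = ["Price", "Quality", "Service", "Speed", "Brand", "Support", "Design"]
N_RESPONDENTS = 500

# Latent "true" appeal of each item; each respondent ranks with individual noise.
true_appeal = {item: 10 - i for i, item in enumerate(ITEMS)}

mean_rank_totals = {item: 0 for item in ITEMS}   # full rank-order task
top3_counts = {item: 0 for item in ITEMS}        # simpler "pick your top three" task

for _ in range(N_RESPONDENTS):
    noisy = {item: true_appeal[item] + random.gauss(0, 3) for item in ITEMS}
    ranked = sorted(ITEMS, key=lambda item: noisy[item], reverse=True)
    for position, item in enumerate(ranked, start=1):
        mean_rank_totals[item] += position       # 1 = most preferred
    for item in ranked[:3]:
        top3_counts[item] += 1                   # counted if in respondent's top three

by_mean_rank = sorted(ITEMS, key=lambda item: mean_rank_totals[item] / N_RESPONDENTS)
by_top3 = sorted(ITEMS, key=lambda item: top3_counts[item], reverse=True)

print("Aggregate order from full ranking:", by_mean_rank)
print("Aggregate order from top-3 picks: ", by_top3)
```

In this toy simulation the two orderings agree in almost every run, especially at the top of the list, which is exactly why the “top three” shortcut is usually a safe substitute for a tedious full ranking.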
None of these best practices is hard, but following them takes experience, training, and thoughtful attention to survey design. That is not what fancy consultants do, but it is what Versta Research does!
—Joe Hopper, Ph.D.