Versta Research Newsletter

Dear Reader,

One of the next big challenges for market research is to make sure that all our cool technologies actually work for all the people who need to use them. That includes people with various disabilities who use assistive technologies so that they, too, can participate in online surveys.

If you are a consumer-facing company deploying online surveys of your customers, developing ADA accessible surveys will likely be on your agenda in the next couple of years. At Versta Research we have been developing capabilities for accessible surveys over the past year, and in this newsletter we share some of what we learned to help you along the path.

Other items of interest in this newsletter include recent stories from the Versta Research Blog, news coverage of our research, and news about our upcoming presentation on strategies for quick and nimble research at the annual LIMRA Marketing and Research Conference.

One other piece of news: Will Heriford officially joins Versta Research as a research analyst in May. Will recently earned an M.A. in sociology at DePaul University with a thesis involving ethnographic research on graffiti in Chicago. His work has focused on identity, status, and social boundaries, all of which pertain to multiple areas of consumer and B2B research we do for clients. We are delighted to have him joining us.

As always, feel free to reach out with an inquiry or with questions you may have. We would be pleased to consult with you on your next research effort.

Happy Spring,

The Versta Team

How to Make a Survey ADA Accessible

It’s oddly gratifying when all the disruptive innovation, shiny technology, and fast-paced we’re-changing-the-world hype in market research screeches to a halt because the industry left out a crucial piece: people.

The Americans with Disabilities Act is giving our industry a run for its money and for good reasons. Most online surveys—a crucial forum through which customers now provide feedback to companies and the organizations that serve them—don’t work so well for people with disabilities.

With the need for survey accessibility growing and with new requests from clients, we set out to develop robust capabilities to conduct ADA-conforming research. It took six months of intense study and development, plus a decent investment in customized programming. We have tested, revised, and refined all of our research protocols, working with detailed input and testing from an accessibility consulting firm.

Now we can do it. As of January 2016 we are achieving high scores via third-party testing of our surveys. We have a customized process and internal guidelines that allow us to design and implement accessible surveys efficiently and economically.

What does it take to achieve robust capabilities for accessible surveys? The same skills it takes to do great research: Eagerness to learn and apply new information. Attention to detail. A willingness to go beyond traditional training.

Here are some of the most important highlights of what we learned in designing accessible surveys, and what you will need to know if you’re headed down this path, too.

Know Your Guidelines

The best place to start is with the Web Content Accessibility Guidelines (WCAG) 2.0, published by the World Wide Web Consortium, the group that establishes and maintains international standards for the web. WCAG sets “guidelines that specify how to make content accessible, primarily for people with disabilities—but also for all user agents, including highly limited devices, such as mobile phones.” It is an ISO standard.

These guidelines may seem boring, but they are essential. They lay out specific, testable principles for making online content accessible, regardless of the technology or programming platform being used. The guidelines focus on four areas: (1) Is the content perceivable? Can a blind person, for example, navigate through all content and instructions with an audio screen reader? Can a deaf person easily read captions for video or audio content? (2) Is the content operable with alternative input devices, like keyboards or sip-and-puff technologies? (3) Is it understandable, with clear instructions and easy methods for correcting mistakes? (4) Is it robust for use with all types of assistive devices?

Know Your Question Types

And that brings us to surveys. Standard survey questions—those plain old-fashioned radio buttons or select boxes—are (relatively) easy to design for accessibility. Fancy interactive questions are not. So in the very beginning stages of writing a questionnaire, think carefully, within WCAG limits, about how a question can be laid out. For now, drag-and-drop functionality is out. That means no card sorts or slider scales. Numeric entry boxes are good, but auto-sum functionality is difficult. Interactive rank sorts are out. Image maps are out. Timed questions are out.

At Versta Research, we work with six question types for accessible surveys: drop down, single select, multiple select, grids, numeric box, and text box. And you know what? They work just fine. They are simple, fast, easy to use, and exactly what most people expect to see. Moreover, there is emerging research to suggest that survey respondents prefer simple survey formats over dynamic ones.

Know Your Platform

Know your platform, and know your programmers, too. If you outsource survey programming, or if you use a DIY platform, you will need to know far more about your survey platform than you have in the past. We were excited last year when the platform we use announced that its new version roll-out would be ADA accessible. Its developers believed that a few coding and tagging techniques for screen readers would make surveys accessible. Well, our first attempt failed miserably in testing.

But over the course of four months, we worked closely with them to fix it. Armed with data, results of failed tests, and coding instructions from a third-party testing firm, we gained access to their top-tier programmers, and we paid for customized templates that gave us much more robust accessibility. Given the complexity of the task and the trade-offs involved (focusing more on simple question types instead of dynamic interactive ones), it will take years before most survey platforms achieve these levels of accessibility. In the meantime, it is up to you to step in and make that happen.

Know Your HTML

And even if you know your platform and your programmers, you probably need to know a bit of HTML as well. We are fortunate to have staff who relished the challenge. Online surveys are input forms, but they are generated in a more complex way than simple HTML forms. The challenge is to make survey forms work correctly with assistive technologies. And this, it turns out, is far more difficult than current protocols for web accessibility suggest.

Pardon this brief digression into coding: The challenge, we discovered, is to specify all inputs and labels by which respondents understand and record their answers to survey questions. Form field labels and constraints must be programmatically associated with the inputs. It’s not enough that they appear “on screen” near the input box. Their relationship “in the background” must be clearly established, like links in a chain, from question, to instruction, to choice label, to input button. Even with a solid platform and newly customized templates for accessible surveys, you will need to check, test, and sometimes modify the HTML coding to make those chains of labeling work.
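To make that chain of labeling concrete, here is a minimal, hypothetical sketch of one accessible radio-button question (the question wording, ids, and hint text are invented for illustration, not taken from any actual survey):

```html
<fieldset>
  <!-- The legend programmatically ties the question text
       to every input inside the fieldset -->
  <legend>How satisfied are you with our service?</legend>
  <p id="q1-hint">Select one answer.</p>

  <!-- Each label is associated with its input via matching for/id
       attributes; aria-describedby links the instruction, too -->
  <input type="radio" name="q1" id="q1-1" value="1" aria-describedby="q1-hint">
  <label for="q1-1">Very satisfied</label>

  <input type="radio" name="q1" id="q1-2" value="2" aria-describedby="q1-hint">
  <label for="q1-2">Somewhat satisfied</label>
</fieldset>
```

With this structure in place, a screen reader can announce the question, the instruction, and the choice label together when a respondent reaches each radio button, rather than announcing a bare, unlabeled input.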

Know How to Test

Ultimately what matters for accessibility is whether people with physical limitations can fully participate in a survey if they want to. Is there enough information on the screen and behind the scenes (and is all that information consistent, logical, and thorough) to allow alternative devices like modified keyboards or screen readers to navigate the survey seamlessly? Besides working with a third-party testing firm, there is a great deal you can test on your own with free tools. For example:

  • Firefox and Chrome have web inspector tools that will display the HTML code of your survey pages
  • W3C Markup Validation Service will test the basics of each survey page to ensure you have “robustly compatible code”
  • NVDA (NonVisual Desktop Access) is a free screen reader you can use to test how a blind or vision impaired person would navigate through your survey
  • Many browsers and desktop applications offer color contrast checkers, re-sizable text, and high contrast mode for WCAG compliance testing tasks
  • Keyboard navigation—that is, answering your survey without ever touching your mouse—will also help identify gaps and errors in the code behind what you see on a screen
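Some of these checks can even be partly automated. As a rough sketch (a hypothetical helper, not one of the tools listed above), this short Python script uses only the standard library to flag survey inputs that lack a programmatically associated label—the kind of gap a screen reader would expose:

```python
from html.parser import HTMLParser

class LabelAuditor(HTMLParser):
    """Collects <input> ids and <label for="..."> targets so we can
    flag inputs with no programmatically associated label."""
    def __init__(self):
        super().__init__()
        self.input_ids = []
        self.label_targets = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "input" and attrs.get("type") != "hidden":
            self.input_ids.append(attrs.get("id"))
        elif tag == "label" and "for" in attrs:
            self.label_targets.add(attrs["for"])

def unlabeled_inputs(html: str) -> list:
    """Return the ids of visible inputs that no <label for=...> points to."""
    auditor = LabelAuditor()
    auditor.feed(html)
    return [i for i in auditor.input_ids
            if i is None or i not in auditor.label_targets]

# A hypothetical survey page fragment: the second radio button has no label.
page = """
<form>
  <label for="q1a">Very satisfied</label>
  <input type="radio" name="q1" id="q1a" value="1">
  <input type="radio" name="q1" id="q1b" value="2">
</form>
"""
print(unlabeled_inputs(page))  # → ['q1b']
```

A script like this only catches missing label associations; it is no substitute for hands-on screen-reader and keyboard testing, but it can triage long surveys quickly.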

Accessibility All Around

New technologies have put market research tools within the grasp of more organizations and more professionals than ever before. Sampling, data collection, statistical analysis, data visualization—all of these can now be done more cost effectively and by much smaller research groups.

Market research tools have become far more accessible to those of us building our professional lives around insights and analysis. Now it is time to bring another level of accessibility—the ADA type—full circle back to the people we rely upon for data.

Versta Research is here to help you with all of it. Whether it be design, execution, analysis, or reporting, we can help you make your research more accessible in every sense of the term for the audiences you need to reach and inspire.

Stories from the Versta Blog

Here are several recent posts from the Versta Research Blog. Click on any headline to read more.

Hard-Nosed Managers Use Research More

Empathic managers who adopt the perspective of their customers are worse at predicting what customers want, and they use market research less effectively.

Six Things to Know about P-Values

The American Statistical Association is offering new guidelines (and cautions) about how to explore your data with tests of statistical significance.

Don’t Know Is Not an Option

Research shows that “don’t know” responses usually reflect laziness, not uncertainty. Here are guidelines on when to offer don’t-know options on surveys.

Research Tip: Watch Your Fieldwork

Data collection is a painstaking process that you should monitor every single day with a fieldwork watch list like this, as featured in our recent Versta video.

New Resource for Tons of Great Data

This website aims to make the treasure trove of public data available from the U.S. government (including Census and BLS data) easy to retrieve and visualize.

Convincing People to Take Your Survey

Researchers now find themselves in operational roles as our problems shift from methods and statistics to the difficult task of getting people to respond to surveys.

Why I Love Political Polls

No area of market research gets as rigorously validated as political polling. We learn tons when the polls are wrong, as this industry report demonstrates.

Are You Sick of Drag-and-Drop Surveys?

Interactive formats for surveys, like card sorts and slider scales, boost user engagement at first. But respondents quickly tire of it, and soon find it annoying.

Five Summer Classes to Up Your Research Game

Here are five short courses (just a few intensive days) we recommend for business professionals who need back-to-basics training in survey research methods.

Strong Brands Have Weak Drivers

Having a strong brand increases how many people will validly “straight-line” a customer satisfaction survey, making it harder to discern any key drivers.

Don’t Stop Your Straight-Liners

Allowing respondents to straight-line a survey is valuable because (1) straight-lining is often valid, and (2) when not valid, it helps flag bad data.

9 Is Not Fine with NPS Scores

Customer satisfaction surveys take a troubling turn when customers are badgered and begged to give scores of 10. Whatever happened to real research?

Versta Research in the News

Survey of Lenders on Consumer Credit Scores

Versta Research conducted an industry study about the use of alternative data among lenders who rely on consumer credit data. A summary of findings, an infographic, and a full report are available from TransUnion.

Versta Research to Speak at Upcoming LIMRA Conference

We will be at the annual LIMRA Marketing and Research Conference in Orlando with an invited presentation, “Research Be Nimble, Research Be Quick!”