December 2012
Dear Reader,
Whether age begets wisdom is a question neither social science nor market research can answer. But age matters for nearly everything we study; so, rare and foolish is the survey that does not ask about it.
How one asks about age, however, is surprisingly thorny. There are multiple sources of error to contend with and important trade-offs to make in asking one way over another.
We hope that after reading Versta Research’s Winter 2012 Newsletter you will agree that, whether or not age begets wisdom, there is much wisdom to be had from a spirited debate about how to ask age on a survey.
Other items of interest in this newsletter include:
- 5 Secrets of Innovation Success
- Google Beats Gallup in Recent Polls
- Using Mekko Charts to Show Market Share
- What the Polls Show: Research Works!
- Credibility Intervals Are the New Margins of Error
- Got Too Many Elephants in Your Focus Group?
- The “Push-for-Story” Approach to Research
- Taking Google Surveys for a Test Drive
- Five Danger Signs When Fielding a Survey
- Dilbert’s Actionable Deliverables
- Census Bureau to Change Race/Ethnicity Measure
We also highlight some of Versta Research’s recent surveys for the American Chronic Pain Association and Pfizer.
Happy Holidays from Versta Research!
Wisdom in an Age-Old Question
How many statisticians does it take to change a light bulb? One, plus or minus three. And how many survey methodologists does it take to determine a person’s age? Well, no punch line here, but it takes many more than you might think. Recently, a member of AAPOR-net, an online discussion forum hosted by the American Association for Public Opinion Research, posted a query about the best way to ask for a person’s age on a survey. The question generated more discussion than any other topic over the last four years—far more than complex and controversial topics like probability sampling, online surveys, new technologies, and advanced statistical methods.
Along with gender, age is probably the most frequently asked survey question for all types of marketing and social research and across all industries. It seems that by now we ought to know how to ask about it, right? But there is a surprising degree of variation and disagreement among research professionals, and for good reasons.
Four Sources of Error
Our best practice at Versta Research has always been to ask for year born. But the extensive online discussion reminded us that any approach should depend, in part, upon which of four important sources of error need to be minimized for the study being designed. These four sources of error are: age heaping, non-response, erroneous data, and imprecise data.
1. Age heaping results when people give an approximate age rather than a specific age. Because so many of us round to magic numbers like “0” and “5,” data will often show more people who are 40, 45, 50, and 55 than people who are 41, 47, or 52.
Why would anyone give an approximate age rather than a specific age? Lots of reasons. Some people, especially older people, do not know their age without doing a mental calculation. Some respondents prefer not to provide an exact age, as it could more easily compromise privacy. And even if you ask about year of birth rather than age, heaping is still a problem in cultures where people can only approximate the year in which they were born, or where, as one AAPOR colleague noted, “certain calendar years are auspicious or ominous.”
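Heaping is also easy to spot in collected data. As a rough illustration, here is a minimal Python sketch (with made-up ages) of a diagnostic in the spirit of the demographers’ Whipple’s index: compare the share of reported ages ending in 0 or 5 against the 20% expected if final digits were evenly distributed.

```python
def heaping_ratio(ages, lo=23, hi=62):
    """Rough age-heaping diagnostic: the share of reported ages ending
    in 0 or 5, divided by the 20% expected if final digits were uniform.
    A ratio near 1.0 suggests no heaping; higher values suggest rounding."""
    in_range = [a for a in ages if lo <= a <= hi]
    heaped = sum(1 for a in in_range if a % 5 == 0)
    return (heaped / len(in_range)) / 0.20

# Hypothetical reported ages piling up at 35, 40, 45, 50, 55
reported = [40, 45, 45, 50, 50, 50, 55, 41, 47, 52, 38, 60, 44, 35, 35]
print(f"Heaping ratio: {heaping_ratio(reported):.2f}")  # prints 3.33 here
```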
2. Age non-response occurs when people refuse to answer a specific question about age even though they are willing to answer most other survey questions.
If mental calculation is required, respondents may feel the burden is too great. Privacy and survey anonymity are important concerns as well, especially if a study is focused on a small population. Why? Because specific ages, combined with other data such as gender, marital status, and occupation, can make it easy to identify individuals even if no personal identifiers are asked for.
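The identifiability risk is easy to demonstrate: count how many respondents share each combination of demographic answers. A minimal sketch in Python, using entirely hypothetical records:

```python
from collections import Counter

# Hypothetical respondent records: (age, gender, occupation)
respondents = [
    (52, "F", "teacher"), (52, "F", "teacher"), (52, "M", "engineer"),
    (47, "F", "surgeon"), (34, "M", "teacher"), (34, "M", "teacher"),
]

cell_sizes = Counter(respondents)
unique = [combo for combo, n in cell_sizes.items() if n == 1]
print(f"{len(unique)} of {len(respondents)} respondents are uniquely "
      f"identifiable from age, gender, and occupation alone")
```

Collapsing exact ages into ranges makes these one-person cells far less common, which is part of why ranges feel less intrusive.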
3. Erroneous age data can stem from several sources, including both respondent error and survey design ambiguities.
For online surveys, drop-down menus are especially prone to mis-selection, and those errors have serious implications when respondents are offered numeric ranges rather than specific numbers: a single slip moves a respondent into an entirely different age category. In addition, some respondents make calculation errors, and a person with privacy concerns and no option to decline or skip the question may enter false data to get past it.
Respondents may also misinterpret a question about age. When asking year born, for example, some will interpret this as a request for age, and answer “47.” Or maybe they did not misinterpret, but instead left off the first two digits of 1947. It is impossible to know without following up or, if the survey is online, without constraining the possible answers within specific numeric ranges (easy to do, but rarely done!). Oddly enough, some respondents will even misread “In what year were you born?” as a request to provide the city, state, or country in which they were born.
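Constraining the possible answers is indeed easy to do. Here is a minimal Python sketch of the kind of range check meant above; the function, rules, and messages are illustrative, not any survey platform’s actual API:

```python
def validate_birth_year(raw, survey_year=2012, max_age=110):
    """Accept only a plausible four-digit birth year, flagging the two
    common errors: entering an age ("47") or a year missing its century."""
    try:
        year = int(raw.strip())
    except ValueError:
        return None, "Please enter a four-digit year, such as 1947."
    if year < 1000:  # looks like an age or a truncated year
        return None, "Please enter all four digits of the year you were born."
    if not survey_year - max_age <= year <= survey_year:
        return None, f"Please enter a year between {survey_year - max_age} and {survey_year}."
    return year, None

print(validate_birth_year("47"))    # rejected: age or truncated year
print(validate_birth_year("1947"))  # accepted: (1947, None)
```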
4. Imprecise age data is the rule rather than the exception in nearly all surveys, even if we manage to avoid the other sources of error. The reason? Few surveys ask for an exact date of birth.
But not knowing an exact date means that some data will be one year off when classifying respondents into age groups or birth cohorts. For example, if we ask for year of birth and then calculate age, the calculation will be one year off for respondents whose birthdays come later in the year. Likewise, if we ask for age and then derive year born, there are two possible birth years for each respondent, because we do not know whether the birthday falls earlier or later in the calendar year.
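The one-year ambiguity is easy to make concrete. A minimal sketch of the arithmetic in Python:

```python
def age_from_birth_year(birth_year, survey_year):
    """Without a full birth date, age at survey time is one of two values,
    depending on whether the respondent's birthday has passed yet."""
    return survey_year - birth_year - 1, survey_year - birth_year

def birth_year_from_age(age, survey_year):
    """Conversely, a reported age maps back to two possible birth years."""
    return survey_year - age - 1, survey_year - age

# A respondent born in 1970, surveyed in 2012, is either 41 or 42;
# a respondent reporting age 42 in 2012 was born in either 1969 or 1970.
print(age_from_birth_year(1970, 2012))  # (41, 42)
print(birth_year_from_age(42, 2012))    # (1969, 1970)
```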
Options for Asking Age
Alas, there is no way to minimize all four sources of error at once. It is essential, therefore, to know during the design phase how the data are going to be used and analyzed (see Versta Research’s white paper, The Art of Asking Questions). A tracking study, for example, may require nothing but an approximate age grouping, whereas a cohort or generational analysis will likely require an exact year born. In short, each of these common ways of asking age involves trade-offs:
“How old are you?” or “What is your age?”
Strengths
- Provides appropriate detail for most studies
- Aligns with our usual interest in numeric age (vs. generation or birth cohort)
Weaknesses
- A direct question may offend some respondents
- Specific ages may raise privacy concerns, especially with small studies
- Higher item non-response than other ways of asking
- Subject to respondent error and age heaping
“In what year were you born?”
Strengths
- Provides appropriate detail for most studies
- Easier to answer and more accurate than providing age
- Feels less intrusive to respondents who are sensitive about age
- Directly identifies birth cohort
Weaknesses
- Age must be calculated, which will be imprecise for some
- Higher item non-response than asking for a range of ages
- Subject to age heaping in some cultures
- Requires careful instructions and programming to avoid data entry errors
“What is your birth date?”
Strengths
- Provides detailed data for all types of analysis
- Provides exact numbers to accurately derive age and birth cohort
Weaknesses
- High burden on respondents
- Significant concerns about privacy and data protection
- Highest item non-response
- Requires careful instructions and programming to avoid data entry errors
“What age range are you in?”
Strengths
- Respondents can answer quickly and accurately
- Least intrusive of all age questions
- Highest item response rates
Weaknesses
- Less detailed information limits data analysis
- Assumes knowledge beforehand of optimal age categories
- An inaccurate response will have a bigger impact on data analysis
- Drop-down menus invite data entry errors
Knowing Why Age Matters
Which age question should you choose? It all depends on details such as: (a) why the research is being done, (b) how and why age data will be important, (c) how age data will be used, analyzed, and reported, (d) who will be responding to the survey, and (e) whether the survey is being fielded online, by phone, in person, or on paper . . . the list goes on. The wisdom in the recent debate among survey methodologists is that how one asks depends on myriad aspects of the research that too many overlook.
Few questions seem simpler than asking “How old are you?” But anticipating the right approach and then untangling the errors, ambiguities, and missing data in the responses that come back can be amazingly complex. A wise researcher is one who poses hard questions about the research itself before feeling satisfied with even the simplest survey question.
Stories from the Versta Blog
Here are several recent posts from the Versta Research Blog. Click on any headline to read more.
5 Secrets of Innovation Success
A recent study in the Journal of Marketing shows that involving customers is key to innovation success, and that research among customers plays a crucial role.
Google Beats Gallup in Recent Polls
Polls for the presidential election give overwhelming evidence that online surveys work. In fact, they now outperform the most rigorous types of telephone surveys.
Using Mekko Charts to Show Market Share
Mekko charts are powerful and intuitive graphics that show market share, and they are easily generated with R software. Here are some examples of how to use them.
What the Polls Show: Research Works!
The research industry won a decisive victory this election season by accurately predicting the outcome even in the face of new challenges and shifting methods.
Credibility Intervals Are the New Margins of Error
This article describes the concept of a Credibility Interval, one of the newest trends in market research, driven by a broader use of Bayesian statistics.
Got Too Many Elephants in Your Focus Group?
One of the trickiest issues in research is knowing exactly whom to include and how to weight their responses. A problem faced by the Gallup poll illustrates the challenge.
The “Push-for-Story” Approach to Research
Some (like Google) believe that survey tools & technology can deliver insights at the click of a button. But their “insights” are merely data in search of a story.
Taking Google Surveys for a Test Drive
Six months ago Google launched a new product called Google Consumer Surveys. Here is a look at what it offers and how well it is likely to perform.
Five Danger Signs When Fielding a Survey
Good data collection means monitoring incoming data every day during fieldwork. Here is an example of a daily field report that will help flag danger before it’s too late.
Dilbert’s Actionable Deliverables
Do you ever feel like you are in a world of non-stop idiot-speak, with terms like “actionable insights”? We do, and so does Dilbert. But there is a way out.
Census Bureau to Change Race/Ethnicity Measure
After splitting Hispanic off from its question about race fifteen years ago, the Census Bureau will likely revise the question again based on new research findings.
Versta Research in the News
Versta Survey Helps Launch Diabetes Campaign
Results from a new survey conducted by Versta Research were published this month in the American Chronic Pain Association’s (ACPA) quarterly publication, Chronicle, as part of a new educational initiative to help patients and physicians talk about diabetic nerve pain symptoms.
Recently Published
PR Tactics Features Versta Perspective on Surveys
Public Relations Tactics has published its November 2012 issue with a full-page article from Versta Research. The article offers suggestions for planning and pitching survey research to news organizations such as the Associated Press and The New York Times.