Versta Research Blog

About Versta

Versta Research is a marketing research and public opinion polling firm that helps you answer critical questions with customized research and analytical expertise.

Explore industry trends, research methods, and tips for your own research projects in the Versta Research Blog. All opinions are our own, and some may change over time.

First time reader? Check out the Best of the Blog for the most popular posts from almost 10 years of blogging. We’re glad you’re here.

Thinking Neuroscience? Just Use Twitter.

I resist the idea of using Twitter for research because I do not know anyone beyond journalists, PR people, and marketing folks who actually use Twitter. Well, that’s a terrible reason to dismiss a potentially rich source of data, especially because one premise of survey research is that small parts can represent the whole.  My…

Read the rest of this entry
Getting Your Ad to Light Up My Brain

Having just returned from giving a presentation at the Advertising Research Foundation’s annual conference, Re:Think 2015, I learned that ARF’s “Ground Truth #2” is that brands are built in the brain.  As such, the ARF has invested a good deal of time and money over the last decade exploring ways that neuroscience can be applied…

Read the rest of this entry
The Best Place to Learn All Things Data

Every year I peruse the listings of summer college training courses on research methods. Part of it is nostalgia—I loved school. But I’m also looking for courses that will keep me and our Versta employees at the forefront of new knowledge and research techniques. (I didn’t learn R in graduate school, so yes, I took…

Read the rest of this entry
UGH to Sugging and Frugging

One of the cardinal sins of market research is to misrepresent the purpose of research and how it will be used. The colorful terms for two of the most common forms of misrepresentation are SUGGING and FRUGGING. The first of these stands for Selling Under the Guise of research. The nefarious “research” firm calls…

Read the rest of this entry
How Would Hemingway Present Your Research?

I’m working on a presentation with a research colleague at Wells Fargo for the upcoming ARF (Advertising Research Foundation) Re:Think 2015 conference.  The challenge?  Getting it streamlined and condensed so that we can deliver the whole thing within a strict timeframe.  All the research, all the insights, all the business outcomes from a huge national…

Read the rest of this entry
Stat Testing: A (Too) Easy Crutch

Those of us who do a lot of survey research spend tons of time poring over statistics and reading data tables.  And no matter what all the latest clever tools promise, there is no shortcut to reading page after page after page of data or tables or charts, and discerning the patterns or lack thereof.…

Read the rest of this entry
High Response Rates Hurt Data Quality

An irony of survey researchers’ obsession with high response rates is that higher response rates often hurt data quality.  How can that be?  It happens because aggressive recruiting boosts the participation of people who provide less reliable information.  Two academic articles published in a special issue of Public Opinion Quarterly on “total survey error” nicely…

Read the rest of this entry
A Statistically Significant Cartoon

In Turning Data into Stories we pointed out that numbers have no inherent meaning.  There are no essentially big or small or significant numbers, for example—they only become big or small or significant within the context of specific research questions and a range of possible answers. The same is true for p-values by which we…

Read the rest of this entry
Testing Your Data for Illusions

Here’s a useful way to think about statistical significance. When looking at your data, what’s the probability that it looks like something is there when, in fact, nothing is there? Randomness in data (because of sampling) often creates illusions. So testing for significance is all about measuring whether the patterns we see in our data…
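The idea can be sketched numerically with a permutation test: draw two groups from the same distribution (so any apparent difference is an illusion of sampling), then ask how often random reshuffling produces a difference at least as large as the one observed. This is a minimal, stdlib-only sketch; the group sizes, seed, and trial count are illustrative choices, not anything prescribed in the post.

```python
import random

random.seed(42)

# Two groups drawn from the SAME distribution -- any apparent
# difference between their means is an illusion caused by sampling.
group_a = [random.gauss(0, 1) for _ in range(30)]
group_b = [random.gauss(0, 1) for _ in range(30)]

observed = abs(sum(group_a) / 30 - sum(group_b) / 30)

# Permutation test: pool the data, reshuffle it many times, and count
# how often chance alone reproduces a difference this large or larger.
pooled = group_a + group_b
trials = 5000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    a, b = pooled[:30], pooled[30:]
    if abs(sum(a) / 30 - sum(b) / 30) >= observed:
        count += 1

p_value = count / trials
print(f"observed difference: {observed:.3f}, p-value: {p_value:.3f}")
```

A small p-value would say chance rarely produces a pattern this strong; here, because both groups come from one distribution, the "pattern" is exactly the kind of sampling illusion a significance test is designed to catch.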

Read the rest of this entry