If You Want Expertise, Forget Artificial Intelligence
A survey platform we have been using launched a new service promising an “expert review” of our surveys before we send them out for data collection. Unfortunately, it is a gimmick. It turns out that there are no experts reviewing our surveys. An artificial intelligence robot reviews them.
It was interesting to see which best practices this robot was measuring against (survey length, question complexity, number of open-ends, mobile-friendly design, etc.). The company claims to have analyzed “millions of live surveys” so that I would know “all the best practices to follow and all the pitfalls to avoid.” But I can tell you: the artificial intelligence robot served up comments that were not intelligent.
It told me, for example, that I had too many text boxes and offered some helpful advice: “Surveys with 3 or fewer text boxes have higher completion rates.” My survey did not have more than three text boxes; the robot had mistakenly identified numeric entry boxes as text boxes. And I wondered how true the advice was anyway. Does a two-hour survey with one text box have a higher completion rate than a four-minute survey with four text boxes? I doubt it.
It also gave me a red-flag warning: “Make your survey mobile friendly to increase your completion rate.” Mobile-friendliness is primarily a function of the code on the platform’s own end, and in fact I had selected the option in their menu to make the survey fully adapt to different devices. It additionally warned me that not all of my questions had been translated into a foreign language. Why did it assume, incorrectly, that my survey was intended for people who read and write in different languages?
My survey drew lots of these “expert” warnings, and all of them were nonsense. It reminded me of the really bad “best practices” from Google Surveys. Their robot will advise you to always randomize the order of your answer options (do not do this!). Or it will put an “insights” light bulb next to every random, meaningless correlation it discovers in your data.
The irony is that this tool has another component promising to use artificial intelligence to automatically review and clean my data once the survey is finished. It will “automatically detect invalid or low quality responses coming from problematic sources such as bots.”
Hmm, can we turn that AI bot against itself? Will it please turn itself off? I want real research experts, because it does make a difference to have people who know what they are doing lending their expertise to the research process.
—Joe Hopper, Ph.D.