
I am one of 5 market research bloggers who are writing on a common topic.  You’ll also be hearing from Annie Pettit (organizer), Josh Mendelsohn, Bernie Malinoff (Element 54) and Brandon Bertelsen.

This month, we are responding to Bernie Malinoff’s research-on-research showing that the online survey interface itself can have a significant impact on the answers. For example, sliders, drag-and-drop, and other Flash-enabled elements can produce different results than text-based questions asking for the same information, and so they may be a mixed blessing: a better respondent experience but less consistent data. I believe that moving from a text-based to a visual interface can influence results, although I imagine this is less of a factor for constant-sum or choice questions.

The larger issue is online research data quality: accuracy and consistency. Consistency is particularly important because most brand metrics from survey research are attitudinal measures intended to be compared against a normative database or trended over time. Some have latched onto representativeness as the main linchpin of data quality, but the following graphic shows that there are many equally important influencers that need to be managed.


If any of these factors are not matched from one wave of research to another, there is a risk that data comparability will be lost. One reason is that people are not effortlessly accessing memories and feelings; they are reconstructing them for the purpose of answering a survey, and the conditions of the survey shape that reconstruction. For example, we know that longevity and survey conditioning matter: if you send someone the same type of survey over and over again, you are likely to get conditioning effects that produce different answers. Note that this graphic includes “Survey instrument” as a variable, which covers question order, length, and scales, and must also include, as Bernie points out, the graphic interface of the question.

Internet research has some huge advantages. It is not only faster and less expensive; it also offers an environment that is more native to our digital, interconnected world. We must not shy away from finding the best ways to harness the more realistic environment that internet research can offer.

Here are a few advantages that the internet offers for research:

  • Interconnectedness.  Life has become an open-book exam in which people can connect with anyone they want or search for information to research a potential purchase.  This aspect of real-life consumer behavior can be mimicked in internet research, especially through communities or prediction-market approaches.
  • Immersive environments. The behavioral economist Prof. George Loewenstein of Carnegie Mellon University cautions that when people are in “cold states” they can’t predict what choices they will make in “hot states”.  The internet offers the ability to create immersive and virtual shopping environments that do a better job of getting a respondent into the right mindset.  Marketing science professor Glen Urban created the “information accelerator”, which is used extensively in automotive research.
  • Sight, sound, and motion.  For example, I remember when we used telephone research for ad tracking: we would ask respondents, in words, if they remembered seeing a commercial that showed XYZ.  Now we can debrand a video and stream it online before asking if they remember seeing the commercial.  Much better.
  • Turnkey collection.  For example, some copy testing firms are automatically testing each and every TV commercial in a category using digital technology.

We need to be both cognizant of the effect of the survey interface and progressive about testing the immersive and visual capabilities of internet research.  I’ll advise the ARF Online Research Quality Council to add survey interface elements to the ARF Quality Enhancement Process; it should be part of the structured conversation between buyer and seller.


Comments

3 Responses to “Getting the most out of online research”

  1. Great post, Joel.

    I wonder how much the mindset difference between online personalities impacts data quality. There is a distinct difference between the ‘online research participant’ and the status quo, and it raises the question: do we ever really get a true read on effectiveness when only a portion of audiences ever respond to incentives to participate?

    What is a representative sample when everyone in the sample is a ‘survey taker’? It’s true that online text-based surveys are often dry, scientific, and boring. Visually engaging surveys that add an interactive element to the participant experience might increase the potential of adding new blood to the respondent base, but ultimately aren’t we really still pulling the same ‘type’ of audience through the gate?

    On mobile, I can see the visual aspect of the survey interface being not just one element of a successful buyer/seller conversation, but a crucial one.

    On the web, consumers are well attuned to the multithreaded nature of ‘pages’, being led here and there by navigation and clicks, but on mobile the experience is significantly different; users tend to be more single-task focused and dislike being yanked from one action to another, especially when the move is from app to browser.

    Moreover, the browser ‘form’ experience on mobile is far from pleasant for consumers to navigate, compounding the negative effect.

    By integrating ‘same context’ immersion and visually engaging mechanics into the survey process within mobile apps, mobile users can be incented and surveyed without interrupting the activity they are engaged in. Within the context of an app, they can also be offered incentives to respond that correlate directly to that activity, for example microcurrency within a mobile social game or app.

    In addition, the novelty of the mobile app experience and a well-integrated interactive survey has a decent shot at pulling in respondents from outside the normal ‘survey taker’ mentality, increasing overall response and improving overall representativeness. Mobile interface elements that make responding to questions easier and more ‘interesting’ further improve this effect.

    Loop Analytics is piloting a new ability to intercept and incent mobile users to engage in brand-lift studies within the mobile app context, with promising initial response, but it remains to be seen how this will play into the overall landscape of sample sizes, response rates, and data quality/consistency.
