By Kayte Hamilton & Chuck Rodriguez
In most research settings there's been an ongoing battle over which methodology is best suited for the job: do we need something statistically valid from a quantitative environment, or are we looking for deep context through a qualitative session?
Many researchers feel they must choose one route over the other. On top of that, the last few years have seen an explosion of new tools that blur the lines between traditional methods: eye-tracking and neuroscience tools, automated interviews, and data tracking. At InsightsNow, we find the highest value in our insight mining comes from recognizing the benefits of each resource and cherry-picking a custom solution. An agile approach to research design also lets us explore the collected data in whatever way best serves the objectives.
Disruptive Innovation and Hybrid Research
Disruptive innovation is an interesting concept. To many, innovation means something technological or tech-driven (simply run a search for "innovation" in Google Images and you'll see what we mean). Disruptive innovation, however, is the more basic idea of introducing something new to your processes, and that's exactly what we're doing with our hybrid research designs. The tools and resources aren't new, so to speak, but combining them makes for a unique research solution, one driven by the need to deeply uncover consumer behaviors and motivations without adding time to the fieldwork.
In August, we presented a webinar (you can find the recording here) that explored this hybrid quantitative and qualitative research design through the lens of two case studies. While about 70% of the webinar audience admitted to already using hybrid designs, 100% of the audience felt they learned something new from our presentation.
Case Study #1: Message Testing in the Consumer Technology Space
The first case study we referenced in our webinar centered entirely on messaging communication. To optimize the time our qualitative moderator had with participants across sessions, we leveraged our initial engagement with them in a unique way, activating something more powerful than the often-used high-level homework primer. After weighing several approaches, we decided to go with InsightsNow's Implicit Test to capture reactions toward the messaging.
The benefit of going this route was obtaining those System 1 and System 2 reactions before participants even stepped into a room with us. These responses were provided in a vacuum, so to speak, which afforded us insight not only into the messaging but also into our participants' minds. Those learnings supported more effective probing to identify the key presumptions and gray areas that contributed to the initial reactions toward the messaging.
Leveraging a quantitative resource before our sessions allowed us to:
- Interpret the measured response times to dig for disruptive components in each message: "What slowed you down?"
- Create a hierarchy of performance rooted in behavior and the deep-dive discussion in the room (a simplified sketch of this kind of summary follows the list).
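As a rough illustration only, here is a minimal sketch of how implicit response data might be summarized to flag hesitation and seed a performance hierarchy. The column names, the agreement measure, and the 1.25× hesitation threshold are assumptions we made for this example, not the InsightsNow Implicit Test methodology.

```python
# Minimal sketch (assumed data layout, not the actual Implicit Test output):
# summarize per-message agreement and response time, flag slow messages for probing.
import pandas as pd

responses = pd.DataFrame({
    "participant": ["p1", "p1", "p2", "p2", "p3", "p3"],
    "message":     ["A",  "B",  "A",  "B",  "A",  "B"],
    "choice":      ["agree", "disagree", "agree", "agree", "agree", "disagree"],
    "rt_ms":       [850, 1900, 920, 1400, 780, 2100],  # response time in milliseconds
})

summary = responses.groupby("message").agg(
    pct_agree=("choice", lambda s: (s == "agree").mean()),
    median_rt_ms=("rt_ms", "median"),
)

# Slow responses suggest hesitation: flag messages well above the overall median
# response time as candidates for "what slowed you down?" probing in the room.
overall_median = responses["rt_ms"].median()
summary["probe_for_hesitation"] = summary["median_rt_ms"] > 1.25 * overall_median

# Rank by agreement first, then by speed, to seed a hierarchy of performance.
print(summary.sort_values(["pct_agree", "median_rt_ms"], ascending=[False, True]))
```

A summary like this is only a pointer for the moderator; the hierarchy itself was still grounded in behavior and the discussion in the room.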
Now, one of the questions posed during project creation was why we chose this strategy over the alternatives:
- If Only Quant = We would have known the WHAT (the metrics), without the why or the context behind each message.
- If Only Qual = It would have been difficult to establish a definitive winning proposition, which was one of the project objectives.
- If Multi-Phase = This would have been costly, and the time to actionable results would have been considerably longer. Instead, we utilized a single set of focus group participants rather than two distinct sets of recruited consumers.
Case Study #2: Market Understanding & Consumer Needs in the Health Care Industry
For the second case study we pulled from another industry entirely, with a design meant to shed some light on consumers within the health care space. Our client was interested in understanding perceptions of health care in their region, both broadly and among the handful of giants that occupied the space. One of the underlying objectives was also to obtain direction for their messaging. As it turns out, the still very current health crisis we're all so familiar with was having a considerable effect on consumer preference, and our client's brand was seeing a decidedly negative impact. They needed insight to clearly define new messaging, regain lost loyalty, and reinforce their mission.
In this research design, we again used quantitative work to guide our qualitative discussions, in more ways than one:
- Helped determine the key indicators used to operationalize TWO segments of health care consumers (those who favored our client and those who were apathetic); a simplified sketch of this kind of split follows the list.
- Provided detail used to determine the ‘short-list’ of services and amenities for exploration among well-curated, target segments.
- Provided a market-level baseline read on messaging reception for comparison to both segments to tease out key differences and similarities.
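As an illustration of what operationalizing those two segments can look like in practice, here is a minimal sketch; the indicator names and cutoffs are assumptions invented for the example, not the study's actual screener logic.

```python
# Minimal sketch (invented indicators and cutoffs): assigning survey respondents
# to the two discussion segments from key quantitative indicators.
from dataclasses import dataclass

@dataclass
class Respondent:
    respondent_id: str
    client_rating: int        # 1-10 rating of the client brand (assumed field)
    likely_to_recommend: int  # 0-10 likelihood to recommend (assumed field)

def assign_segment(r: Respondent) -> str:
    """Split respondents into those who favor the client and those who are apathetic."""
    if r.client_rating >= 8 and r.likely_to_recommend >= 8:
        return "favors_client"
    return "apathetic"

respondents = [
    Respondent("r1", 9, 10),
    Respondent("r2", 5, 4),
    Respondent("r3", 8, 9),
]

segments = {r.respondent_id: assign_segment(r) for r in respondents}
print(segments)  # {'r1': 'favors_client', 'r2': 'apathetic', 'r3': 'favors_client'}
```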
While this study began quantitative-first, the bulk of the execution was a five-day online board/short-term community. Throughout the week we rotated quantitative tasks within the qualitative discussion, allowing us to deep-dive into the aggregate results rather than trying to discern individual results all week. This helped ground the discussion and got the consumers thinking in more detail as they caught on to the design.
To focus the qualitative discussion, we used surveys to quantify large "bucket-sorting" tasks for preferred brands and brand attributes, and used the InsightsNow Implicit Test again for brand messaging. Those results were tied into group and individual assignments in the discussion by highlighting key themes in the aggregate. By moving any short-response questions into the surveys, we could tie the larger discussions back to participants' customized survey responses (if needed).
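To give a sense of how individual "bucket-sorting" results can roll up into the aggregate themes that anchored the discussion, here is a minimal sketch; the bucket labels and data are invented for illustration.

```python
# Minimal sketch (invented data): rolling individual "bucket-sorting" picks up
# into aggregate themes to anchor the group discussion.
from collections import Counter

# Each participant sorted brand attributes into a "matters most" bucket.
matters_most = {
    "p1": ["wait times", "bedside manner", "location"],
    "p2": ["bedside manner", "specialist access"],
    "p3": ["wait times", "bedside manner"],
}

counts = Counter(attr for picks in matters_most.values() for attr in picks)

# Surface the most frequently bucketed attributes as discussion themes.
for attribute, n in counts.most_common(3):
    print(f"{attribute}: chosen by {n} of {len(matters_most)} participants")
```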
Other Hybrid Research Design Ideas:
While we've been using our Implicit Test across a wide range of research applications, our executions aren't limited to just this resource.
- Integrate Social Intelligence analysis before you develop any discussion guides or assignments. "What do you already know?" >> "What do you want to know MORE of?"
- Utilize Passive Metering to find out what consumers do, not what they say they do.
- Complex rankings? Use a Tournament or MaxDiff survey design before or after sessions to better organize the data; a simple counting sketch follows this list. You can also use surveys in real time with digital participant packets.
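For the MaxDiff suggestion above, even a simple best-minus-worst count can organize the data before it reaches the room. The sketch below uses invented items and choice sets; real studies often move beyond raw counts to hierarchical Bayes or logit estimation.

```python
# Minimal sketch (invented items and tasks): a simple best-minus-worst count
# for MaxDiff-style data, normalized by how often each item was shown.
from collections import defaultdict

# Each record: the items shown, plus the "best" and "worst" picks.
tasks = [
    {"shown": ["price", "speed", "privacy", "design"], "best": "privacy", "worst": "design"},
    {"shown": ["price", "speed", "privacy", "support"], "best": "speed", "worst": "price"},
    {"shown": ["speed", "privacy", "design", "support"], "best": "privacy", "worst": "design"},
]

times_shown = defaultdict(int)
score = defaultdict(int)
for task in tasks:
    for item in task["shown"]:
        times_shown[item] += 1
    score[task["best"]] += 1
    score[task["worst"]] -= 1

# Normalize by exposure so items shown more often aren't unfairly favored.
for item in sorted(times_shown, key=lambda i: score[i] / times_shown[i], reverse=True):
    print(f"{item}: {score[item] / times_shown[item]:+.2f}")
```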
When deciding if a hybrid research design is best for your business objectives, we challenge you to consider these types of questions. We're also available to help you brainstorm your next successful execution!
- What are you running sequentially that you can combine for more agility?
- What additional information do you find yourself asking for after an execution?
- What do you wish you could learn, yet haven’t been able to decipher?
- In what situations are your answers too generic?
- How often do you find it tough to make decisions after a research project?
- Are you missing a behavior or subconscious cue you need to understand?