Ten Tips for Utilizing Video Content in Online Market Research
By Dale Watts, Market Catalyst
A convergence of available technology, infrastructure, and mainstream communication behavior is making the use of video content possible for a wider range of online qualitative research. As more online qualitative researchers are asking for video content from participants, best practices are starting to emerge from real-world experience. The 10 tips below are drawn from my experience with itracks software and conversations with our early-adopter research clients. The tips have been grouped into three areas:
- Quality of Recruitment/Instructions
- Probing Strategies
- Post-Field Processing of Video Footage
Quality of Recruitment/Instructions
With Facebook, YouTube, and smartphone usage on the rise, the general public is becoming more capable of, and comfortable with, recording and posting video content. Some people, however, still find it challenging, and self-assessment of video-related technical skills can be inaccurate.
Tip 1: If recording and posting videos is a critical part of the research, consider asking participants to upload a video as part of the screening process, perhaps a short clip of themselves introducing themselves and agreeing to participate. It’s a great way to check that microphones, webcams, or mobile devices are set up and working well.
Tip 2: It is also useful to include screening questions that go beyond scaled ratings of comfort or skill level: ask which device the participant will be using and, specifically, how recently they have posted a video.
Tip 3: Ensure that participants have access to technical support during the study to smooth the process for those who are having trouble. This type of access is available 24/7 with itracks’ platforms. Whether this technical contact is your internal staff or a software-as-a-service provider like itracks, it is an important resource that lets the moderator focus on the discussion rather than technical details.
Because of the wide range of video-related skill sets and widely varying interpretations of what words like “video” mean to people, the language used in instructions needs to be very specific. “Take a video of yourself buying some apples in a grocery store” may elicit uploads ranging from a “movie” with titles, transitions, and special effects, covering everything from entering the store through to the cashier, to a 5-second shot of someone picking up an apple.
Tip 4: Specify exactly what you need included in the video. For example, specify where you want the video to start and end for an in-store purchase, and state general expectations as to minimum length.
Tip 5: Think about the practical logistics of taking the video. For example, if you want a view not of what the participant is looking at but of the participant tasting or using a product, you may want to specifically instruct them to have someone else take the video. Otherwise, they may try to film themselves without being able to see what the camera sees, and the quality of the video may suffer.
Probing Strategies
Many qualitative researchers, especially those from a facilities-based focus group moderator background, are used to probing from what the participants say, not what they do. Body language may be taken into account, but often more for insights into group interaction and mood than real world behavior.
This difference in starting points for probes may be obvious to ethnographers coming from a background of non-intrusive behavioral observation. However, video-based research is giving a whole new segment of qualitative researchers access to people making purchase decisions in their actual retail outlets, trying a product in their homes, and other real-world behavior. Sometimes participants verbalize what they are thinking during that behavior, and sometimes they require probing.
I moderated an online discussion board where participants uploaded video of themselves purchasing tomatoes. In the initial online discussion, a participant talked about tomato selection criteria such as being blemish-free, deep red, and round. When looking at his video, however, I noticed that he was picking up many tomatoes with similar-looking blemishes on them, rubbing the blemish, then selecting some and not others. Asking the participant about this seeming inconsistency between his statements about blemished tomatoes and his behavior appeared to open up a floodgate of recollections about the actual purchase decision. In his real world, the ideal tomato often doesn’t exist and the participant needed to make trade-offs as to which imperfections he would accept. He was feeling the blemishes for inconsistencies in firmness right around the blemish. He used this to judge if the blemish was likely to penetrate deeply into the tomato and impact taste. He also started to recall many details related to optimal shapes for various types of tomatoes that didn’t come out in the initial discussion.
Tip 6: Don’t think of research video content as just feedback to be analyzed separately, but as a stimulus for probes: a stimulus that allows participants to more accurately recall and explain real-world behavior.
Tip 7: Look specifically for behaviors in the video that are not verbally described, or that are inconsistent with verbal descriptions.
Post-Field Processing of Video Footage
A consistent “war story” we heard among early adopters using participant video posts was the time it took to analyze the large amount of footage these studies can generate and to create deliverables like video highlight reels from it. There is so much you can do with the footage, especially with full-featured video-editing software, that the scope of the activity can quickly overwhelm the time budgeted for it.
The good news is that this problem can be solved with the right analysis plan, the right tools, and careful management of client expectations.
Tip 8: Clearly define how the video upload footage fits into your reporting process. In most cases, you do not need a smoothly produced video with titles, transitions, graphics, music or special effects. You may simply need a highlight reel that provides clusters of examples of key themes or points made in the report.
Tip 9: Manage client expectations as to the deliverables that will result from the video footage. If the client is expecting a smoothly produced “movie,” make sure the production costs are in the budget!
Tip 10: Use video-editing tools suited to the scope of your deliverable. A full-featured video editor may be overkill and cumbersome (especially if you have limited experience with it) when you are creating simple highlight reels. This is one reason why itracks integrated a research-report-optimized video-editing tool into its recent bulletin board software.
I hope you find these recommendations useful as you explore the use of video content in your own online qualitative research. They are far from a comprehensive list, and the itracks team is continuously learning and sharing new best practices from real-world experience. Please feel free to post comments to this article below. We look forward to hearing from you about your successes and emerging needs in this constantly evolving area.
Dale Watts, Principal of Market Catalyst, has over 25 years of primary marketing research experience and provides services in communications testing, new product development research, and brand development research. www.marketcatalystresearch.com / email@example.com