
Video Will Rock the Market Research World in 2015

I’ve been reading a lot of predictions for market research—the typical pontification we see at this time of the year. Some of these predictions have been inspiring, but too many just rehash the obvious.

Personally, I think there are a lot of interesting theories, a lot of long-term shifts taking place. But as for something we will truly experience in 2015? Something that will really change what we do, how we do it? It’s simple: video. Specifically, video-based methods and video-based reports.

In 2015, we will see a notable spike in the use of video IDIs, video focus groups, video ethnography, and video diaries. These methods are superior to others at truly discovering and gauging consumer emotions, aspirations, and values. Why is that important? We are more aware than ever of the limitations of self-reporting such things (thanks to the popularity of research on irrational decision making and behavioral economics). Yet these are exactly the areas where market researchers are most needed, especially in the face of big data, which increasingly owns the questions of “what” and “how”; we market researchers find ourselves increasingly tasked with “why.”

And the video momentum isn’t just about methods; it’s about reporting as well. Video-based reporting will transition from rare to common. Research buyers will increasingly expect video deliverables, including video reports, montages, and other supplemental assets.

Remember “Video Killed the Radio Star”? Well, for market research, video is killing projects and deliverables that don’t capture emotions, convey authenticity, or tell a compelling story. In 2015, video will be the star.

 

[We are so convinced of this trend, we are putting our money on it. Research Rockstar will soon be launching its first class on video-based research methods. Want to take part by being a tester? Email Info@ResearchRockstar.com and request “Video Class Testing”]

 


Article Synopsis: Quantitative or Qualitative Research Methods, Let’s Go Back to the Basics

Quirk’s October 2014
“Quant or qual, let’s go back to the basics”
By Kevin Gray

Kevin Gray’s article is chock full of tips, reminding market researchers to pay as much attention to “how they think” as they do to what research methods they use. He offers his thoughts on what he calls “research thinking.”

Gray breaks research thinking into specific parts: verifying data, defining relationships, understanding and avoiding data interpretation traps, and probabilities versus categories. In verifying data, researchers must not only uncover flaws in the raw data, but also be wary of inferring cause-and-effect relationships. Additionally, when investigating relationships within data, different statistical methods and models can give different readings. Gray states, “Causation requires correlation of some kind but correlation and causation are not the same.”

When looking at probabilities and categories, Gray cautions the researcher to, “Avoid confusing the possible with the plausible and the plausible with fact. It’s also not difficult, though, to miss something of genuine practical significance that lies hidden beneath the surface of our data.”

Additional tips from the author:

  • Do your homework. Many phenomena have more than one cause.
  • When designing research, first consider who will be using the results, how the results will be used and when they will be used, and then work backward into the methodology. Don’t let the tools be the boss. 
    • This point really resonates; in today’s world, researchers can get distracted by technology that may or may not have merit.  So it is easy to select the shiny new tool even if it is not the right fit.

Two more great tips:

  • Develop hypotheses, even rough ones, to help clarify your thinking when designing research.
    • This may sound obvious, but it is often overlooked. As a result, we have all seen cases of muddy thinking resulting in weak research.
  • Take care not to over-interpret data.
    • Or, as some researchers say, “don’t beat your data to death.”

Gray’s tips are a good reminder to market researchers to be aware of their “research thinking.”

This synopsis was written by Lynn Croft, independent marketing and market research consultant. With 15 years of experience at companies such as Genzyme, Bayer Corporation, Shire, and Eli Lilly, Lynn has expertise in market research and market analysis regarding product launches, pricing and lifecycle management. 


Are Market Researchers Creating the Functional Equivalent of Genetically Modified Food?

Have you ever had the nagging feeling that a research project collected data that was simply not very good? That the respondents weren’t adequately engaged when supplying answers? Or that the method didn’t gather enough meaningful context?

In market research, our convention is to conduct studies that employ imposed calibration.  Our studies often capture and measure attitudes and behaviors, as if they could all be sorted into neat packages. We carefully structure our questions, and in the case of survey research, even our answers. We use quotas, we use weighting.  But are we creating the functional equivalent of genetically modified food?

For some research needs (some, not all), it is time to consider options other than surveys, IDIs, and focus groups—new methods such as webcam research, idea-voting sites, and online projective methods, to name just a few.

But newer market research methods are unpredictable and can compromise on demographic profiling—just like organic produce sometimes has a few more blemishes or less consistent coloring.  And these differences can make us research types uncomfortable. Heck, it makes me uncomfortable. But now that I have had the opportunity to test some of these new methods, I am also excited about their value.

My experiences have also led me to author a short white paper, “Organic Market Research: Avoiding Overly Contrived Data.” Please click here to download it.

 


When Breaking Up (Market Research Interviews) is Hard to Do

In economics there is a term known as “sunk cost.” Investopedia defines a sunk cost as, “A cost that has already been incurred and thus cannot be recovered.” A cost does not have to be monetary, either; it can be thought of in terms of time, resources, or anything else of value to a company. Sound business decisions are made independent of sunk costs. At first glance, this may seem a little ridiculous. If you have put hundreds of hours or thousands of dollars into a project, then you should work tirelessly to make it succeed, right? Well, the brutal reality is that sometimes we just have to accept that a project has gone bad, cut our losses, and move on.

This idea of sunk costs applies to market research interviews (or in-depth interviews, IDIs) as well.  As does the concept of knowing when to walk away.

In-depth interviews can be an immensely valuable research methodology, and many market researchers use this tried-and-true approach (see some of the reasons why in this article from Quirk’s Marketing Research Review). But when conducting IDI research, a lot of time goes into planning before the actual interview is conducted.

Once a firm has spent all kinds of money and devoted countless hours to developing the IDI guide and screening participants, it should not waste a single opportunity, right? Shouldn’t every scheduled IDI be completed fully? Not always: the money and time spent at this point are sunk costs. The goal now is to conduct as many good in-depth interviews as possible. Continuing a bad interview just frustrates the interviewer and wastes time that could be better used elsewhere, so why bother? Unfortunately, in the quest to meet sample size goals and “not waste” sunk costs, too many researchers end up completing bad interviews.

So here are the critical questions: How does one determine what is a bad interview and what is not? How far into an interview can you tell it is not worthwhile? What is the best way to end a bad interview? Each interview is different, and you will have to make a judgment call; however, Research Rockstar’s class on Conducting Research Interviews can provide you with valuable guidelines and tips for handling these unfortunate situations.

[Want to learn 12 valuable steps for stress-free interviewing? Visit Research Rockstar and take their online class on Conducting Research Interviews. This class is available in both a self-paced and an instructor-led format: click here for dates!]

 


Market Research Policies

Do you cringe when you hear the word “policies”? Most people do. After all, policies often mean bureaucracy.  But in the case of market research, clear policies will minimize the risk of data quality headaches, customer over-surveying, ethical breaches and more.

Indeed, a thoughtful, well-communicated set of policies is more critical today than ever before, with so many people conducting ad hoc or “DIY” research. Well-intentioned individuals often make mistakes that could be avoided through awareness of a simple set of company-wide market research policies. Even organizations with central market research departments find it challenging to control “rogue” research—but promoting a set of policies will help minimize the risks.

Below are examples of market research policies that will promote basic, best practices:

  1. Frequency. Over-surveying can lead to customer frustration and, ultimately, poor response rates. Thus, a key policy is to specify how many times a year a single customer can be invited to participate in research. Three times? Five times? There is no “right” answer for all organizations—it varies by customer type. But a rule should be in place. In this way, employees can avoid inundating customers with volumes of survey requests. Of course, this also requires a mechanism for tracking invitations.
  2. Quality. All direct communications coming from your company are indicators of your brand’s quality, and surveys are no exception.  You must ensure that a kind of “quality control” resource exists to ensure that nothing sub-par gets released.  This job includes checking grammar and questioning content and logic.  For example, one common complaint about colleagues who do ad hoc research is that they may ask too many intrusive questions (a big turn off for customers). This resource could be a person, a team, or a defined process.
  3. Permissibility. The best way to prevent unsanctioned surveys is to make sure everyone knows how to request and get approval for market research projects.  Your company can specify what types of research must be done through central market research (if it has such a department) and what can be done by other functional areas.  A simple research request process should be in place so that employees can submit a standard form that can be used to trigger an assessment and approval process.  Too onerous? Then how about a simple policy stating, “Any surveys over 10 minutes in duration must be approved by the central market research (or, if none exists, marketing) department, no exceptions.”
  4. Methods. Company guidelines should state policies for both qualitative and quantitative methods. For example, “All online surveys must be fewer than 30 questions.” Or, “Recruiting customers for in-depth interviews must be coordinated with the VP of sales at least two weeks ahead of time.”  These are just two simple examples, but you get the idea.
  5. Incentives. An incentive policy should include guidelines for types of incentives and under what circumstances they can be given out.  Inform your employees ahead of time about whether or not your company restricts cash incentives or any type of “gifts” to customers.
  6. Solicitation. A strict non-solicitation policy must be in place. Selling “under the guise of research” is entirely unethical and must be avoided. Even the appearance of solicitation can lead to big problems for your company. Surveys must not be used as thinly veiled lead generation mechanisms. [Click HERE to get more tips on survey design.]
  7. Confidentiality. A confidentiality policy will ensure your employees understand how to use research information responsibly and will show your clients that you value their privacy.  Obviously, it is essential that confidential information is protected, so train people on what information is confidential, how it should be stored, and how it should be treated (internally and externally). Another realm of confidentiality lies in what company information is shared in a research study.  Consider rules that will avoid unwanted leaks. For example, a policy may be that any research related to new product concepts must be approved by the VP of marketing.
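For the frequency policy above, the tracking mechanism can be as simple as a per-customer log of invitation dates checked against an annual cap. Here is a minimal sketch in Python; the `InviteTracker` class, the cap of three invitations, and the in-memory dictionary are all illustrative assumptions, not a prescribed implementation (a real system would likely live in your CRM or panel-management platform).

```python
from datetime import date, timedelta

class InviteTracker:
    """Hypothetical sketch: cap survey invitations per customer per rolling year."""

    def __init__(self, max_per_year=3):
        # max_per_year is an example value; set it per your own policy.
        self.max_per_year = max_per_year
        self.log = {}  # customer_id -> list of invitation dates

    def can_invite(self, customer_id, today=None):
        """Return True if this customer is under the rolling 12-month cap."""
        today = today or date.today()
        window_start = today - timedelta(days=365)
        recent = [d for d in self.log.get(customer_id, []) if d >= window_start]
        return len(recent) < self.max_per_year

    def record_invite(self, customer_id, when=None):
        """Log an invitation so future checks count it."""
        self.log.setdefault(customer_id, []).append(when or date.today())
```

In practice, a marketer would call `can_invite` before sending any survey request and `record_invite` immediately after sending one; the point is simply that the policy is only enforceable if every invitation is logged somewhere central.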

Market Research Training Via Policies

While these simple policies may appear obvious to an experienced researcher, it is important to present them to all research-related colleagues. Include policies in employee orientation materials and provide reference materials for all employees who may in any way touch market research—whether it’s the DIY kind or not. Just by raising awareness that there are policies, you will be providing subtle training on best practices.

[Do you have staff that could use some market research training? Check out our online classes; most are under an hour, and all can be viewed conveniently from any web browser.]