
Mobile Ethnography: The New ‘Organic’ Market Research Tool to Try in 2014

What’s the most promising aspect of mobility in market research? Mobile ethnography, not pushing surveys to mobile devices.

Mobile Ethnography: Innovation in Progress

While there are only a few tools available so far, this area is developing quickly. Imagine being able to ask people to research themselves. They can opt in to a research experience using their mobile phones, take pictures and videos of where they are, capture sound bites as they happen, scan barcodes or QR codes of interest, and so forth. Cool? Yes.

So what’s the downside? This market research technique isn’t perfectly controllable. Participants will vary in their adherence to instructions, volume of contributions, and time spent.  There will be inconsistencies, and surprises.

So, like anything else, it’s a trade-off. Yes, there are inconsistencies, but for some research needs, mobile ethnography offers superior speed, stronger respondent engagement, and, ultimately, better insights. It’s not as structured as a “conventional” survey, but that’s OK.

Healthier Market Research?

I like organic produce. But it tends to be more inconsistent in appearance than “conventional” options. Similarly, some new ‘organic’ market research tools (like mobile ethnography) are a bit more inconsistent, but perhaps more nutritious. We researchers need to raise awareness with our clients, be they internal or external, that the flaws of these new methods are largely cosmetic, and that at their heart we’re getting something potentially a lot tastier.

Next steps?

Check out some of the early products. Three are listed below; when you check them out, you will see that they are very different from one another.

  • QualMeetings from 20/20 Research
  • EthOS from EthOS App, a UK-based firm
  • And the folks at MyServiceFellow are offering a free demo (as of January 2014—this may change at any time).

 

[Want to read more about organic market research options? Download our white paper here.]

 


Best Market Research Articles of 2013: Third in a Series of 10

[Research Rockstar interns have written synopses of 2013’s best market research articles, as selected by Kathryn Korostoff. This is the third in our series. This synopsis was written by Research Rockstar intern, Audra Kohler.]

Article: Are you thinking what I’m thinking?

Originally published in: research. (July 30, 2013)

Authors: Rob Egerton and Jeanette Kaye

Have you ever bought something because all of your friends had it? While we may be loath to admit it, our actions are swayed by friends, groups, and the public, perhaps even more than we realize. Because of this, the authors of “Are you thinking what I’m thinking?” argue that market researchers need to go beyond the individual to truly understand consumer behavior. They contend that two particular theories should be used more often in research to explore the dynamics of influence.

Wisdom of Crowds for Market Research

The authors’ first cited theory, the wisdom of crowds, was the theme of a popular 2004 book of the same title by James Surowiecki. The basic premise is that group decision-making or estimation is more accurate than individual decision-making. An example: a group would be more accurate at estimating the number of candy corn pieces in a jar at your annual Halloween get-together than any individual guesstimating separately. Another researcher, Martin Boon, took this conclusion one step further.

This is where the meat and potatoes of the article lie. Boon reworks the theory to predict elections. Based on his research with actual election results, he concludes that averaging a randomly selected sample’s guesses is more accurate than traditional polling methods. His use of the wisdom of crowds theory differed from traditional polling in two clear ways:

  • Individuals were not asked how they were going to vote.  The sample was asked how they thought others would vote.
  • Previous election results were provided to each respondent, which provided a useful context.

Overall, this method proved to be more accurate than traditional polling.
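For readers who want to see the arithmetic behind the “wisdom of crowds” idea, here is a minimal, purely illustrative Python sketch of crowd averaging. The candy corn count, the number of guessers, and the error sizes are invented for the example; they are not taken from the article or from Boon’s research.

```python
import random
import statistics

# Hypothetical illustration of the "wisdom of crowds" averaging idea:
# each person's guess of how many candy corn pieces are in a jar is noisy,
# but the errors tend to cancel when the guesses are averaged.
# All numbers below are made up for demonstration purposes.

TRUE_COUNT = 742          # assumed true number of candy corn pieces
random.seed(42)

# Simulate 200 individual guesses: unbiased but noisy estimates.
guesses = [random.gauss(TRUE_COUNT, 150) for _ in range(200)]

crowd_estimate = statistics.mean(guesses)
typical_individual_error = statistics.mean(abs(g - TRUE_COUNT) for g in guesses)

print(f"Crowd average estimate: {crowd_estimate:.0f} "
      f"(error {abs(crowd_estimate - TRUE_COUNT):.0f})")
print(f"Typical individual error: {typical_individual_error:.0f}")
```

Averaging helps because individual over- and under-estimates tend to cancel out; Boon’s election approach applies the same logic to people’s guesses about how others will vote.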

The Theory of Group Behavior for Market Research

In his book “I’ll Have What She’s Having,” Mark Earls claims that, in shaping decisions, the influence of other people is more significant than the individual decision maker’s own deliberation. But think about it: as market researchers, we are great at knowing the individual and their thought process. Rarely do we research how individuals behave in a group and how they are influenced by that group.

According to Egerton and Kaye, “…recent behaviors to which we can all relate point to how individuals can be encouraged into actions not by their own assessment of what they should do next, but by the actions of those around them.” In his book, Earls cites the London riots of 2011 and the laying of flowers at the sites of traffic accidents or other significant events as examples of group dynamics.

A Powerful Combination for Market Research

By integrating lessons from these two powerful theories, the authors create key market research lessons:

  • Acknowledge.  Realize that there are limitations to looking only at an individual’s behavior. The individual’s behavior is influenced by group dynamics, the authors argue.
  • Explore.  Although this is difficult, the authors encourage researchers to begin mapping out how others influence an individual.
  • Categorize.  Egerton and Kaye cite a TED talk by Derek Sivers, which emphasizes distinguishing the behavior of early adopters from that of followers. This is one way to start categorizing consumer behaviors by group.

 


Best Market Research Articles of 2013: First in a Series of 10


[Research Rockstar interns have written synopses of 2013’s best market research articles, as selected by Kathryn Korostoff. This is the first in our series. This synopsis was written by Research Rockstar intern, Dan Cleveland.]

 

So Many Variables, So Little Time: A practical guide on what to worry about when conducting multi-country studies

Originally published in: RW Connect (ESOMAR), February 12, 2013

Authors: Jon Puleston and Mitch Eggers

On a survey, do you check “yes” as often as someone in India? Probably not! Cultural differences can make multi-country survey results inaccurate. Propensity to agree, untruthfulness, and survey “speeders” all vary from country to country.

By geographic region, the authors report a 47% variance in the propensity to say “yes”, a 16% variance in the propensity to “like” something, a 28% variance to “agree”, a 24% variance to “disagree”, and a 13% variance on neutrality. In other words, responses differ significantly by geographic region. India and China tend to be the most “easy going” on surveys, showing the greatest positivity, while Northern European and North American responses show the greatest negativity. (Imagine how many more Facebook “likes” there are in India!) It is also important to note that even within same-language surveys, responses may vary between countries because words are interpreted differently.

Untruthfulness varies from 3% to 30% across the regions surveyed. Regions with “high levels of corruption” (as defined by the World Bank) proved to have greater levels of untruthfulness. (Yes, people from corrupt nations lie more often!) Untruthfulness, however, is not uniform across all types of questions. Questions involving ownership of “high status” items showed a greater propensity for untruthful responses, whereas questions about “routine” items drew more truthful answers. Untruthfulness can be mitigated by screening out respondents who lie on routine questions.

According to the authors, respondents who think longer about each question give more accurate survey responses. All regions showed a presence of “speeders”, with, on average, an 85% propensity to speed on at least one question. Speeding is curbed by questions that are phrased in a personal context and are dissimilar in construction. Speeders generally answer positively, speed on more complex questions, and speed on questions with “natural disagreement”.
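To make the two quality-control ideas above concrete (trap questions that catch untruthful respondents, and timing checks that catch speeders), here is a rough, hypothetical Python sketch. The field names, the fake-brand trap question, and the three-second threshold are all invented for illustration; the article does not prescribe them.

```python
from dataclasses import dataclass

# Hypothetical data-quality screen: drop respondents who claim ownership of a
# fake "routine" item (a truthfulness trap) and flag implausibly fast answers.

@dataclass
class Respondent:
    respondent_id: str
    owns_fake_brand: bool        # trap question: the "brand" does not actually exist
    seconds_per_question: float  # average answer time across the survey

MIN_SECONDS_PER_QUESTION = 3.0   # assumed speeding threshold (hypothetical)

def screen(respondents):
    """Return (kept, removed) lists after basic data-quality checks."""
    kept, removed = [], []
    for r in respondents:
        if r.owns_fake_brand:                               # failed the routine-item truthfulness trap
            removed.append((r, "failed routine-item check"))
        elif r.seconds_per_question < MIN_SECONDS_PER_QUESTION:
            removed.append((r, "speeder"))
        else:
            kept.append(r)
    return kept, removed

sample = [
    Respondent("r1", owns_fake_brand=False, seconds_per_question=8.2),
    Respondent("r2", owns_fake_brand=True,  seconds_per_question=7.5),
    Respondent("r3", owns_fake_brand=False, seconds_per_question=1.4),
]
kept, removed = screen(sample)
print([r.respondent_id for r in kept])                  # ['r1']
print([(r.respondent_id, why) for r, why in removed])   # r2 fails the trap, r3 is a speeder
```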

Remember to think about “yes” people, liars, and speed demons before you take a “perfect” survey international.

The underlying study compared control group and treatment group responses from 11,000 respondents in 15 countries.

 

[Do you have staff that could use some market research training? Check out our self-paced online classes; most are under an hour, and all can be viewed conveniently from any web browser. We’ve even got a self-paced questionnaire design class!!]

 


For Market Research Career Success, Embrace “Less is More”

By coincidence, I read two articles this past week on the theme of, “less is more.” These articles were not specific to market research, yet they do apply.

Less is More, for Market Research Credibility

In the February 2012 issue of Inc. Magazine, Twitter and Blogger co-founder Evan Williams promotes the idea of doing less. That is, “If you have too many things to think about, you’ll get to the superficial solution—not the brilliant one.” For us researchers, this is a hard balance. On one hand, we know that the value of analysis seldom comes from focusing on one or two data points; it comes from identifying recurring themes and patterns. Indeed, we often talk about “weaving” together a story from multiple data points. But we also know that, at the other extreme, dumping too much data in a client’s lap leads to disaster: they tune out, stop listening, and even judge us as unable to prioritize or synthesize, which hurts our profession’s credibility. Helping people focus on fewer items enhances how market research is perceived.

Less is More, for Richer Market Research Analysis

In the Sunday New York Times (January 20, 2013 edition), Matthew E. May wrote about “The Art of Adding Through Taking Away.” The article points to the strength of this wisdom through ancient proverbs and, more recently, by quoting Jim Collins, who apparently observed that, “A great piece of art is composed not just of what is in the final piece, but equally important, what is not. It is the discipline to discard what does not fit — to cut out what might have already cost days or even years of effort …and marks the ideal piece of work, be it a symphony, a novel, a painting…”

This is so true for those of us who write market research reports. It is always a challenge to hold back; we find so many interesting and tempting data points in a single study. Yet we know that the discipline to reduce our work to its essence is essential, and it will even help us create more meaningful analyses. A good market researcher will find many interesting things to report; a great one will focus on fewer items and bridge the gap to actionable insights. The restraint is not easy, but it is always rewarded.

 

[Report writing strategies are covered in Research Rockstar’s Project Management class. Next session starts February 28th and meets once a week for 4 weeks.]


Surfing Lessons for Market Research Survey Designers

If you’ve ever seen the movie Forgetting Sarah Marshall, then you are familiar with the line “do less.” Chuck (Paul Rudd) uses this line over and over while teaching Peter (Jason Segel) to surf. Chuck goes a little too far and Peter ends up boogie boarding, but his point is well taken: doing too much will result in failure, though doing too little will leave you boogie boarding. When designing a market research survey, it is important to capture a lot of useful information, but how much questioning is too much for participants to take? Our mantra is, as Chuck advises, to do less.

Doing Less Will Minimize Your Survey Drop Out Rate

In any market research survey, some participants will drop out; it’s just the nature of the beast. The goal is to minimize this dropout rate so that we can meet our overall sample size goals. A well-written questionnaire can keep the dropout rate below 3%, the target for most market research surveys. People will be more likely to complete your survey if it is simple: that means asking clear questions, offering realistic answer options, and using easy scales.
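The dropout arithmetic itself is simple; here is a tiny Python sketch of the calculation against that 3% target. The respondent counts are made up for illustration.

```python
# Minimal sketch of the dropout-rate arithmetic. The counts below are
# invented; the 3% target comes from the post.

TARGET_DROPOUT_RATE = 0.03

def dropout_rate(started: int, completed: int) -> float:
    """Share of respondents who began the survey but did not finish."""
    return (started - completed) / started

rate = dropout_rate(started=1200, completed=1170)
print(f"Dropout rate: {rate:.1%}")                       # Dropout rate: 2.5%
print("Within target" if rate <= TARGET_DROPOUT_RATE else "Over target")
```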

So how simple does it need to be? How do we balance the desire for simplicity with the project’s data collection needs? While there are many possible factors, here is an often overlooked but critical one:

Know how much effort you can ask from this particular group, and design accordingly. In the case of a non-blind survey, the first step is simple: understand the target audience for your survey. Are they loyal customers or a random sampling of consumers? Loyal customers will be far more willing to spend time on the survey and answer more personal questions than random people will. However, even loyal customers can be scared off by complex surveys.

When crafting your survey, you will surely go through a few iterations. So while revising, be sure to consider the content from a participant’s point of view.  If you were taking this survey, would you feel it was asking for too much information or too much effort? Guiding questions to help you gain a participant’s perspective can be found in Research Rockstar’s class titled “Ask It Right.”  And until then, you can always take a lesson from our surfer friend Chuck, and err on the side of doing less.

 

[Want to learn more about asking the right questions, using the proper scales, and making things easy for participants?  Visit Research Rockstar and take the “Ask It Right” class online in an instructor-led class or at your own pace.]