Article Synopsis: Still Full of Beanz (Effective Data Management)

How does a 150-year-old company stay relevant?

Originally published in Research Magazine July 9, 2014

By Lucy Fisher

Writer Lucy Fisher asks Colin Haddley, director of strategy, insight and capability at Heinz, “How does Heinz, a 150-year-old company, stay relevant with consumers in a competitive market?” His answer: research. Innovation doesn’t just happen. “Generating great ideas is essential in marketing, but to generate these ideas you need to be disciplined in your approach,” Haddley points out. Managing market research data efficiently is the key.

Using a philosophy of test and learn, Heinz looks to multiple information sources for research, including electronic point-of-sale data, Nielsen data, panel data, and social media and brand monitoring. One such panel, Heinz 57, is an online community of 300 consumers that the company uses as a source of customer feedback.

How then does Heinz manage all this data and turn it into successful marketing strategies? With customer insight teams of marketers trained in innovative thinking.  However, a big challenge is integrating the sources of data, and not focusing on any one source of insight. “Penetrating, meaningful insights are derived, felt and observed through a variety of sources of information. It is like building a jigsaw… it all starts with effective data management,” Haddley says.

When all the research from the different sources is pieced together, relevant customer insights emerge: what consumers like, dislike, want more of, or see as a flaw. “But it all starts with effective data management,” Haddley cautions. While the article didn’t specifically address how different types of market research data are integrated (perhaps that’s a recipe too dear to share), it’s still a great real-world glimpse into the value of leveraging multiple information sources.

This synopsis was written by Lynn Croft, independent marketing and market research consultant. With 15 years of experience at companies such as Genzyme, Bayer Corporation, Shire, and Eli Lilly, Lynn has expertise in market research, market analysis regarding product launches, pricing and lifecycle management. 


[Is your quantitative market research data collected & ready for analysis? Now what? Check out Research Rockstar’s real-time, online training program “Introduction to Quantitative Data Analysis” for help getting started. MRA PRC approved for 6 hours.]



Market Research Lessons from Edward Snowden

Love him or hate him, Edward Snowden is a catalyst for change.

How did he do it? And what can we market researchers learn from it?

The Big Reveal Gets Big Attention

Snowden didn’t suggest that there might be an issue. He didn’t send out a 50-slide PowerPoint. He didn’t bury his key point on a slide with 4 other “results.” He had a single message, and it was bold: he stated that there was massive, secret surveillance of US citizens.

Would his message have been as powerful if he had revealed three other accusations at the same time? Probably not.

As researchers, we often have several key conclusions from a study. On one hand, that shows a good ROI; our client paid $X and got several key takeaways. But should there be a “primary” conclusion? Should we be doing a better job of creating a “star” result, something shocking enough to grab attention without being diluted by other points?

Multiple Proof Points Create Legitimacy

Snowden didn’t just leak one document or one piece of data. He had many documents to support his conclusion (different news reports vary as to the exact number, but the range is 20,000 to over a million documents). I bet very few people actually looked at the documents; just seeing that there were lots of them was convincing.

Would his accusations have been as compelling if he only had one document? Probably not.

In research, we often draw conclusions from a single study. We do a survey, focus group, ethnographic study or some other method, and then deliver the conclusions from that single study. It can be hard for clients to trust data from a single study, no matter how compelling. Could we be doing a better job of using multiple data sources?

Market Researchers versus Whistleblowers

Okay, delivering market research results is quite different from releasing classified national security material. But these two lessons do apply, especially if our goal is to have our research truly get attention. If you were a whistleblower, how would you build your case? Prepare for your big reveal? And could you be doing the same things when you are preparing to deliver your next research-based insights?


[New to Research Rockstar? Interested in market research training? Try a free sample class. Sign up today: Sample Class Offer]


Mobile Ethnography: The New ‘Organic’ Market Research Tool to Try in 2014

What’s the most promising aspect of mobility in market research? Mobile ethnography, not pushing surveys to mobile devices.

Mobile Ethnography: Innovation in Progress

While there are only a few tools available so far, this area is developing quickly.  Imagine being able to ask people to basically research themselves. They can opt-in to a research experience using their mobile phones, take pictures and videos of where they are, capture sound bites as they’re happening, scan barcodes or QR codes of interest, and so forth. Cool? Yes.

So what’s the downside? This market research technique isn’t perfectly controllable. Participants will vary in their adherence to instructions, volume of contributions, and time spent.  There will be inconsistencies, and surprises.

So like anything else, it’s a trade-off. Yes, there are inconsistencies—but for some research needs, mobile ethnography offers superior speed, respondent engagement and ultimately insights.  It’s not as structured as a “conventional” survey, but that’s ok.

Healthier Market Research?

I like organic produce. But it tends to be more inconsistent in appearance than “conventional” options. Similarly, some new ‘organic’ market research tools (like mobile ethnography), are a bit more inconsistent—but perhaps more nutritious. We researchers need to raise awareness with our clients, be they internal or external, that the flaws of some new methods are really cosmetic; that at the heart of new methods, we’re getting something that’s potentially a lot tastier.

Next steps?

Check out some of the early products. Three are listed below; when you check them out, you will see they are very different from one another.

  • QualMeetings from 20/20 Research
  • EthOS from EthOS App, a UK-based firm
  • And the folks at MyServiceFellow are offering a free demo (as of January 2014—this may change at any time).


[Want to read more about organic market research options? Download our white paper here.]



Best Market Research Articles of 2013: Third in a Series of 10

[Research Rockstar interns have written synopses of 2013’s best market research articles, as selected by Kathryn Korostoff. This is the third in our series. This synopsis was written by Research Rockstar intern, Audra Kohler.]

Article: Are you thinking what I’m thinking?

Originally published in: research.

July 30, 2013

Rob Egerton and Jeanette Kaye

Have you ever bought something because all of your friends had it? While we may be loath to admit it, our actions are swayed by friends, groups, and the public. Perhaps even more than we realize. Because of this reality, the authors of “Are you thinking what I’m thinking?” argue that market researchers need to go beyond the individual to truly understand consumer behaviors. The authors state that two particular theories should be used more in research to explore the dynamics of influence.

Wisdom of Crowds for Market Research

The authors’ first cited theory, wisdom of crowds, was the theme of James Surowiecki’s popular 2004 book of the same title. The basic premise is that group decision-making or estimation is more accurate than individual decision-making. An example: a group would be more accurate at estimating the number of candy corns in a jar at your annual Halloween get-together than each individual guesstimating separately. Another researcher, Martin Boon, took this conclusion one step further.

This is where the meat and potatoes of the article lie. Boon reworks the theory to predict elections. Based on his research with actual election results, he concludes that averaging a randomly selected sample’s guesses is more accurate than traditional polling methods. His use of the wisdom of crowds theory had two clear distinctions:

  • Individuals were not asked how they were going to vote.  The sample was asked how they thought others would vote.
  • Previous election results were provided to each respondent, which provided a useful context.

Overall, this method proved to be more accurate than traditional polling.
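The crowd-averaging idea behind the candy-corn example can be shown in a few lines. This is a minimal sketch, not Boon's actual method; the jar count, crowd size, and noise level are invented for illustration, assuming each guesser is noisy but unbiased:

```python
import random
import statistics

def crowd_estimate(guesses):
    """Average a crowd's guesses into a single estimate."""
    return statistics.mean(guesses)

# Hypothetical jar of 850 candy corns; each guesser errs randomly.
random.seed(42)
TRUE_COUNT = 850
guesses = [random.gauss(TRUE_COUNT, 200) for _ in range(500)]

crowd_error = abs(crowd_estimate(guesses) - TRUE_COUNT)
avg_individual_error = statistics.mean(abs(g - TRUE_COUNT) for g in guesses)

print(f"crowd error: {crowd_error:.1f}")
print(f"average individual error: {avg_individual_error:.1f}")
```

Because individual errors point in different directions, they largely cancel when averaged, so the crowd's error comes out far smaller than the typical individual's.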

The Theory of Group Behavior for Market Research

In his book “I’ll Have What She’s Having,” Mark Earls makes the claim that the influence of other people weighs more heavily on a decision than the individual decision maker’s own deliberation. But if you think about it, as market researchers we are great at knowing the individual and their thought process. Rarely do we research how individuals behave in a group and how they are influenced by that group.

According to Egerton and Kaye, “…recent behaviors to which we can all relate point to how individuals can be encouraged into actions not by their own assessment of what they should do next, but by the actions of those around them.” In his book, Earls cites the 2011 London riots and the laying of flowers at accident sites and significant events as examples of group dynamics.

A Powerful Combination for Market Research

By integrating lessons from these two powerful theories, the authors create key market research lessons:

  • Acknowledge.  Realize that there are limitations to looking at only an individual’s behavior.  Behavior of the individual is influenced by group dynamics, the authors argue.
  • Explore.  Although this is difficult, the authors encourage beginning to map out how others influence an individual.
  • Categorize.  Egerton and Kaye cite a TED talk by Derek Sivers, which stresses the importance of distinguishing the behavior of early adopters from that of followers. This is one way to start categorizing consumer behaviors by group.



Best Market Research Articles of 2013: First in a Series of 10


[Research Rockstar interns have written synopses of 2013’s best market research articles, as selected by Kathryn Korostoff. This is the first in our series. This synopsis was written by Research Rockstar intern, Dan Cleveland.]


So Many Variables, So Little Time: A practical guide on what to worry about when conducting multi-country studies

February 12, 2013 RW Connect ESOMAR

Authors: Jon Puleston and Mitch Eggers

On a survey, do you check “yes” as often as someone in India? Probably not! Cultural differences in multi-country surveys can yield inaccurate results. Propensity to agree, untruthfulness, and survey “speeders” all vary from country to country.

By geographic region, the authors report a 47% variance in the propensity to say “yes”, a 16% variance to “like” something, a 28% variance to “agree”, a 24% variance to “disagree”, and a 13% variance on neutrality. Interpreting these ranges, there are significant regional differences in responses. India and China tend to be the most “easy going” survey takers, showing greater positivity, while Northern European and North American responses show the greatest negativity. (Imagine how many more Facebook “likes” there are in India!) It is also important to note that even within same-language surveys, responses may vary between countries due to word interpretation.

Untruthfulness variance is 3 to 30% across all regions surveyed. Regions with “high levels of corruption” (as defined by the World Bank) proved to have greater levels of untruthfulness. (Yes, people from corrupt nations lie more often on surveys!) Untruthfulness, however, is not uniform across all types of questions. Questions involving ownership of “high status” items showed a greater propensity for untruthful responses, whereas “routine” item questions drew more truthful answers. Untruthfulness can be mitigated by screening out respondents who lie on routine questions.
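The screening idea can be sketched with a trap question, i.e. a routine item a truthful respondent would not claim. This is a hypothetical illustration of the approach the authors describe; the field names and data are invented:

```python
def screen_untruthful(respondents, trap_key="owns_fictional_brand"):
    """Keep only respondents who did not claim the trap item.

    Claiming ownership of a made-up 'routine' item flags the
    respondent as likely untruthful, so they are dropped.
    """
    return [r for r in respondents if not r.get(trap_key, False)]

respondents = [
    {"id": 1, "owns_fictional_brand": False, "owns_luxury_watch": True},
    {"id": 2, "owns_fictional_brand": True,  "owns_luxury_watch": True},  # flagged
    {"id": 3, "owns_fictional_brand": False, "owns_luxury_watch": False},
]

clean = screen_untruthful(respondents)
print([r["id"] for r in clean])  # → [1, 3]
```

Note that the trap must be a routine item: per the article, respondents lie more about high-status items, so a trap based on one would over-flag.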

According to the authors, respondents who think longer about each question give more accurate survey responses. All regions showed a presence of “speeders”, with an average 85% propensity for speeding on at least one question. Speeders are curbed by questions that are phrased in a personal context and vary in construction. Speeders generally answer positively, speed on more complex questions, and speed on questions with “natural disagreement”.
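One common way to operationalize speeder detection is to compare each respondent's per-question time against the median for that question. The 30% threshold below is an assumption for illustration, not a figure from the article:

```python
import statistics

def flag_speeders(timings, min_ratio=0.3):
    """Flag respondents who answered any question in under
    min_ratio of that question's median response time."""
    n_questions = len(next(iter(timings.values())))
    medians = [statistics.median(t[q] for t in timings.values())
               for q in range(n_questions)]
    return {rid for rid, t in timings.items()
            if any(t[q] < min_ratio * medians[q] for q in range(n_questions))}

# Hypothetical seconds spent per question, keyed by respondent id
timings = {
    "r1": [12, 15, 20],
    "r2": [11, 14, 18],
    "r3": [2, 3, 4],   # rushes every question
}
print(flag_speeders(timings))  # → {'r3'}
```

A per-question median baseline matters because, as the authors note, speeding concentrates on complex questions; a single whole-survey time cutoff would miss respondents who rush only those.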

Remember to think about “yes” people, liars, and speed demons before you take a “perfect survey” international.

The underlying study compared control-group and treatment-group responses from 11,000 respondents across 15 countries.


[Do you have staff that could use some market research training? Check out our self-paced online classes; most are under an hour, and all can be viewed conveniently from any web browser. We’ve even got a self-paced questionnaire design class!!]