
Article Synopsis: Listening In on Social Media: A Joint Model of Sentiment and Venue Format Choice

By David A. Schweidel and Wendy W. Moe

Journal of Marketing Research

Published online June 19, 2014

Does brand sentiment vary by social media platform? According to research by David Schweidel and Wendy Moe, the answer appears to be yes. The authors discuss the results of their 2014 study, in which they modeled previously collected data from different social media venues (blogs, Facebook, Twitter, etc.) to determine whether consumer brand sentiment varies by venue type.

In this article, Schweidel and Moe uncover the risks of using social media metrics without accounting for differences across venues. Previous studies had already suggested that what people say in a post is related to where they post. In this study, the authors set out to examine this relationship further and to find out why blogs have the most positive posts, forums the second most positive, and microblogs the least. The result? The article points to two factors: 1) consumers often choose to participate in online communities whose members share their interests and opinions; and 2) the number of characters a venue allows affects the opinions expressed. Because forums and blogs (e.g., Facebook) allow lengthier posts, they permit fuller expression; these venues also expose posters to social dynamics such as peer pressure. On microblogs (e.g., Twitter), where text is limited, consumers tend to post more extreme opinions so that they can convey their perspective succinctly.

The results from the authors’ modeling show that the inferences marketing researchers draw from monitoring social media depend on where they “listen.” For example, based on the dynamics described above, opinions expressed on Facebook tend to be more positive and those on Twitter more negative. Common approaches that either focus on a single social media venue or ignore differences across venues in aggregated data can lead to misleading brand sentiment metrics. The authors conclude: “… the current research demonstrates the potential for social media monitoring to supplement market research programs, but further investigation using both social media and survey data from a range of categories is essential before market researchers can rely exclusively on social media for customer insights.”


 

This synopsis was written by Lynn Croft, independent marketing and market research consultant. With 15 years of experience at companies such as Genzyme, Bayer Corporation, Shire, and Eli Lilly, Lynn has expertise in market research, market analysis regarding product launches, pricing and lifecycle management. 

 

[Want to learn more about social media monitoring, social sample sources, and more? Get a practical perspective on how you can use social media in your market research projects in our 90-minute, live, online Social Media Meets Market Research class. MRA approved for 1.5 hours of PRC credit.]

 


Article Synopsis: The High Price of Customer Satisfaction

MIT Sloan Management Review

March 18, 2014 (Magazine: Spring 2014)

Timothy Keiningham, Sunil Gupta, Lerzan Aksoy and Alexander Buoye

Highly satisfied customers = revenue dollars. Or do they? Some data have shown that the relationship between customer satisfaction and customer spending behavior is surprisingly weak.1 In this article, the authors share their analysis of the relationship between satisfaction and business outcomes, drawing on data from more than 100,000 consumers covering more than 300 brands. The data came from two sources: American Customer Satisfaction Index data (2000–2009), appended with these companies’ stock returns and market shares, and consumer satisfaction ratings and customer spending levels across 315 brands.2

This analysis revealed three critical issues that complicate the link between customer satisfaction and positive business outcomes: 1) there is a downside to continually devoting resources to raising customer satisfaction levels; 2) high satisfaction is a strong negative predictor of future market share; and 3) knowing a customer’s satisfaction level tells you almost nothing about how that customer’s spending will be divided among the different brands he or she uses.

The authors share strategies to align customer satisfaction and profitability that companies should understand and implement as follows:

“Value to the Company vs. Value to the Customer—research and analyze your customers’ satisfaction levels with your product to the product’s profitability.”

“Market Share vs. Customer Satisfaction—begin with an analysis of customers’ satisfaction levels with not only your company but also with your competitors, as well as your and your competitors’ market shares.”

“Satisfaction and Customer Advantage—what really matters is whether or not your customer satisfaction rating is higher for your brand than for competing brands that a customer also uses.”

The authors conclude that increasing satisfaction levels can be a component of a company’s strategy, but perspective is needed. In fact, a company may need to accept lower satisfaction scores from a smaller group of customers in order to increase market share within a larger, less homogeneous group. For researchers conducting customer satisfaction research, this context provides fresh inspiration for weaving conventional satisfaction research together with additional data sources.

References

1 J. Hofmeyr, V. Goodall, M. Bongers and P. Holtzman, “A New Measure of Brand Attitudinal Equity Based on the Zipf Distribution,” International Journal of Market Research 50, no. 2 (2008): 181-202; and A.W. Mägi, “Share of Wallet in Retailing: The Effects of Customer Satisfaction, Loyalty Cards and Shopper Characteristics,” Journal of Retailing 79, no. 2 (2003): 97-106.

2 Some examples cited include: L. Aksoy, A. Buoye, P. Aksoy, B. Larivière and T. L. Keiningham, “A Cross-National Investigation of the Satisfaction and Loyalty Linkage for Mobile Telecommunications Services Across Eight Countries,” Journal of Interactive Marketing 27, no. 1 (February 2013): 74-82; Aksoy et al., “Long-Term Stock Market Valuation”; and others.

 

This synopsis was written by Lynn Croft, independent marketing and market research consultant. With 15 years of experience at companies such as Genzyme, Bayer Corporation, Shire, and Eli Lilly, Lynn has expertise in market research, market analysis regarding product launches, pricing and lifecycle management. 

 

[Are you planning your organization’s first customer satisfaction research? Or looking to refresh an existing program? Learn about goal setting, monitoring strategies, and common challenges in our 90-minute, live online Improving Customer Satisfaction class. MRA approved for 1.5 hours of PRC credit.]

 


The 4 Killer Stats from the ESOMAR 3D Conference


In catching up on market research reading, we stumbled on this little gem from the Question Science Blogspot. In this article, Jon Puleston tells us about some surprising statistics he heard while attending the ESOMAR 3D conference at the end of 2012:

350 out of 36,000 – Porsche culled through 36,000 social media responses and found that only 350 were “useful.” Significantly, all of the comments were processed manually. This suggests that deciphering data from social media could be a poor investment. So, can text analytics software accurately decipher social media comments, and are the comments even worth deciphering? Clearly, this is going to vary by topic, brand in question, and scope. Some brands/keywords attract a lot more “garbage” than others. What we have found here at Research Rockstar is that you have to do some serious testing of your topic/brand name/keywords of interest before you invest significantly in social media analysis.

240 hours – The amount of time one market research firm spent analyzing text from 1,000 Facebook users.

0.18 – A survey by Jannie Hofmeyr and Alice Louw from market research company TNS showed a surprisingly weak correlation between “aided awareness of a brand & purchase activity.” Their research revealed that surveys are routinely constructed incorrectly and contain questions that are incapable of measuring behavior; customers and non-customers of a product should take different surveys to produce relevant results.

50% – Piet Hein van Dam, from digital tracking company Wakoopa, found a 50% discrepancy between the claimed readership of a Dutch newspaper and the readership actually tracked on mobile devices and computers. “Cookie” tracking proves to be largely inaccurate in counting unique visitors and web traffic.

 

[Want access to more market research articles and training materials?

Sign up for the Research Rockstar newsletter: SIGNUP]


An Open Letter to Market Research Software Companies

Idea voting. Prediction markets. Online surveys. Crosstab analysis. Text analytics. Social media research. What type of market research software do you sell?

We teach our students about lots of cool market research options, and whenever possible we like to include demonstrations. In fact, these demonstrations are so loved by our students that many have asked us to add more of them to our market research training classes.

So to those of you selling market research software and tools, consider this an opportunity to gain exposure with a group of career-minded professionals as they take part in our market research training classes. These are people who are spending time on professional development and who work for companies that are investing in their training. Our client base is diverse and includes both market research agency and client-side professionals. Some are newer in their career paths; others have 15+ years of experience and come to us for a “refresher” or skills extension. All are engaged learners.

To have your product considered for use in one of our classes, please email a note to Demos@ResearchRockstar.com.

 


Survey Template: Gauging Brand Perception

What does your target market think of your brand?

How does your target market perceive your brand as compared to your competitors’ brands?

While brand research can be a very complex, exhaustive exercise, in many cases a simple approach may suffice. 

If you plan to do your own brand perception research using online surveys, here are some tips.

How your brand is perceived

For brand perceptions, a quick and easy way to collect data is to ask, “Which of the following words would you use to describe our company?” Then give respondents a list of varied words and allow them to pick up to three. It’s a simple format for respondents, and it yields very useful insights. Do people think of your brand as “smart” and “fun” or “stable” and “safe”? Are your competitors perceived as “friendly” and “creative” or “slow” and “boring”?
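Once the “pick up to three words” responses come back, the analysis is simple tallying. Here is a minimal sketch in Python, using made-up descriptor data (the word lists and respondent answers are purely illustrative, not from any real survey):

```python
from collections import Counter

# Hypothetical responses: each respondent picked up to three
# descriptors for the brand (made-up data for illustration).
responses = [
    ["smart", "fun", "creative"],
    ["stable", "safe"],
    ["smart", "safe", "friendly"],
    ["fun", "creative", "smart"],
]

# Count how often each descriptor was selected across all respondents.
counts = Counter(word for picks in responses for word in picks)

# Report the share of respondents who chose each word.
n_respondents = len(responses)
for word, count in counts.most_common():
    print(f"{word}: {count / n_respondents:.0%} of respondents")
```

Because each respondent can pick several words, the percentages are calculated against the number of respondents (not the number of selections), so they will sum to more than 100%.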

Other perceptions that we commonly seek to measure in research:

  • This is a company that values its customers
  • This is an innovative company
  • This is a company that offers products or services that are a good value (or a good value for the dollar)

These types of brand questions are going to vary by product category and target market. B2B companies will have very different questions than B2C, and so on.

Brand Perception Research, Realistically

In an ideal world, a company would conduct very comprehensive brand perception research. But that kind of time and budget is not always available. With some careful planning, many companies can learn quite a lot from a short online survey.

If you’d like to receive more free Market Research tips, click HERE to sign up for Research Rockstar’s Market Research Newsletter.