Return on Behavior Magazine
Home for marketing and customer service professionals




September 26th, 2009

Market Research - Making It Useful, Not Just Interesting

A wise but anonymous marketer once said that a market research report that gets described as “interesting” has failed. It’s only when it’s “useful” that it gets the pass mark.

After all, what’s the point of interesting research if it can’t be put to use?

The sad truth is that most market research is not very useful and more often than not ends up as a doorstop in the marketing manager’s office. Not to mention that large-scale research with qual and quant phases is damned expensive.

It’s also easy to forget the hassle it puts customers through. Rarely does the committed customer who responds to the survey hear anything back or see any tangible difference. No matter how loyal they are, next time a researcher comes knocking, ringing, or emailing, they’re likely to say no.

We put poor usefulness down to two factors:

  1. Poor explanation: Survey research doesn’t explain much, but to be actionable the research must explain why things are happening.
  2. “Dragnetting”: This is where the users of the research are part of the problem. Too little work is done before and after the research to get ducks in a row to ensure something gets done. The common dragnetting attitude is “let’s do some research, see what it tells us, and then decide what to do.” In other words, fuzzy prep leads to fuzzy results.

Poor Explanation

So often, companies find that market research results do not align with frontline reality and financial results. Sampling error, poor response rates, and poor questionnaire design combine to produce results that fluctuate wildly and leave the client with no explanation of why.

With no explanation for its results, the hapless research company is left floundering, trying to justify itself. Survey results get “taken with a grain of salt” by managers; that is, they accept the results that suit them and ignore those that don’t.

Despite this, many companies are addicted to expensive large-scale sample surveys, valiantly trying to use the results to measure the success of their efforts and to guide decision-making.

The question is: can research users wean themselves off those volumes of statistics and graphs? Doing so requires letting go of market research as a substitute for common sense, focusing on what the research is meant to achieve, and committing to acting on the voice of the customer.

A “longitudinal” approach to research provides far greater explanation than the traditional “cross-sectional” big-sample method. The concept is to look along a customer relationship or experience (hence “longitudinal”) by re-contacting the same respondents to see whether they think things have changed, how, and why.

In contrast, typical market research depends on samples (cross-sections) and so talks to different people each time. The opportunity to explain change by asking the same customers once and then again later is lost, so complicated statistics are used to paper over the gap.
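
To make the contrast concrete, here is a minimal sketch (in Python, with purely hypothetical customers and scores, not data from any real study) of the two approaches: independent cross-sections can only show that an average has moved, while a re-contacted panel shows whose experience changed and flags exactly who to ask for the reason.

    # Minimal sketch with hypothetical data: cross-sectional vs. longitudinal.

    # Cross-sectional: two waves, two different samples of customers.
    wave1 = {"cust_a": 7, "cust_b": 5, "cust_c": 8}
    wave2 = {"cust_d": 6, "cust_e": 8, "cust_f": 7}
    mean1 = sum(wave1.values()) / len(wave1)
    mean2 = sum(wave2.values()) / len(wave2)
    print(f"Cross-sectional shift in average score: {mean2 - mean1:+.2f}")
    # The average moved, but we cannot say whose experience changed, or why,
    # because different people answered each time.

    # Longitudinal: the same panel answers in both waves.
    panel_then = {"cust_a": 7, "cust_b": 5, "cust_c": 8}
    panel_now = {"cust_a": 8, "cust_b": 4, "cust_c": 8}
    for cust, before in panel_then.items():
        after = panel_now[cust]
        if after != before:
            # In a real panel, this is the cue to go back to the customer and ask:
            # "What changed for you since last time, and why?"
            print(f"{cust}: {before} -> {after}  (follow up for the reason)")

The point is not the arithmetic but the design: the panel turns “the score moved” into “these particular customers’ scores moved, and here is what they told us about why.”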

Take the example of a service business that has a strategy to differentiate itself by having staff provide insights and added value at every customer encounter. Rather than commissioning a quarterly random survey of customers and waiting to see whether the customer satisfaction scores move, it recruits a rolling panel (one of many tools in the longitudinal research armory) of customers who agree to be re-contacted:

  • Through re-contacting the same customers and comparing their answers this time and last time, real changes in their experience are measured and validated and reasons understood—what changes have they noticed in the staff and what do they like or not like about what they experience?
  • Emerging issues are immediately communicated to decision makers in actionable language, enabling rapid response—what can staff do now to add value?
  • Intentions are tracked against reality and changes in attitude correlated with individual customer profitability—is the strategy actually contributing to winning more business and growing the bottom line?
  • Customers have the opportunity to provide feedback, rather than being forced to respond to batteries of rating statements, many of which may be irrelevant to them—maybe they do think staff have become more knowledgeable, but is the constant contact becoming irritating?

Since useful research explains patterns in a simple fashion as well as describing them, another benefit of this approach is short, focused questionnaires: lean and smart. Imagine a five-question survey that takes five minutes and delivers more than the typical questionnaire 10 times its size! It costs less, it is easier on customers, and it shows them they are being listened to.

Dragnetting

In the absence of robust processes before and after the research itself, many companies end up with unwieldy and poorly focused results. The telltale sign is a massive “dragnet” questionnaire—that is, one designed to dredge out any information that might be there.

It’s unfair to criticize market research managers, who do the best job they can to put a good brief together for the research company; the problem is that both parties see themselves as information providers, not change makers. Even if they had the skills to address the dragnetting problem, they don’t usually have the mandate.

The underlying problem is at the senior-management level, where there’s a lack of “strategic logic” in the area to be researched.

If there is little agreement around deeper beliefs about the market and customer dynamics, how can the research team decide what to leave out? As the old adage goes, “Good strategy is as much about what doesn’t get done as what does get done.”

The “two birds with one stone” strategic logic is a good example (see the MarketingProfs.com article by Price and Schultz for more details). A senior management team needs to be clear about its customer management beliefs. “Two birds with one stone” makes the clear argument that a company must simultaneously attend to the basics that annoy customers and focus on one point of difference (a “spike”) to address customer ambivalence—simple, compelling, and powerful.

If that is what senior managers believe—and that’s confirmed in a definite project process step before the questionnaire is developed—leaner and smarter research configured around those beliefs becomes possible. A strategically coherent questionnaire with fewer questions is less onerous on the customers who have to answer it and tells a much more focused and compelling story to the users of the research inside the company.

To avoid dragnetting, managers commissioning market research should also “write the report first.” This is a simple pre-survey process that encourages involved managers to imagine (and record) what they think the research is going to say and what they would do to respond to the most likely research outcome scenarios.

There are two big benefits from doing this in a more structured fashion than usual:

  • First, it deals with the “confirms everything we already knew” syndrome. Ever been to a research presentation where people in the audience say the results confirm everything they knew already, when it’s pretty clear that there’s precious little agreement among the group about what they expected? If the audience is simply asked to second-guess the results beforehand, then it is very “useful” to see how the actual results map against what was expected. The typical result: The expectations of the group as a whole are wildly disparate and more people are wrong than right.
  • Second, some preplanning can go into implementation. Most market research fails because nothing gets done, despite best intentions. This happens because senior managers as a group don’t discuss and agree early enough on how they would respond to each of the likely research result outcomes, should they arise. There are two parts to this preplanning:
    1. Figuring out what to do. A work process that uses the top-down, bottom-up principle is usually best. Senior management provides a guiding brief for those charged with figuring out what to do. That provides top-down leadership but doesn’t go so far as to say what to do. For the purposes of getting buy-in and to ensure recommendations are in touch with the real world, that “what to do” is best done bottom-up, as long as the boundaries are clear. For example, a work process that adheres to two birds with one stone is very effective when one group concentrates on fixing the basics that are not being delivered (the “basics” group) while the “spikes” group looks only at strategies to combat customer ambivalence by building a “spiky experience.”
    2. Figuring out how to do it. Most companies really struggle with this because they have not spent time building a shared view of how to make things happen—an “implementation model.” The best-laid plans will fall over if it’s not clear from the outset what implementation model is to be deployed. Is it best to “blitz” it, or does the “viral” approach work best? Is a pilot viable? Who should champion it? Who should sponsor it? What resources are actually available, especially for project management?

Finally, all customer research relies on the goodwill of customers, yet usually only a weak effort is made to make research a positive experience for them. For most customers, the bar set by other research they have been involved in is so low that it is quite easy to create a dialogue: show them how you value their input and update them on progress.

One of the biggest unspoken errors in research is non-response bias. If 80% of all customers refuse to be involved in the research process, what might they have said and what does their non-response say about how useful they think the research is?
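
A simple calculation with hypothetical numbers (the 80% figure above, plus assumed satisfaction scores) shows how much that silence can distort the picture: if only the most engaged 20% answer, the reported score can comfortably flatter reality.

    # Hypothetical illustration of non-response bias with an 80% refusal rate.
    response_rate = 0.20
    mean_respondents = 8.0       # average satisfaction the survey actually reports
    mean_non_respondents = 6.0   # assumed average among those who declined (never measured)

    true_mean = (response_rate * mean_respondents
                 + (1 - response_rate) * mean_non_respondents)

    print(f"Reported satisfaction: {mean_respondents:.1f}")
    print(f"Satisfaction across all customers: {true_mean:.1f}")
    # 8.0 reported vs. 6.4 in reality: the least engaged customers never
    # appear in the data, so the survey flatters the result by 1.6 points.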

Useful research gives managers the explanation they crave—what’s happening and why—but rarely get. To deliver that explanation, market research needs to look along a relationship, not cut across it.

It’s not just about providing information; it’s about providing impetus. Senior managers need to see the research process as merely a tool within a wider change process. A little more structure before and after the questionnaire pays big dividends—in the form of research that gets put to use!


About the Authors

Reg Price, Katie Shaw & Neil Stewart: Reg, a customer strategist, is principal of Coulter Price Associates (reg.price@managepromises.com). Katie is head of strategy and projects, Institutional and Corporate & Commercial Banking, at a major bank in Asia/Pacific. Neil, a relationship and customer experience research consultant, is principal of SRD Group (Neil.S@srd-grp.com).

Visit: www.managepromises.com

Visit: www.srd-grp.com 

Reg Price

Reg has practiced in the area of customer management as a consultant for almost 20 years. Born in New Zealand, he has consulted across many industries and countries, including NZ, Australia, the US, Singapore, and the UK. After working in London, he is now based mainly out of Singapore and Auckland.

Reg has taught specialist programs at several business schools and has published several articles in marketing periodicals. He is lead author of an article coming out this month in the US-based magazine “Marketing Management” and is currently writing a book for a US publisher on Promises Management. Reg consults with SRD Group in several areas where he is regarded as a leading specialist advisor, including Customer Experience Design, Customer Orientation Culture Development, Key Client Management, CRM, and Promises Management.

Email: reg.price@managepromises.com

Visit: www.managepromises.com/

Neil Stewart

Neil is the Managing Director and co-founder of SRD Group, which was established in 1996. Neil holds a Bachelor of Arts degree in Human Movement Studies (Rhodes University, South Africa), a postgraduate degree in Marketing Management (Unitec, New Zealand), and a Diploma in Adult Education (Cambridge University, UK).

Formerly with Sandoz in South Africa, Shell and Arthur Andersen in the UK and then with SmithKline Beecham (now GlaxoSmithKline) in New Zealand, Neil has a wealth of experience in Sales and Marketing Management, Customer Relationship Management, Customer Experience, Change Management and Training Programme Development.

Neil has been involved with and consulted on many CRM and Customer Experience projects, the majority of these for multi-national companies in Australia and New Zealand in the Pharmaceutical, Medical, Finance, FMCG, Airline, IT, Document Management and Manufacturing sectors.

Email: neil.s@srd-grp.com

Visit: www.srd-grp.com  

Katie Shaw

Katie is the Senior Channel Manager, Internet Banking Direct and Emerging Channels, at a major bank in Asia/Pacific.






 
 

 