The Customer Service Survey

Vocalabs' Blog

One Picture that Captures the Essence of the CX Challenge

The challenge of trying to promote good Customer Experience practices can be summed up by one picture I ran across today. If you can't read the image, it's a sign from a clinic that reads, "You are not a number to us. Our goal is to ensure you have the best experience possible. Please take a number to help us serve you better."

So, um, yeah, about that...

CX is often in tension with other parts of an organization, which can make it challenging to go from saying the right things to doing the right things. No matter how much effort and research the Customer Experience team puts into creating a better experience, there's always someone else who thinks some other way will be better.

The result can be, like the sign, a jarringly obvious reminder that the organization doesn't really believe its own customer-friendly hype. The person who made that sign probably didn't see the obvious disconnect; chances are it seemed like a perfectly reasonable message for customers.

Sometimes it takes that customer's outside perspective to break through the company's internal blinders.

(By the way, I tried to find the original source of this picture and couldn't, but it has been posted in several other places. Here's a reverse image search if you want to see where it's been.)

Spurious errors

I consider it something of a professional responsibility to take surveys when they're offered. I don't do every single survey (I tried that once a few years ago, and it wound up consuming way too much of my time), but I try to do them when I can.

A distressing theme is the number of bugs and errors I see in customer surveys. I haven't tracked it formally, but I would guess that at least one in ten online surveys I attempt is broken in some way. It's alarming how many companies are apparently unaware that their customer surveys don't work.

Today's example comes from MicroCenter, where the survey form told me, "The errors below must be corrected before you can proceed." Which would be all well and good if there had been any errors in the form, but there weren't.

So I guess MicroCenter won't benefit from my important perceptions and observations.

It's Your Duty to Complain

I talk a lot in this blog about how companies need to listen to their customers, collect fair and accurate feedback, and take action to improve. But do customers have a responsibility, too?

In a thought-provoking and well-reasoned article, Cory Doctorow argues that customers have a duty to complain, because that's how companies will know what's wrong and how to fix it.

Most people don't want to complain. Complaining is confrontational. Complaining risks being labeled a "complainer," like those crazy people everyone who works in retail or customer service knows so well. And there are, unfortunately, a lot of companies where complaints tend to get lost in the machinery.

So rather than complain, many customers quietly take their business elsewhere, or suffer in silence if switching isn't an option.

As uncomfortable as it may be, complaining before taking your business elsewhere is almost always a better option for both the customer and the company. Complaining gives the company a chance to fix the problem, giving the customer better service and saving the company a customer. This is practically the definition of win-win.

In many cases companies will be grateful to get thoughtful and meaningful customer complaints. To make an effective complaint you should:

  1. Be calm. Remember that the person you are complaining to is another human being and is probably not personally responsible for your problem. Nobody likes dealing with angry customers.
  2. Be reasonable. You should ask the company to fix the problem and deliver the product or service you paid for, but a minor inconvenience does not usually entitle you to financial compensation (though many companies will do something to apologize). Don't make threats.
  3. Remember not everyone can fix your problem. Most people in any given company have only limited authority, and if your complaint is complex, the first person you complain to might not be able to help. If that's the case, ask to talk to someone who can help. If you're complaining about a general company policy, it may be a slow process (with lots of complaints from lots of customers) to make a change.
  4. Be prepared. Have the facts and your story straight. Be ready to explain in clear, concise terms exactly what happened, why you felt this was wrong, what impact it had on you, and what the company needs to do.

As a Customer Experience professional, I know that a rational and well-presented customer complaint is a valuable gift. The worst thing that can happen (from a CX perspective) is when a customer leaves without complaining and then tells friends (in person or through social media) about how terrible the experience was.

So while I wouldn't say that customers have a duty to complain, I do think everyone would be much better off if customers did it more often.

Issue #98 of Vocalabs' Newsletter

We just published the 98th edition of Quality Times, our newsletter about customer experience and customer feedback.

This newsletter is one of the ways we get to know prospective clients. So if you find this useful and informative, please help us out by forwarding this to other people who might also enjoy it, and encourage them to subscribe. You can subscribe to this newsletter on our website.

This month's newsletter is all about the "why" of collecting customer feedback. My first article talks about the tradeoff between getting positive feedback and honest feedback, and making sure you know which you prefer. The second article is about that too-often overlooked piece of survey design, understanding the goals of the program.

As always, I hope you find this useful and informative.

Sweating the Big Stuff

"Don't sweat the small stuff" as the old saying goes. Meaning: don't waste your time on things that are unimportant.

Unfortunately there's a lot of sweat-soaked small stuff in the world of customer feedback. Here are some things that often suck up a lot of time and effort relative to how important they are:

  • The exact wording of survey questions.
  • The graphic design and layout of reports and dashboards.
  • The precise details of how metrics are calculated (spending too much time on this is a warning sign that you might have fallen into the Metric-Centric Trap).
  • Deciding whether individual surveys should be thrown out (this is another warning sign of the Metric-Centric Trap).
  • Presenting survey trend data to senior leadership.

Instead, you should be focusing on some of these things that don't get nearly as much attention for the amount of benefit they bring:

  • Closing the loop with customers.
  • Providing immediate coaching of employees based on customer feedback.
  • Regularly updating the survey to keep it relevant to the business.
  • Doing root cause analysis of common customer problems.
  • Keeping senior leadership actively engaged in advancing a customer-centric culture.

The difference between the small stuff and the big stuff is that the small stuff tends to be (relatively) easy and feels important. The big stuff often requires sustained effort and commitment but pays off in the end. It's the difference between an exercise bicycle and a real bike: both take work, but only one will get you anywhere.

Vocalabs' Newsletter #97 is Published

We just published the latest edition of Quality Times, Vocalabs' newsletter about measuring customer service quality.

This month's newsletter features our hot-off-the-presses whitepaper, The Metric-Centric Trap, which I wrote in collaboration with Raj Sivasubramanian of eBay. I also write about the use of incentive-based pay in the customer experience world (spoiler: I'm not convinced it's a good idea).

This newsletter is one of the ways we get to know prospective clients. So if you find this useful and informative, please help us out by forwarding this to other people who might also enjoy it, and encourage them to subscribe. You can subscribe to this newsletter on our website to automatically get each new edition when it is published.

As always, I hope you find this useful and informative.

The Metric-Centric Trap

In my work I've found a lot of companies that genuinely want to improve their customer experience and invest a certain amount of time and resources into the effort. Often they go to great lengths to track and measure the voice of the customer, yet the customer experience never seems to improve much.

The problem is that these companies are investing their time and energy in relatively low-value activities like tracking survey metrics, and ignoring things that can move the needle like closed-loop feedback and actively training employees with the voice of the customer. It's an easy trap to fall into, since senior leadership teams often like to see numbers and metrics, but it can make it hard to really understand and improve the customer experience because the statistics obscure all the individual customers' stories.

I've had the great pleasure of collaborating with Raj Sivasubramanian, the Senior Manager of Global Customer Insights at eBay, to develop these ideas into a new whitepaper, The Metric-Centric Trap: Avoiding the Metric-Centric Trap on the Journey Towards Becoming Customer-Centric.

In this whitepaper we explore our experiences with companies that have wanted to be customer-centric but wound up metric-centric instead, discuss what's so bad about being metric-centric, and share some ways to break out of the trap and make more effective use of customer feedback.

You can download the whitepaper from our website.

Bruce Temkin on Qualitative Data

Bruce Temkin posted a short video recently making the case that we need more qualitative research in the customer experience world.

The problem is that boiling down customer feedback to statistical metrics obscures the fact that there are thousands of individual customers behind those numbers. Each customer has their own story and their own experience, and it's important to not lose those individual experiences in the analysis.

The way to not lose sight of customers is through qualitative data that captures their stories. That doesn't mean not calculating metrics. It means making sure you are including that qualitative feedback as part of the process and then using it.

Too often when companies collect qualitative feedback they either don't use it at all, or they try to boil down the customers' stories into statistics using analytics tools.

What needs to happen instead is someone needs to read or listen to individual customers' feedback to develop understanding for customers' emotions and experiences.

Statistics have their place of course. But statistics don't readily lead to empathy and understanding.

Do You Want Positive Feedback or Honest Feedback?

Which would you rather have: Positive customer feedback, or honest customer feedback?

Most people would probably say "both," but it's not always possible to have both. If the honest customer feedback isn't positive, then you can only get one or the other.

So when forced to choose, I would generally prefer honest feedback over positive feedback. As long as the person giving me feedback isn't being cruel or demeaning, I would rather hear about ways I can improve than get my ego stroked. At least that's what I say, and what the rational part of my mind thinks. In reality, hearing negative feedback can be hard, even though it's also much more valuable.

I think most business leaders would probably agree with me: it's better to get honest feedback from customers than positive feedback.

But the real-world incentives at most companies don't support this. Incentives based on customer feedback are always designed to encourage positive feedback, not honest feedback. That's because it's easy to measure how positive the customer feedback is, but almost impossible to measure how honest it is.

Where companies base bonuses and compensation on customer surveys, this leads to a perverse incentive to get customers to give higher scores no matter the customer's actual opinion. Is it any wonder that survey manipulation is so common?

The problem is that when customers don't give you the straight story, the feedback has no value. You might as well not do the survey at all if you can't get honest feedback.

So what's the solution?

The first step has to be to stop undermining yourself. If you give employees incentives to only deliver positive feedback, stop that.

That probably means you shouldn't be using survey scores for employee bonuses at all, since it's difficult-to-impossible to create an incentive system that encourages honest feedback.

The next steps are much harder. The goal is to create a culture where customer feedback is seen as constructive criticism and an opportunity to improve. That will require significant effort coaching employees in how to listen to customers without becoming defensive or negative.

But if your customer surveys are biased towards giving you positive, rather than honest, feedback, then you're not getting any value anyway.

Join the CXPA

I don't often plug other organizations in this blog, but I'm going to make an exception for the Customer Experience Professionals Association (CXPA). We have an amazing local chapter here in Minneapolis that holds meetings every other month, and the CXPA's annual conference is coming up in early May in Atlanta.

What makes CXPA different from other groups and events in this space is that CXPA is a non-profit professional association run by and for CX practitioners. That means that it stays focused on helping CX leaders connect with each other to share ideas and best practices. Since Customer Experience is much more an art than a science, finding out what's worked for others in the field can have tremendous value.

Any time someone asks me the best way to learn more about Customer Experience, I always tell them to join the CXPA and become active in their local and online communities. And now I'm telling you.

Vocalabs Newsletter: NCSS Results for Communications

We just published issue #96 of Vocalabs' Newsletter, Quality Times. This month we focus on our recent NCSS results for the Communications industry, showing that T-Mobile has continued its improvements since 2012, while Comcast still brings up the rear. The complete Executive Summary is available for download.

As usual, I hope you find our newsletter interesting and enjoyable. If you like it, please feel free to subscribe via email. Subscribers are the first to get new issues when they are published.

Incentives

A few weeks ago the Harvard Business Review published an article with the provocative title, Stop Paying Executives for Performance. The main thesis was, as you might expect, that companies should stop paying executives for performance. The authors give a number of reasons:

  • Performance-based pay only really improves performance for routine tasks, not activities which require creativity and out-of-the-box thinking.
  • The best performance tends to come when creative professionals focus on improving their skills rather than focusing on outcomes.
  • Intrinsic motivation is more powerful than extrinsic motivation, and performance-based pay tends to only drive extrinsic motivation.
  • Performance-based pay encourages people to cheat to hit bonus thresholds.
  • No performance measurement is perfect.

You can argue about whether the authors' conclusion is right (and in the comment section of the article several people argue that they are wrong), but there's no question they make a strong case for an interesting and counterintuitive idea.

To me as a Customer Experience professional, what's most interesting about this is that the same arguments can be made about performance-based pay for many customer-facing employees, since delivering a great customer experience often requires employees to go beyond the rote skills and find creative solutions to the customers' problems.

A great example of this is the declining use of Average Handle Time (AHT) in many call centers. Over the past ten years I've spoken with a lot of call center managers who have stopped evaluating (and compensating) their customer service reps based on how long they spend on the average call. Every single one has described significant improvements in the customer experience. Several also said that their AHT didn't even go up as a result.

Removing AHT from the employees' compensation meant that reps were freed from having to worry about the mechanical part of their job--how quickly they could get someone off the phone--and could instead think about the best ways to solve the customer's problem. The other problems with performance-based pay also show up in customer service, since focusing on AHT can lead employees to cheat (for example, by occasionally "accidentally" hanging up on a customer at the beginning of a call). AHT also turns out not to be as tightly tied to operational cost as most managers expect, because customers will often call multiple times if they don't get the service they need the first time.

Moving away from performance-based pay would be an enormous cultural change in today's business environment where the implicit assumption is that money is the best, or even only, way to motivate employees. But there's decades of research in behavioral economics showing that money is only a mediocre way to drive performance in many circumstances.

If you're using performance-based pay to try to improve your customer experience, it's worth thinking about whether you're really driving the behavior you want.

NCSS Results: T-Mobile Keeps Getting Better, Comcast Still Trails

We just published the 2015 results for the National Customer Service Survey (NCSS) of Communications companies. This ongoing survey tracks customer service quality at AT&T, CenturyLink, Comcast, DirecTV, Dish Network, Sprint, T-Mobile, Time Warner, and Verizon.

In this year's data we find that T-Mobile has extended its record of improving customer service going back to 2012, making substantial gains in many of our survey metrics; the company now leads the pack in eight of the nine scores we track.

Back in 2012, T-Mobile was performing poorly on our survey. T-Mobile had just come out of a failed attempt to merge with AT&T (abandoned in December 2011) and its scores had been sinking. Since then, however, T-Mobile has rebranded itself as the "un-carrier" and made deliberate efforts to be more consumer-friendly. This has been successful, as shown by the fact that T-Mobile's moves to abandon such hated industry practices as two-year contracts and overage charges have now become the norm in the mobile phone industry.

Our data shows that T-Mobile's efforts have extended to improving its customer service more generally, with sustained multi-year improvements across the board in our customer-service metrics. While we don't have any insight into T-Mobile's internal operations, from our data it appears that the company has been making a significant and sustained effort to improve its customer service operations.

Comcast, meanwhile, has posted some small gains over the past two years (CenturyLink, Comcast, DirecTV, Dish, and Time Warner were added to our survey in 2013), but not enough to pull it out of last place in our survey rankings. In 2015 Comcast held the bottom slot in six of our nine metrics: better than 2013 when Comcast was last place in eight of nine metrics, but still not a stellar performance.

The complete survey data is available to NCSS subscribers, and the Executive Summary can be emailed to you through our website:

Download NCSS Communications 2015 Executive Summary

You're Not Managing If You Only Measure

"You can't manage what you don't measure" is the old business saw.

But measurement is only the first step. Once you have the data you need to put it into action.

A lot of customer feedback programs fail to go from measurement to management. They collect data and metrics but don't follow through with the actions needed to improve the customer experience.

The problem is that it's too easy to think that survey metrics are telling the whole story, and that improving them is the goal.

But neither of those statements is true. Survey metrics do not tell the whole story; they give a statistical aggregation of hundreds or thousands of unique customers, each with their own experiences and problems. And improving survey scores should never be the goal, but rather an expected side effect of the true goal: making thousands (or millions) of individual customers have better opinions about the company.

Using survey data creates a dilemma: on the one hand it's necessary to boil down the data into a handful of numbers in order to make sense of the thousands of individual customers' opinions and get a high level view of how you're performing. On the other hand, those statistical measurements also hide the individual customers' experiences and opinions, making it hard to understand what the numbers actually mean.

The solution is to bring individual customers back into the management part of the equation. Survey scores are always going to be necessary to track progress, but when it comes to actually making changes you should focus on individual customers.

That means using feedback from individual customers (especially the open-ended parts of the survey) for coaching and training, spending time reading (or listening to) survey comments when looking at process changes or improvements, and making individual customers' stories (not statistics) the core of survey reporting.

So while it's true that you can't manage what you don't measure, you also have to be careful not to let the measurement get in the way of the management.

Vocalabs Newsletter #95

We just sent out the latest issue of Quality Times, our mostly-monthly newsletter about customer feedback and the customer experience. Featured in this month's newsletter are the National Customer Service Survey (NCSS) results for the banking industry in 2015. Next month we plan to publish the results for Communications, so stay tuned for those.

As always, I hope you find the newsletter useful and informative. You can subscribe to receive future issues via email.

NCSS Results: Chase on top, Bank of America Improves

We just published the results for the National Customer Service Survey (NCSS) on Banking for 2015. This is our ongoing syndicated research program comparing the customer service at four major consumer banks: Bank of America, Chase, Citi, and Wells Fargo.

For the NCSS we interview customers within a few minutes of a customer service call to one of the companies we track. This is very different from other published research, where participants are asked to recall a customer service experience which may have happened months ago. As a result, we are able to get very reliable, detailed survey data about what's actually happening when a customer picks up the phone.

In 2015 we saw Bank of America make significant improvements in our survey. In one year, BofA's score for Call Resolution went up 13 points, its score for Ease of Reaching an Agent went up 11 points, and overall satisfaction with the customer service call was up 13 points over the past two years.

Chase took the honors for best scores overall, even though it didn't have as dramatic an improvement as Bank of America. Chase had the highest scores in seven of the nine key metrics we track in our report, and generally continued the upward trajectory it has been on since we started our survey in 2011.

Meanwhile Citi took a beating, losing 13 points in overall satisfaction with the company, 12 points in satisfaction with the customer service call, and claiming the bottom slot in eight of the nine metrics.

The 2015 results represent a reversal for both Bank of America and Citi. When we started the survey in 2011, Chase and Citi were posting lower survey scores than Wells Fargo and Bank of America. But Chase and Citi made several years of improvement, while Bank of America's scores were generally flat. This year, though, Bank of America is back in the middle of the pack with its gains, and Citi's scores are behind its competitors.

It's hard to know what's driving these major changes this year. Improving the customer experience is a process, not a project, and it's possible that Citi has been distracted by other priorities.

You can get a copy of the Executive Summary sent to you through our website:

National Customer Service Survey on Banking, 2015 Executive Summary

Sharing Feedback, Constructively

Back when I was in college I took a creative writing class, and part of this class was to critique each other's work: read what the other students wrote, and offer constructive criticism and feedback. It turned out that was one of the hardest things to do effectively, because most people instinctively get defensive about any negative feedback. They're just too emotionally invested in their work to accept even mild criticism dispassionately.

The same thing can happen when you share negative customer feedback. Often, an employee's gut reaction is that someone is telling them they're bad at their job, and they respond defensively. At the company level, most people take a lot of pride in the organization they work for (even when it's not justified) and have a hard time hearing that something might be broken. Breaking through this takes a lot of finesse, and you have to be careful about how you present and frame the feedback.

I did eventually get pretty good at giving and receiving constructive criticism, and that's turned out to be really helpful professionally. Here are my suggestions for making negative feedback a positive experience:

  1. Most important, always have the attitude of constructive criticism. This is about problem solving, not assigning blame. A customer had a bad experience; that does not mean the company is bad at CX (even if you think it actually is, don't let that be part of the message). Everyone makes mistakes, and the goal is to identify the mistakes so they are less likely in the future.
  2. Present positive feedback along with negative, and lead with the positive. This helps set the tone of, "We're generally doing a good job and we'd like to find ways to do even better."
  3. Focus on the customer's perceptions. For example, if a customer complains about a late shipment, this should be framed as "A customer felt his delivery expectations were not met. Let's try to figure out why the customer felt this way," rather than, "We're really dropping the ball on deliveries!" There can be a lot of reasons for a negative perception, not all of them related to what actually happened.
  4. Choose carefully which feedback you present. Not all negative feedback is credible, so reinforce the customer feedback with other data showing that this is a problem worth paying attention to (for example, "We're seeing more complaints about late deliveries this quarter. This customer's experience is similar to a lot of other complaints"). Share feedback that's articulate, believable, and relatable. Don't share the crazies, as entertaining as they may be.

Sharing customer feedback, both with individual employees and the organization as a whole, is a powerful way to motivate action but needs to be done carefully to inspire the right action and avoid negativity.

Newsletter #94 Is Published

We've published issue #94 of Quality Times, our newsletter about customer experience and customer feedback programs. 

This month, rather than doing the usual thing of writing about the industry trends for the new year, I wrote about the Non-Trends. These are the basic truths of Customer Experience work which were true last year, will be true this year, and will still be true in 2017. Most of these are much more important than the hot trends for 2016.

This newsletter is one of the ways we get to know prospective clients. So if you find this useful and informative, please help us out by forwarding this to other people who might also enjoy it, and encourage them to subscribe. You can subscribe to this newsletter on our website.

Goodhart's Law

In the field of macroeconomics, Goodhart's Law states that "When a measure becomes a target, it ceases to be a good measure."

As an economic theory this is the rough equivalent of Murphy's Law, though with a kernel of deep truth at the core. Macroeconomic measurements distill an enormously complex system down into a handful of simple numbers that require considerable effort to measure. For example, in the mid-20th century in the United States, we had a problem with inflation. Low inflation is desirable because it tends to correlate with economic stability and predictability and encourages the middle class to save and invest for the future. But when policymakers initially tried to slow inflation through wage and price controls rather than addressing the underlying problems in the economy, the result was an unbalanced economy and (eventually) the stagflation of the 1970s. Of course this is a grossly oversimplified summary of 20th century economic history, but the point is that by trying to force inflation to hit a target, the inflation rate stopped being a good proxy for economic stability and predictability.

Goodhart's Law in Customer Experience

Goodhart's Law applies in the world of Customer Experience, too.

Most of the core metrics in any CX effort (for example, survey scores like Net Promoter or Customer Satisfaction; or internal metrics like Delivery Time) are used because they are strongly correlated with customers' future behavior, positive word-of-mouth, and long-term growth of the company.

But if you try to turn a CX metric into a target, it may no longer be useful as a measure of the customer experience. That's because the things you really want to change (customers' future purchases, positive word-of-mouth, long-term growth, etc.) are the result of many complex interactions inside the company and between the company and its customers. And it's often easier to hit a goal by gaming the system than it is to fix the underlying problems.

For example, in the case of ABRA Auto Body I blogged about a couple days ago, the company almost certainly did not set out to create a survey which would yield inflated, meaningless scores. Instead, they most likely determined that high survey scores were often strongly correlated to repeat business and new customers through recommendations.

But rather than explore the root causes of high (or low) customer satisfaction and address those, the company probably decided to simply give managers an incentive to hit a certain survey score and let them figure out how to do it.

The result is that it's much easier for a manager to print off a bunch of fliers instructing customers on how to answer the survey, than it is for them to think about how the customer journey might be improved. (It's possible that ABRA doesn't even give managers the authority or budget to change the things that might matter, in which case the manager may have no choice but to try to game the survey.)

The lesson should be obvious: If you want your CX metrics to be useful measurements of your customer experience, then you need to be very wary of how incentives invite manipulation.

ABRA Is Not Subtle About Survey Manipulation

Fellow CX professional Jason Kapel told me about a recent experience at ABRA Auto Body. He had his car repaired, and while the experience itself was fine, he found the flier to the right attached to his receipt.

If there were a Hall of Fame for the Hall of Shame, this would have a good chance of winning Most Blatant Example of Survey Manipulation. Not only does it tell the customer exactly how they want each question answered, but at the bottom of the flier it instructs the customer not to take the survey at all if there was some problem with the repair.

Needless to say, this survey is not likely to get much honest and unbiased feedback from customers, nor is it going to identify ways to improve the customer experience. Pretty much the only thing this survey will do is allow the manager to boast about his great survey scores and claim whatever reward (or avoid whatever punishment) results from hitting his numbers.

All of which raises the question: what's the point of doing this survey?

I have to assume that either ABRA is unaware that their survey is being blatantly manipulated, or they don't care. Neither possibility speaks well to the level of commitment and attention the company is paying to improve their customer experience.

Customer Experience Non-Trends for 2016

It's the beginning of a new year, which means it's time for pundits and prognosticators to pull out their crystal balls and make predictions about the twelve months to come.

Bruce Temkin, for example, has published his 11 Customer Experience Trends for 2016 (why 11? Presumably because it's one better than ten). He has identified such things as Journey Designing, Empathy Training, and Predictive Analytics as areas to watch, and declared that 2016 will be The Year of Emotion.

Who am I to disagree?

But in my view, such trend articles miss the bigger picture, which is that the important facts of the Customer Experience profession will be pretty much the same in 2016 as they were in 2015 and earlier years. These are the non-trends, the things that don't change, and most of them are more important than the trends.

So here I present my Customer Experience Non-Trends for 2016. Not only are most of these non-trends more important to the average CX professional than the Trends, you can read these safe in the knowledge that in January 2017 I can just republish the same article with a different date.

Non-Trend 1: Engaged Leadership Is The Single Most Important Element in CX

The companies delivering a great customer experience almost always have leadership actively engaged in continuously trying to deliver a better experience. Conversely, companies where leadership views CX as a one-time project, or something to delegate, generally don't succeed in delivering a superior experience.

The lesson here is simple: if you want to improve the customer experience in your organization, the most important thing you can do is get the senior leadership to care and make it a personal priority.

Non-Trend 2: Great CX Is About Getting a Thousand Things Right

Sweat the details. A grand strategy or a new piece of technology will not, by themselves, move the needle on your customer experience (though the right strategy and tools definitely make the job easier).

Unfortunately, "sweat the details" is not a sexy message and it doesn't help sell software and services. Many vendors make the empty promise that their solution will, by itself, transform your CX effort. Don't believe it. There is no magic bullet.

Non-Trend 3: Customer Experience Professionals Often Have a Tough Job

The field of Customer Experience has made great strides over the last decade or so, but it's still not easy. We've finally gotten to the point where most companies will at least say that the Customer Experience is a priority, but many of them have yet to internalize it. The leadership doesn't yet care enough to dedicate the needed resources, or they think that because they have a CX team the problem is solved and they can mostly ignore it.

So in a lot of places, the role of the CX professional will continue to revolve around getting leadership attention, finding the easy wins, and internal evangelism. This, unfortunately, is not likely to change any time soon.

Non-Trend 4: Great CX Drives Customer and Employee Passion, Which Creates Better CX

The sweet spot of customer experience is when your whole organization is focused on creating a better experience for customers, which makes customers want to do more business with you, and that makes employees want to help customers even more. Customer Experience becomes a positive feedback loop.

The unacknowledged truth is that most employees genuinely want to do a good job and have a positive impact on their customers. It's one of the most satisfying things we can do in our careers. A strong focus on CX creates not just more satisfied customers but also more satisfied employees.

Here's hoping for a terrific 2016!

What Are Your Goals?

Before you get into the nuts and bolts of designing a survey program, spend some time sharpening up what you hope to accomplish. A good understanding of the business goals of the survey will really help figure out the right sampling, questions, channel, and reporting. A lot of the time when I hear companies say they want to do a survey for the purpose of collecting customer feedback, it really means that they haven't thought a lot about what they plan to do with the feedback once it's collected. It's like saying you want to do a survey for the purpose of conducting a survey.

The basic ingredients are straightforward. Most surveys have as their goals some combination of:

  • Tracking metrics: Requires using a very consistent set of survey questions with a random sample selected to give an acceptable margin of error for calculating metrics. 
  • Improving the performance of individual employees: Requires targeting the survey sample to collect adequate feedback on each individual employee, asking open-ended questions about the experience, and delivering the feedback to front-line supervisors in real time. Recorded customer interviews are particularly valuable.
  • Identifying customer pain points: Requires a lot of open-ended questions and potentially additional follow-ups. Customers should be invited to tell their stories.
  • Testing or validating changes to the customer experience: Requires careful attention to test and control group samples, and a consistent set of metrics for the different test cases (see A/B Testing for Customer Experience).
  • Persuading the organization/leadership to make a change to the customer experience: Requires collecting a valid statistical sample that supports the proposed change, as well as persuasive customer stories which will carry emotional weight with others in the organization. Recorded customer interviews are particularly valuable.
  • Providing individual customers a chance to be heard: Requires offering the survey very broadly, even if that means a low response rate or far more completed surveys than would otherwise be needed. A robust closed-loop process is not optional.

So for example, if you've never done any transactional feedback before, your goal is probably going to be mostly about identifying customer pain points (i.e. trying to find out what you don't know) with a dash of tracking metrics thrown in. That probably means asking a couple of tracking questions and a lot of open-ended questions, and collecting a random sample in the range of 400 completed surveys per reporting segment (enough to get a 5-point margin of error).
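
For readers who want the arithmetic behind that rule of thumb, here's a minimal sketch using the standard normal-approximation formula for the margin of error of a percentage (generic statistics, not anything specific to our methodology). It shows why roughly 400 completed surveys works out to about a 5-point margin of error at 95% confidence:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error, in percentage points, for a percentage
    estimated from n completed surveys at roughly 95% confidence."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(400), 1))  # ~4.9 points, i.e. roughly +/-5
```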

But if your goal is more directed to improving employee performance, things will be different. You will want to bias the survey sample to ensure each employee gets enough feedback to be useful (which also means un-biasing the sample to calculate metrics). You will probably also want to use customer interviews rather than automated surveys, since a recorded interview with the customer is much more effective at changing behavior than written comments and statistics.
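
To make the idea of "un-biasing" concrete, here is a hypothetical sketch (the rep names, call volumes, and scores are made up for illustration): if low-volume reps are deliberately over-sampled so that everyone gets enough coaching feedback, a company-wide metric can be re-weighted by each rep's actual share of call volume rather than their share of the survey sample.

```python
# Hypothetical illustration: three reps given equal amounts of survey feedback
# even though their call volumes differ (i.e. the sample is deliberately biased).
call_volume = {"rep_a": 5000, "rep_b": 3000, "rep_c": 2000}   # calls handled
avg_score   = {"rep_a": 8.2,  "rep_b": 7.5,  "rep_c": 9.0}    # mean survey score

total_calls = sum(call_volume.values())

# Un-biased company metric: weight each rep's score by their share of call volume.
weighted_metric = sum(
    (call_volume[rep] / total_calls) * avg_score[rep] for rep in call_volume
)

# Naive metric that ignores the over-sampling of low-volume reps.
naive_metric = sum(avg_score.values()) / len(avg_score)

print(round(weighted_metric, 2))  # 8.15
print(round(naive_metric, 2))     # 8.23
```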

Whatever your goals are, the most important thing is to have them. Surveys done for the sake of doing surveys tend to not be very useful.

Customer Survey Mistakes Almost All Companies Make

It's easy to do a survey, but it's hard to run an effective customer feedback program that leads to changes in a company's actions and improved customer experience. There are a number of common mistakes: so common that nearly all companies make at least one of these mistakes, and a lot of companies manage to hit the entire list:

Not Understanding the Purpose of the Customer Survey

If you don't know what you expect to accomplish through a customer feedback program, it's hard to structure it in a way that will meet your goals. For example, a survey designed to help improve the performance of customer-facing employees will be very different than one merely intended to track metrics. When I ask companies why they are running a survey, often I hear answers like, "To collect customer feedback," or "Because it's a best practice." Answers like that tell me that they don't have a clear sense of why they need a survey, other than for the sake of having a survey.

Asking Too Many Questions

Long surveys generally have a poorer response rate than shorter surveys, can leave the customer with a bad feeling about the survey, and often don't produce any more useful feedback than shorter surveys. In many cases, there is no good reason to ask a lot of questions, other than a need to appease a large group of internal stakeholders each of whom is overly attached to his or her favorite question or metric. It's easy to find the questions you don't need on your survey: go through all the questions and ask yourself, "Have we ever actually taken any action based on this question?" If the answer is no, the question should go.

Focusing on Metrics, Not Customers

Metrics are easy to fit into a numbers-driven business culture, but metrics are not customers. At best, metrics are grossly oversimplified measurements of your aggregate performance across thousands (or millions) of customer interactions. But behind those numbers are thousands (or millions) of actual human beings, each of whom had their own experience. Many companies focus solely on the metrics and forget the customers behind them. Metrics make sense as a progress marker, but the goal is not to improve metrics but to improve customer experiences.

Not Pushing Useful Data to the Front Lines Fast Enough

In many cases, creating a great customer experience isn't about installing the right platform or systems, it's making sure that thousands of little decisions all across the company are made the right way. Those people making those decisions need to know how their individual performance is helping contribute to the overall customer experience, and the best way to do that is give them access to immediate, impactful feedback from customers. Too often, though, customer feedback gets filtered through a centralized reporting team, or boiled down to dry statistics, or delivered in a way that masks the individual employee's contribution to the whole.

Not Closing the Loop

Closed-loop feedback is one of the most powerful tools for making sure a customer survey inspires action in the company, yet even today most companies do not have a formal system in place to close the loop with customers. There are actually three loops that need to be closed: you need to close the loop with the customer, with the business, and with the survey. If you're not closing all three loops, then your survey is not providing the value you should be expecting.

Always Using the Same Survey

Companies change and evolve. Markets shift. Customers' expectations are not static. Entire industries transform themselves in just a few years. So why do so many customer surveys remain unchanged for years (or decades)? Surveys should be structured to respond to changing business needs and evolve over time; otherwise you're not collecting feedback that's relevant to current business problems. Surveys that never change quickly become irrelevant.

Not Appreciating Customers for Their Feedback

Finally, a lot of companies forget that when they do a survey they are asking a customer--a human being--to take time out of their day to help out. And they're asking for hundreds or thousands of these favors on an ongoing basis. But when the reports come out and the statistics are compiled, all those individual bits of human helpfulness are lost in the data machine. I know it's not practical to individually and personally thank thousands of customers for doing a survey, but it's not that hard to let customers know that you're listening to them and taking their feedback seriously. All too often the customer experience of completing a survey involves taking several minutes to answer a lot of questions and provide thoughtful feedback, and then it disappears into a black hole. You don't need to pay customers for taking a survey (in fact, that's often a bad idea), but you should at least stop and think about how helpful your customers are being and appreciate their efforts.

Issue #93 of Quality Times is Published

We just published the 93rd issue of Quality Times, our newsletter about measuring the customer experience. Email subscribers should be receiving their copies shortly, and you can read it on our website.

This month's theme is making sure you're putting your customer service efforts in the right places. Our first article is about how collecting more data isn't always a useful activity if it isn't the right data. Then we have an article about which customer experience efforts actually make a difference and why so many companies seem to focus on the low-value ones.

As always, I hope you find this useful and informative.

Treating Customers as Human

In a nice counterpoint to AT&T's "cease and desist" approach to customer suggestions, Consumerist has the aww-cute story of Delta's response to an 8-year-old's mailed suggestion. Rather than trot out the lawyers, Delta sent a friendly, personalized letter from an executive along with some Delta swag.

So instead of annoying a loyal customer and generating a slug of bad PR the way AT&T did, Delta gets some goodwill with a future customer and his entire family, and just the sort of heartwarming spirit-of-Christmas story the media love to run this time of year.

All because Delta chose to respond to a customer's suggestion at a human level rather than as a legal threat to be promptly squashed.

I think the lesson is clear.
