Most customer feedback surveys today are conducted either as automated online surveys or as live telephone interviews. A few automated phone surveys (IVR) and paper surveys are still around, but those seem to be disappearing quickly, since they tend to be at a disadvantage on both cost and quality.
In general, online surveys provide low cost per response, while phone interviews are substantially more expensive but have a much higher response rate and provide higher quality data and richer qualitative feedback.
Combining these two approaches can give you the best of both worlds: a large number of survey responses at a reasonable cost, plus a more representative sample and more detailed customer comments.
Here are some scenarios where you should consider a hybrid online survey and phone interview approach:
There are also some situations where it probably makes sense to stick with a single survey channel:
Designing and running a hybrid survey program may sound like something that requires Ninja-level Customer Experience skills, but the reality is that it's not that much more complicated than any other ongoing customer feedback process. In the right circumstances and executed well, a hybrid survey can give you far more value for the dollar than any other survey strategy.
Next week our local chapter of CXPA will be hosting a session called "Battle of the Metrics." I'm looking forward to it: it should be an informative and (I hope) entertaining meeting. If there's one thing that can spark a lively discussion among Customer Experience professionals, it's someone who takes a strong stand for or against any particular metric.
But why do we spend so much time and effort worrying about metrics?
Most reasonable CX metrics provide directionally similar results: when Customer Satisfaction or Net Promoter improve, chances are very good that Customer Effort, Customer Loyalty, or any scorecard composed of customer survey responses will also improve. The numbers will be different, but they should all tell a similar story. Viewed in that way, arguing about which metric is best is a little like arguing about whether miles or kilometers are better.
Though come to think of it, when the United States tried unsuccessfully to go metric a half century ago, it turned out that a lot of people suddenly felt very strongly about whether to measure highways in miles or kilometers. So maybe it's not so surprising that we also have strong feelings about which CX metric to use.
When used properly, it shouldn't matter all that much which metric we choose. Most of the real CX action is below the level of the metrics: it's about finding ways to improve individual customer journeys, most often by helping people at all levels of the organization put themselves in those customers' shoes. Metrics, like signposts on the highway, give us some sense of how far we've gone and whether we're moving in the right direction. Miles or kilometers, either one will tell us that we're making progress.
And to the extent that different metrics give us different results, that's a sign that something unexpected is happening and we need to pay attention. Because while different CX metrics usually move together, they do measure somewhat different things. So if Net Promoter (which measures the strength of a customer's overall relationship) improves while Customer Effort (which measures how smoothly a particular transaction went) is getting worse, that could be a sign that something's afoot. It may be that there are some operational problems which your customers are willing to forgive (for now); or it may be that you are benefitting from a competitor's misstep. Whatever the situation, it's worth spending some effort to dig deeper.
In the end, I think metrics appeal to us because they give us a simple view into a complex reality. Boiling down our CX efforts to one number makes it easier to explain the impact of Customer Experience, and it makes it easier to show leadership what exactly it is that we're trying to achieve.
This is fine, but it comes with a steep price. Because in the end, it's not the metric that matters. It's everything that goes into the metric, all those thousands or millions of individual customers and their individual stories that matter. The metric, while it makes it possible to think about the bigger picture, conceals far more than it reveals.
We recently released our 2017 report for the National Customer Service Survey on banking customer service, tracking customer service performance at Bank of America, Chase, Citi, and Wells Fargo. An executive summary is available for download.
In our 2017 data we find that Citi has rebounded significantly from its decline over the past two years, and is now generally scoring in-line with industry peers in our survey. Bank of America has declined meaningfully since 2015, and in 2017 had the lowest scores in seven of our nine key metrics. Wells Fargo has the highest scores in our three Business Outcome metrics.
This unique independent research is underwritten and conducted by Vocalabs on an ongoing basis to benchmark industry trends in phone-based customer service. We perform in-depth interviews with customers of each company immediately after a customer service call. High-level business metrics are correlated to specific events during the customer service call, allowing us to directly measure the drivers of customer satisfaction and loyalty.
This report is based on over 5,000 telephone interviews conducted between 2011 and December 2017. The National Customer Service Survey is an ongoing research study, and we are collecting new customer feedback about each company continually throughout the year. We publish an annual Executive Summary highlighting key trends, and Vocalabs clients have access to more in-depth analysis and can develop proprietary benchmarks against our syndicated research. Please contact us for more information.
One of the reasons to invest in improved Customer Experience is the positive effects it will have on your company's reputation and word-of-mouth.
That's great, and well-deserved for companies that truly internalize CX. But I've seen a few companies that treat their Customer Experience as a marketing campaign, and it never ends well.
This has been on my mind lately because Comcast, everyone's favorite CX bad boy, has been making noise about how they're mending their ways. They even had their EVP of Customer Service in the cable division, Tom Karinshak, do a puff-piece interview for a customer experience podcast.
But I wonder if this is a true conversion, because while they're saying all the right things it isn't clear to me that any of the root causes of Comcast's reputation have changed. For example, Comcast is still an effective monopoly in most of its markets and doesn't seem to have much of an incentive to care.
I'm not the only one to have this reaction. Jim Tincher noticed some recent fine-print changes on Comcast's website, and his take is that Comcast still cares more about maximally monetizing its subscribers than building relationships with them.
I saw a similar dynamic play out at Sprint almost a decade ago. Sprint, like Comcast today, was known for bottom-of-the-barrel customer service. Sprint invested heavily in improving its customer service, and heavily promoted research (including Vocalabs' research) showing a positive effect. And then Sprint's attention turned elsewhere, and the service went right back to where it had been.
This nicely encapsulates the difference between internalizing Customer Experience and treating it like a marketing campaign. When you do CX right, it becomes part of the core fiber of the company. It's hard, and it requires ongoing effort, but the positive benefits are long-lasting and build over time. But if it's just a PR initiative, once the campaign is over things will go right back to the way they were. There might not even be time for the company's reputation to improve in any meaningful way before the old bad habits settle in again. Worse, management may conclude that Customer Experience doesn't pay off because they didn't see any sustained benefit. That will make it a harder sell the next time around.
Companies which are CX leaders understand that Customer Experience isn't something you do, it's something you are. Companies which invest in CX looking for good PR and short-term financial gain are likely to fail on all counts.
It's the beginning of a new year, which means it's time for pundits and prognosticators to pull out their crystal balls and make predictions about the twelve months to come.
Bruce Temkin, for example, has declared that the Customer Experience Theme for 2018 is "Humanity".
Who am I to disagree?
But in my view, such trend articles miss the bigger picture, which is that the important facts of the Customer Experience profession will be pretty much the same in 2018 as they were in 2017, 2016, and earlier years. These are the non-trends, the things that don't change, and most of them are more important than the trends.
So here I present my Customer Experience Non-Trends for 2018. Not only are most of these non-trends more important to the average CX professional than the Trends, you can read these safe in the knowledge that in January 2019 I can just republish the same article with a different date, just as this year's article is the same as my 2017 Non-Trends article, which in turn is the same as my 2016 Non-Trends article.
The companies delivering a great customer experience almost always have leadership actively engaged in continuously trying to deliver a better experience. Conversely, companies where leadership views CX as a one-time project, or something to delegate, generally don't succeed in delivering a superior experience.
The lesson here is simple: if you want to improve the customer experience in your organization, the most important thing you can do is get the senior leadership to care and make it a personal priority.
Sweat the details. A grand strategy or a new piece of technology will not, by itself, move the needle on your customer experience (though the right strategy and tools definitely make the job easier).
Unfortunately, "sweat the details" is not a sexy message and it doesn't help sell software and services. Many vendors make the empty promise that their solution will transform your CX effort. Don't believe it. There is no magic bullet.
The field of Customer Experience has made great strides over the last decade or so, but it's still not easy. We've finally gotten to the point where most companies will at least say that the Customer Experience is a priority, but many of them have yet to internalize it. The leadership doesn't yet care enough to dedicate the needed resources, or they think that because they have a CX team the problem is solved and they can mostly ignore it.
So in a lot of places, the role of the CX professional will continue to revolve around getting leadership attention, finding the easy wins, and internal evangelism. This, unfortunately, is not likely to change any time soon.
The sweet spot of customer experience is when your whole organization is focused on creating a better experience for customers, which makes customers want to do more business with you, and that makes employees want to help customers even more. Customer Experience becomes a positive feedback loop.
The unacknowledged truth is that most employees genuinely want to do a good job and have a positive impact on their customers. It's one of the most satisfying things we can do in our careers. A strong focus on CX creates not just more satisfied customers but also more satisfied employees.
Here's hoping for a terrific 2016 2017 2018!
Not everyone is going to immediately spot the problems I saw with this online survey I got from Discover Card this week. But some people will, especially those who have some knowledge of User Interface design.
When designing this survey someone thought it would be cute to have the selection buttons shade from red to green. I have no idea where this idea came from, but it seems like the sort of thing that might come from a graphic designer and get put in place without bothering to consult anyone who understands user interface or survey design.
The first problem that jumped out at me when I saw this survey is that the grey in the middle for "Neutral" makes it look like the "Neutral" button is disabled (in the screen shot, "Neutral" is selected, which is why its circle is filled in rather than open). It's become a standard part of user interface design to indicate that a control is disabled by greying it out, so at first glance some users might think that Neutral isn't actually an allowed option on this survey.
That's something that could affect the outcome of the survey at the margins. Does it? I have no idea--and I'm guessing that Discover Card didn't calibrate the survey to see if their color choices make a difference. But it's certainly plausible, which is one reason survey design is hard. So many things can affect the results of a survey that you need to be careful to either understand the design choices, or make sure that the analysis and decision-making process is robust in the face of subtle survey biases.
There was another problem I immediately spotted with this survey, one which most people won't notice but which 7%-10% of the male population will immediately see (or, in this case, not see). The particular shades of red and green used in this survey are ones which are hard to distinguish for people with the most common form of color blindness. So for me, and a significant minority of the population, whatever Discover meant to communicate through the colors of the buttons is completely lost because we can't easily tell the difference. Since there are color palettes out there designed to be accessible to colorblind people, and this is another important detail that good User Interface designers know to watch out for, I am again led to the conclusion that Discover didn't really think about their color scheme very carefully before dropping it into their survey.
(Those with normal color vision will probably be surprised to read that before I wrote this article, I actually used Photoshop to verify that the colors in the image really are red and green. It would be embarrassing to get such an important detail wrong.)
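This is also the kind of problem an automated accessibility check can catch. As a rough illustration (not anything from Discover's actual design process, and using made-up color values), here's a minimal Python sketch of the WCAG relative-luminance and contrast-ratio formulas. A red and a green that differ mostly in hue will score a very low contrast ratio, meaning a viewer who can't distinguish the hues has little else to go on:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color given as 0-255 ints."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio between two sRGB colors, from 1:1 up to 21:1."""
    hi, lo = sorted((relative_luminance(color_a),
                     relative_luminance(color_b)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

# Hypothetical button colors: a mid red and a mid green that differ
# mostly in hue. Their luminance contrast comes out well below the
# WCAG 3:1 minimum for distinguishing graphical elements.
red, green = (200, 60, 60), (60, 160, 60)
print(round(contrast_ratio(red, green), 2))
```

A survey tool that ran this kind of check on its answer buttons would flag a hue-only red/green scale immediately; colorblind-safe palettes pair hue differences with clear lightness differences for exactly this reason.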
I sometimes like to say that survey design isn't hard; what's hard is dealing with the people who think it's easy. It's not hard to design a reasonable survey, but there are a number of details which can change the outcome. And because designing a survey looks easy, people will often want to make changes without thinking through the implications. This survey is a great example of a seemingly trivial design choice which might actually affect the data, and which clearly isn't necessary.
If you are one of the lucky few who plunked down the equivalent of a four-year degree to buy a Tesla Model X, you probably already know about your car's special holiday light show.
For those who haven't heard about this before, here's the video. Model X owners can press a special sequence of buttons to get their car to flash its lights and open and close the doors in time to holiday music. It's like one of those over-the-top Christmas light displays, except for your car.
Most car owners--and I'm going to guess most auto industry executives--will probably think, "Why the heck does anyone need their car to perform a musical light show?"
But evidently that's not what Elon Musk thought. Because at Tesla, it's apparently completely rational to invest a meaningful amount of design and engineering resources in making the car do something completely useless and utterly frivolous just because it's cool. We know this because the light show isn't the only wacky feature programmed into Teslas: there's a whole web site devoted to Tesla's easter eggs.
There's a lesson here about how the customer experience plays into a company's brand. Most companies don't spend much time and effort being playful or frivolous. But for most people, that time spent having fun, doing things just because, is very important. It's what we enjoy most, it's what we wish we had more of, and in some ways it's what makes us human. Play is even recognized as an important part of mental health.
So when a company shows that it, too, can be playful, it helps humanize the company. It shows that the business isn't just a giant faceless bureaucracy, but rather people with personality. And that makes us like the company more.
There are a lot of things you can do with your customer feedback program to help improve Customer Experience. There are also a lot of things you can do that don't generally improve CX at all.
But there's one piece of the feedback program that stands far above everything else in effectiveness.
If you want to use your customer feedback to improve CX, the first thing you should do is deliver qualitative feedback to customer-facing employees as fast as possible.
It's the customer's comments and suggestions that actually help your frontline employees understand what customers expect and how to deliver it. Other deliverables from a feedback program don't provide this:
The unfortunate and ironic truth is that there are a lot of customer survey programs where delivering customer comments and suggestions directly to low-level employees is an afterthought, or not even part of the program. This typically happens in large organizations with highly structured and formalized customer surveys, usually designed solely around leadership's desire to measure performance.
Of course, getting customer feedback into the hands of employees is not the only thing you can or should do to build an effective survey. You should also have a robust closed-loop process, coach employees on how to use the customer feedback, regularly update the survey program to meet evolving business goals, and so forth.
But if you want to gauge how effective a feedback program is at actually moving the needle, the first question to ask is, "If a customer leaves a survey comment, how long does it take before the employees who worked with that customer get that feedback?" In many cases, the answer to that question will tell you all you need to know.
I started out writing a list of the top-ten mistakes companies make when designing and executing their customer feedback programs, but found I had a hard time stopping at just ten. To paraphrase an old saying, most successful VoC programs are pretty much the same, but each failed program fails in its own special way. And there are a lot of VoC programs out there that fail to deliver value, fail to meet the program goals, and fail to do much of anything other than suck up a lot of time and resources.
So having failed to write a top-ten list, I present to you instead my top 25 mistakes in VoC programs. I could have probably done a top-100 list, but nobody would have kept reading that far.
The Minneapolis Chapter Meeting of the CXPA featured a panel discussion this week for Customer Experience Day. Four Customer Experience luminaries from the Twin Cities area fielded questions from a packed audience for the better part of an hour, but the very last question stood out.
"What is the most important Customer Experience metric?"
This prompted chin-scratching, discussion of the relative merits of common survey metrics like NPS and Customer Effort, and general consensus that no one metric is ever going to give the whole picture--along with the important observation that if you're focused on finding the right metric, you're probably doing CX wrong.
I was not part of the panel (I'm not nearly luminous enough), but if I had been, my answer would have been different. Because I believe there is one metric that stands out above all others in measuring the progress of a company's Customer Experience efforts and predicting future success in harnessing all the financial and market benefits of being a customer-centric organization.
My metric, the One Metric to Rule Them All, is simple: The amount of attention a company's C-Suite leadership pays to Customer Experience.
I admit that I haven't actually tested this metric in the real world. Nor do I know anyone else who has--though I will cheerfully buy a beer for anyone cheeky enough to suggest performing a time-tracking study on their company's C-Suite executives.
But everyone I talked to agreed that a measurement of senior leadership attention is likely to outperform NPS, CSAT, Customer Effort, and just about any other customer-facing metric you might care to devise. Leadership focus on Customer Experience is the most critical element of a successful CX program: if you've got the C-Suite pulling for CX, everything else tends to fall in place. But if the leadership is indifferent, then the whole program is going to be an uphill struggle.
The other piece of this is that the leadership needs to pay attention directly, not just spend money and delegate CX to a team. In most large organizations attention is scarcer than money, and it quickly becomes apparent what the company actually cares about versus what it merely thinks it should be doing.
So if you want to gauge the success of your CX program, there are many survey metrics you can use. But the truest measure will be to look inside and see how much time and attention you're getting from the most senior leadership.