We've published the 2016 results of the National Customer Service Survey on Banking. This ongoing research tracks the customer service quality at Bank of America, Chase, Citi, and Wells Fargo in a continuous data set going back to 2011.
The most striking change in our survey scores in 2016 is the continued slide in Citi's metrics, which have declined substantially over the past two years. Citi now ranks last in eight of the nine survey scores we track.
We don't know why Citi has seen such a drop in our survey, but we speculate that it might be related to Citi's partnership with Costco. It's been widely reported in the media that onboarding the Costco customers took longer and was more of a challenge than had been expected. If Citi had been diverting resources from its regular customer service to manage that transition, that would explain the results we're seeing. If that's the case, we would also expect to see Citi's numbers rebound now that the Costco transition is complete.
The other interesting result this year is something we didn't see: no decline in survey scores at Wells Fargo despite the company's fraudulent account scandal. We think there are two reasons. First, news of the fraudulent accounts broke midway through 2016, meaning that about half our survey sample was collected before it became widely known. Second, our survey is intended to measure the quality of customer service, not reputation. Unless a specific Wells Fargo customer in our survey had been personally impacted by the company's fraud, he or she isn't likely to rate the customer service any lower as a result of the news.
The executive summary of this report is available for free download, and the full data including customer comments and interview recordings is available by subscription.
Language Log has coined the word "Nerdview" to describe the common situation of writing in technical terms for a non-technical audience. Nerdviews are all around us, to the point where we hardly even notice them anymore.
For example, asking you to input the "CVV2" on your credit card to process an online payment is a classic nerdview, since unless you are in the payment industry you would have no reason to know what a CVV2 is or how to find it. This is so common that most people have figured out that the CVV2 is the three-digit security code on the back of a credit card--even though most cards don't label the code "CVV2" or give any other indication of what it's for.
Nerdviews almost always lead to a worse customer experience, and should be avoided whenever possible. But that can be a challenge because oftentimes the people who have to decide how to communicate important information to the general public are the same ones who are experts in their own narrow field. It can be hard to step outside your expertise and think like a novice.
My personal favorite nerdview was one I encountered many years ago at Vocalabs. One of our clients was an insurance company, and in their automated customer service system one of the prompts asked, "Do you want to know your withdrawal value or your redemption value?" Unless you are in the insurance business, chances are you don't know that a life insurance policy can have two different values, much less what they mean.
Finding and avoiding nerdviews is almost always worthwhile, but it can be a challenge. Nerdviews tend to become invisible to us once we figure out what they mean. Humans are adaptable, and even if you were mystified the first time a government form asked you for your "DOB", chances are the next time you remembered that it meant "Date of Birth" and hardly even noticed.
Here are some tips for uncovering nerdviews:
And remember that even if you don't think you're a nerd, we are all nerds about something. Remember your audience, and stay away from the nerdview.
Imagine taking a college class, and at the beginning of the semester the professor announces, "For this class, we're not going to be handing back any of your papers or exams, and we won't tell you any of your grades on individual assignments and tests. The only grade you'll get is your final grade at the end of the semester, which will be an average of all your work."
You wouldn't expect to learn much in this class. In order to improve, you would want to know throughout the semester what you were doing well and where you needed to improve. You would want specific feedback about specific things you had done.
And yet many customer feedback programs are structured just like this insane professor's class. Somehow we expect employees to know how to improve despite only getting an average survey score every month or every quarter.
In order to make a survey program helpful, we need to give people the chance to connect specific customer feedback to specific things the employee did to garner that feedback. We also need to help employees think about the feedback as constructive criticism so they have the tools to apply the feedback to their daily customer interactions.
Here are some tips to help make this happen:
There are, of course, real concerns about managing the delivery of customer feedback to employees. But the solution is better coaching and supervision, not giving people so little feedback that it becomes useless.
If you're thinking about using the Net Promoter Score (NPS) on a business-to-business survey, there are some extra factors you should consider before committing to this metric.
Net Promoter Score is a common, and somewhat controversial, measurement of a customer's relationship to a company. It's based on responses to the question, "How likely are you to recommend us to a friend or colleague?", and is calculated by subtracting the percentage of low responses from the percentage of high responses. It has the advantage of being a simple and standardized metric, but also gets heavily criticized for being too simplistic and oversold. NPS also tends to get shoehorned into situations where it doesn't make any sense, an unfortunate side-effect of being heavily (and wrongly, in my opinion) promoted as "the only survey score that matters."
The Net Promoter question asks the customer to imagine a scenario where they might be asked to make a recommendation about some product or service. The idea is that telling a friend to buy from a particular company is something most people will only do if they feel strongly about that company. Customers willing to make that commitment are valuable and potentially a source of word-of-mouth marketing (in the jargon of NPS, they are "Promoters"); whereas customers who feel the opposite can actively damage a brand (these are called "Detractors"). Subtracting the percentage of Detractors from the percentage of Promoters gives you a single simple number, while emphasizing the fact that you want to have a lot more Promoters than Detractors.
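The arithmetic is simple. On the conventional 0-10 scale, respondents scoring 9-10 are Promoters and 0-6 are Detractors (7-8 are "Passives," who count toward the total but cancel out of the subtraction). A minimal sketch:

```python
def net_promoter_score(responses):
    """Compute NPS from a list of 0-10 'likelihood to recommend' responses.

    Promoters score 9-10, Detractors 0-6; Passives (7-8) count toward
    the total but drop out of the subtraction. Result ranges -100..100.
    """
    if not responses:
        raise ValueError("need at least one response")
    total = len(responses)
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100.0 * (promoters - detractors) / total

# Example: 5 Promoters, 3 Passives, 2 Detractors out of 10 responses
scores = [10, 9, 9, 10, 9, 8, 7, 7, 3, 6]
print(net_promoter_score(scores))  # 30.0
```

Note that very different response distributions can produce the same score, which is one of the standard criticisms of the metric.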
In the Business-to-Consumer world this all seems sensible. But in the Business-to-Business world there's additional complexity.
A number of common situations challenge the validity of NPS in the B2B world:
In short, NPS is already simplistic in the B2C world, and trying to apply it to more complex B2B relationships is a challenge.
This doesn't mean that NPS is useless in the B2B world. Fortunately, we've found that in the real world most people don't answer the Net Promoter question literally (though some do)--as demonstrated in the story from a few years ago about what happens when you try to get Promoters to actually recommend the product to a friend.
Instead, most people answer the Net Promoter question in a more generic way, providing a high level view of how they feel about the company overall. The question is being interpreted as something closer to, "How would you rate the company?" This is confirmed by the fact that on surveys where we ask both a Net Promoter question and a more generic Customer Satisfaction question, the answers tend to correlate almost perfectly.
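That near-perfect correlation is easy to check on your own data by computing Pearson's r between paired responses to the two questions. A minimal sketch with hypothetical response data (the numbers below are made up for illustration):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired responses from the same ten customers:
recommend = [9, 10, 6, 8, 3, 10, 7, 5, 9, 2]    # 0-10 NPS question
satisfaction = [4, 5, 3, 4, 2, 5, 4, 3, 5, 1]   # 1-5 CSAT question
print(round(pearson(recommend, satisfaction), 2))  # 0.97
```

A correlation this close to 1.0 suggests the two questions are measuring essentially the same underlying attitude.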
This means we can get away with using Net Promoter even in situations where it might not strictly apply, because most customers will know what we're trying to get at and answer the question behind the question. We still get some people who try to respond to the literal question (the "I'm not allowed to make recommendations" crowd), but they are in the minority.
I would not generally recommend using NPS in a B2B survey--instead, I would ask a more tailored question that gets at what you're really trying to understand in the customer relationship. But if you do choose to use NPS for B2B (or if you're required to use it), keep these things in mind:
Paying attention to these will help you get the most out of NPS in your B2B survey.
We've published the 102nd issue of our newsletter, Quality Times. This month I admonish the reader to stop and think before collecting useless data, and discuss what kinds of data are more or less useful.
Email subscribers should have already received their copies. If you want to get our newsletter every month as soon as it comes out, you can subscribe here.
One of my personality quirks is that I tend to have a very low tolerance for organizational stupidity. In other words, it really bugs me when bureaucratic organizations make me do dumb things for no good reason.
As an example, this week we decided to order some custom-imprinted M&Ms as a cute item to give to our clients. We wanted to get candies with our logo and some customer experience-themed slogans, and Mars, the company that makes M&Ms, has a slick website for designing and ordering your custom candy.
Except that the convenient online ordering is only for consumer orders. If you're a business you can't place an order; you can only download a brochure and ask for someone to contact you. And so begins the, ahem, unique process for ordering custom M&Ms as a business.
(You may wonder why I didn't just use the online form and pretend to be a consumer. The choices and pricing are the same either way, but orders placed online aren't allowed to include logos, though other pictures are permitted.)
Filling out the online form results in an email from an M&M sales rep a day later, with a brochure of product choices -- the same brochure you can download from the M&M website.
Once you've decided what you want, you need to email the M&M rep with all the order details -- exactly the same information the online ordering tool collects, except that as a business you can't use the online ordering tool. And you can't include your credit card information in the email, because that's not secure. So someone from M&M will schedule a time to call you on the phone for the sole purpose of getting your credit card number.
Finally, you will need to sign and return a contract, emailed to you as a PDF file with instructions to fax or email back the signed copy.
I was already getting a little triggered by having to execute a four-page contract just to order candy. But I dutifully filled out the blanks and applied my electronic signature -- an image of my handwritten signature -- to the PDF and emailed it back.
This, I was promptly told, was unacceptable. Electronic signatures were not permitted, I needed to print out the document and scan a signed copy, and return that.
From a legal perspective this is complete nonsense. Under U.S. law, an electronic signature is as valid as a paper signature, and it's been that way since the ESIGN Act was passed in 2000. What's more, pretty much anything counts as an "electronic signature" as long as it was done with the intent of signing the document. No fancy cryptographic protocols are required.
So I printed out the already-signed contract, scanned it back in, and emailed the scanned document back. I will admit that this was a little cheeky on my part, but since the result is exactly the same as if I had signed a paper copy with a pen (remember, my electronic signature is just a scanned image of my handwritten signature), I figured this would make them happy.
It didn't. Still not acceptable, because they noticed that the signature in the scanned document was the same image I used in the electronically signed document. Busted! The only acceptable way to sign the contract was to print an unsigned copy, sign it by hand, and return a scan of the paper document via fax or email.
One other thing: After making me kill a tree to sign a paper copy of the contract, M&M would not even allow me to mail them the paper copy. Fax or email, nothing else. Apparently their offices are paperless, even if we're not allowed that luxury.
I did what I had to, because even this monumental level of organizational stupidity wasn't going to keep me from ordering custom-imprinted M&Ms. I removed my electronic signature from the document (which was not an easy task: the software I use to sign PDF files makes it very hard to remove the electronic signature once applied to a page), printed a hard copy, signed it with a pen, and scanned that copy back in to return. I even recorded myself signing the paper and included an animated GIF of the momentous event.
(As an aside: After all this effort, I have not yet received a copy of the countersigned and fully executed contract from Mars. I have my doubts about whether they bother going through the same process on their end that they force their customers through.)
So what's the lesson in all of this? In the end I did order the custom M&Ms, but it was a frustrating and difficult process. What would have taken under ten minutes on their consumer site wound up taking two days because of all the back-and-forth and inefficiency of working through email, and because of their excessively rigid application of obsolete rules about contracting.
While they may have gotten my business this time, in the future I'm likely to look elsewhere.
Apple's new laptops have been generating complaints about the battery meter. The "time remaining" display has a bad habit of jumping all around and not giving the user meaningful information about how much longer they can actually keep using the computer.
Getting this display right is a tricky problem, and it's a nice simple example of a situation that's common to a lot of dashboards and data visualization. The challenge is that you are trying to communicate a relatively simple and actionable message about a very complicated underlying system, where the person receiving the message isn't an expert and can't be expected to become an expert.
In the case of Apple's battery meter, the user wants to know roughly how long he can keep using the laptop before plugging in. But the complicated reality is that the laptop's power usage can vary second-to-second, and it's not always obvious what's driving the changes. You may be happily surfing the web and barely sipping the battery, but should you visit a page with a lot of animations (or worse--scripts to track your web viewing and serve you ads) that suck up the CPU, your battery usage will spike and time available will plummet.
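One common way to tame a jumpy estimate like this (I don't know what Apple's actual algorithm is, so this is just an illustration of the general technique) is to smooth the measured power draw, for example with an exponential moving average, so a momentary CPU spike nudges the "time remaining" figure rather than whipsawing it. A minimal sketch, assuming we can sample the remaining charge in watt-hours and the instantaneous draw in watts:

```python
class TimeRemainingEstimator:
    """Smooths instantaneous power draw with an exponential moving
    average (EMA) before dividing it into the remaining charge.

    alpha controls responsiveness: closer to 1 tracks spikes quickly,
    closer to 0 gives a steadier (but slower-reacting) estimate.
    """
    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.smoothed_watts = None

    def update(self, remaining_wh, draw_watts):
        """Return estimated hours remaining given current charge
        (watt-hours) and an instantaneous power sample (watts)."""
        if self.smoothed_watts is None:
            self.smoothed_watts = draw_watts  # seed with first sample
        else:
            self.smoothed_watts = (self.alpha * draw_watts
                                   + (1 - self.alpha) * self.smoothed_watts)
        return remaining_wh / self.smoothed_watts

est = TimeRemainingEstimator(alpha=0.1)
# Steady browsing at ~10 W, then one 40 W spike from a heavy web page:
for watts in [10, 10, 10, 40, 10]:
    hours = est.update(remaining_wh=50, draw_watts=watts)
# The spike pulls the estimate down modestly instead of collapsing
# it to the naive 50 / 40 = 1.25 hours.
```

The trade-off is inherent: a heavily smoothed estimate is stable but slow to notice that the user really did start a sustained heavy workload, which is exactly the tension the next paragraph describes.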
Juice Analytics took a look at this problem recently, and provided some different ways to better communicate the nuances of laptop battery life. In all likelihood, none of the options will be completely satisfying to the typical user who just wants to know if he has enough battery to watch The Matrix to the end.
But just like in the business world, where leadership may want simple answers to complex questions, sometimes it does a real disservice to give people the data they think they want. The challenge is to find a simple way to communicate the data they actually need.
Every now and then you hear about a customer-hostile practice that's so outrageous you wonder how they stay in business.
There's a used book store in England that charges a 50-pence "browsing fee" just to walk in the door. Apparently the proprietor doesn't like people who look at the books without buying anything, and this fee is intended to separate the serious buyers from people who just want to deplete the oxygen in the room.
Clearly this used bookstore has so many customers that they need to do something to keep the unruly mobs at bay. Either that, or they don't want to actually sell any books. Since customers have been complaining about the fee online since at least 2010, it seems that this practice hasn't managed to put them out of business very quickly.
It's the beginning of a new year, which means it's time for pundits and prognosticators to pull out their crystal balls and make predictions about the twelve months to come.
Bruce Temkin, for example, has identified "Purpose" as the Customer Experience theme of 2017.
Who am I to disagree?
But in my view, such trend articles miss the bigger picture, which is that the important facts of the Customer Experience profession will be pretty much the same in 2017 as they were in 2016 and earlier years. These are the non-trends, the things that don't change, and most of them are more important than the trends.
So here I present my Customer Experience Non-Trends for 2017. Not only are most of these non-trends more important to the average CX professional than the trends, but you can also read them safe in the knowledge that in January 2018 I can just republish the same article with a different date, just as this year's article is the same as my 2016 Non-Trends article with a new date and a few details changed.
The companies delivering a great customer experience almost always have leadership actively engaged in continuously trying to deliver a better experience. Conversely, companies where leadership views CX as a one-time project, or something to delegate, generally don't succeed in delivering a superior experience.
The lesson here is simple: if you want to improve the customer experience in your organization, the most important thing you can do is get the senior leadership to care and make it a personal priority.
Sweat the details. A grand strategy or a new piece of technology will not, by themselves, move the needle on your customer experience (though the right strategy and tools definitely make the job easier).
Unfortunately, "sweat the details" is not a sexy message and it doesn't help sell software and services. Many vendors make the empty promise that their solution will, by itself, transform your CX effort. Don't believe it. There is no magic bullet.
The field of Customer Experience has made great strides over the last decade or so, but it's still not easy. We've finally gotten to the point where most companies will at least say that the Customer Experience is a priority, but many of them have yet to internalize it. The leadership doesn't yet care enough to dedicate the needed resources, or they think that because they have a CX team the problem is solved and they can mostly ignore it.
So in a lot of places, the role of the CX professional will continue to revolve around getting leadership attention, finding the easy wins, and internal evangelism. This, unfortunately, is not likely to change any time soon.
The sweet spot of customer experience is when your whole organization is focused on creating a better experience for customers, which makes customers want to do more business with you, and that makes employees want to help customers even more. Customer Experience becomes a positive feedback loop.
The unacknowledged truth is that most employees genuinely want to do a good job and have a positive impact on their customers. It's one of the most satisfying things we can do in our careers. A strong focus on CX creates not just more satisfied customers but also more satisfied employees.
Here's hoping for a terrific 2017!
We just published the holiday edition of Quality Times, our semi-regular newsletter. As we often do this time of year, this issue has a lighthearted take on Christmas and the customer experience.
As always, I hope you enjoy the newsletter. If you want to get new issues as soon as they come out, please subscribe using the form next to the newsletter.
Let us put our expertise in customer feedback to work for you.