Rules of Thumb for Survey Length

Issue 42, November 2009

In This Issue

  * Looking for VUI Design Experiments
  * Rules of Thumb for Survey Length

Looking for VUI Design Experiments

Over the past year, Vocalabs conducted experiments in voice user interface design in partnership with three different companies. We presented the results at the 2009 SpeechTEK conference and wrote about them in our last newsletter.

This was a hugely successful undertaking to improve the state of the art in VUI design. It's hard to get speech vendors to spend money to perform carefully controlled experiments and publish the results for the benefit of the community at large; it's even harder to get individual clients to take such a civic-minded approach for their own projects.

Nevertheless, most of us can agree that the speech industry benefits greatly from a more rigorous, open, and transparent knowledge base of VUI design practices. This leads to better designs, more user acceptance, and greater project success.

Vocalabs is interested in seeing better design (and more emphasis on testing, of course), and with the success of last year's experiments we're going to step up to the plate again. Vocalabs will perform free usability surveys for any designer or company testing VUI design practices for the benefit of the community as a whole.

Here are the criteria:

  1. You have to have a well-designed experiment testing a design practice of general interest, not a question relevant only to a particular design.
  2. You have to commit to analyzing the data and publicizing it to the VUI design community as a whole.
  3. You have to find someone to host the test application(s) (we can't do that ourselves).
  4. The raw data will be made available to all interested parties in the VUI design community (peer review and alternate analysis are especially welcome).
  5. Vocalabs has to be identified as a sponsor of the research in all publications, presentations, etc.

If you're interested or have questions, contact Peter Leppik.


Rules of Thumb for Survey Length

The length of a customer satisfaction survey determines how many different pieces of the customer experience you can measure. Just as important, it affects the completion rate, the dropout rate, and whether customers come out of the survey with a positive impression.

Crafting a survey always involves balancing competing interests, and there's a temptation to throw the kitchen sink into a new survey, asking every possible question so as not to have to tell any constituency "No." This approach has led to a lot of very bad surveys: much too long, with repetitive questions and poor response rates.

At the other end of the spectrum is the idea of a one-question survey. This is most often associated with the Net Promoter question, though just about any satisfaction-related question will serve the same purpose. A one-question survey is a useful antidote to the kitchen sink survey, but it potentially misses a lot of valuable information for root cause analysis and impact-opportunity analysis.

The challenge is to strike a balance: gather enough feedback from each customer to support analysis, without making the survey so long that customers drop out partway through or refuse to take it at all. In our work we've found that, all else being equal, shorter is usually better, and that there's generally a point at which survey completion rates start to drop off.

Live Interviews: Five Minutes

A tightly constructed live interview can fit 20-30 questions into five minutes, which works out to roughly 10-15 seconds per question, and we've found that customers are more likely to refuse a survey when told it will take six minutes or longer. Often the survey design challenge is trimming the individual questions down to the point where they flow smoothly and take only a few seconds each. Avoid the temptation to reuse overly wordy questions from a written survey when simple yes/no or free-response questions will suffice and go much faster.

IVR Surveys: Five Questions

Customers have less patience for an automated survey than for a live interview, and are more willing to simply hang up when they decide the survey has gone on too long. Five questions seems to be about the limit before the dropout rate starts increasing significantly, but a badly designed survey can make people hang up sooner, and a well-designed one can convince them to stay on longer.

Web Surveys: 15 Questions on One Page

Online surveys are so cheap and simple to put together that the temptation to build a "kitchen sink" survey can be overwhelming. Many surveys today use a multi-page format with only one or two questions per page and little feedback to the customer about how much of the survey remains. Each page you make the customer click through increases the odds that the customer will drop out at that point, and reinforces the feeling that the survey simply never ends. We've found that putting 15 questions on a single web page works very well: the dropout rate is low, and nearly everyone who starts the survey answers all the questions.
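
To see why per-page dropout compounds, here's a back-of-the-envelope sketch (our illustration, not a formal Vocalabs model): if some fixed fraction of respondents abandons the survey on every page they're asked to click through, the overall completion rate falls off geometrically with the number of pages. The 3% per-page figure below is purely hypothetical.

    # Hypothetical model: each page load carries a fixed 3% chance the
    # respondent abandons the survey. Real dropout varies with design.
    def completion_rate(pages: int, per_page_dropout: float = 0.03) -> float:
        """Expected fraction of starters who finish a survey spread over `pages` pages."""
        return (1.0 - per_page_dropout) ** pages

    # The same 15 questions, packed onto one page vs. split across 15 pages:
    print(f"1 page:   {completion_rate(1):.0%} finish")   # 97%
    print(f"15 pages: {completion_rate(15):.0%} finish")  # ~63%

Under that simplifying assumption, spreading 15 questions across 15 pages cuts completions from about 97% to about 63% of starters, which is one reason the single-page format holds up so well.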

Printed Surveys: One Sheet of Paper

Doing a written survey on an actual piece of paper is quickly going the way of disco and white polyester suits, but a few of these animals still exist. Putting the entire survey onto a single sheet of paper gives the perception that it's only a minimal imposition on the customer's time. Many of the written surveys we see today are down to postcard size with only a handful of questions, which makes the hurdle to completing the survey not the time it takes but simply remembering to do it at all. Oh yeah, and you should never make the customer pay for postage.

Our rules of thumb represent the point where we start telling a client that a survey is too long and needs to be trimmed. There is still plenty of room in the world for longer-form surveys and interviews, but for customer satisfaction measurement and tracking, these lengths help maximize the response rate without spending too much time or money on recruitment.