We get a lot of questions at Qualtrics about survey fatigue. It's a common story: you're getting great insights from your customer surveys, you want more, so you send more surveys. All of a sudden your response rates start dropping, and the quality of your insights suffers as a result.

Welcome to the world of survey fatigue.

What is survey fatigue?

Survey fatigue is what happens when your audience becomes bored or uninterested in your surveys and it normally happens in two ways:

  • Before taking the survey — overwhelmed by the sheer volume of requests for feedback, customers decide not to even begin your survey. The result is a drop in response rates as fewer customers decide to give you feedback.
  • During the survey — this happens after someone has started the survey and is usually caused by poor survey design such as including too many questions, a high proportion of open text fields or asking the same question repeatedly. As a result, respondents can drop out midway through or lose interest and speed through, giving you inaccurate data.

Believe it or not, customers don’t love surveys quite as much as you do. Yes, they’re happy to respond and give their feedback, but it’s important to remember that every time you request feedback, you’re asking them to put in effort.

And the more effort you ask a customer to put in, the less likely they’re going to be to want to do it.

How often to send customer surveys

There’s no hard and fast answer here, unfortunately — how often you send survey invites depends entirely on the types of questions you’re asking, how many, and who you’re sending them to.

It all comes back to effort again. Answering a survey every day for a company you only interact with every few weeks is high effort. Responding once a month to a company you interact with every week is much lower effort.

In B2B, best practice is to send your customer surveys quarterly. A B2B audience typically interacts with a company much less frequently than a B2C audience, so it makes sense that you should reach back out to them much less frequently.

For B2C, a good rule of thumb is to take the typical interval between customer interactions and double it. So if your customers interact with you monthly, you can send your survey every two months; if they interact with you weekly, send it every two weeks.
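The "double the interaction interval" rule of thumb can be expressed in a few lines. This is a minimal illustrative sketch; the function name and numbers are made up for this example and are not part of any Qualtrics product or API.

```python
def survey_interval_days(interaction_interval_days: int) -> int:
    """Suggested gap between survey invites: twice the typical
    gap between customer interactions (a rule of thumb, not a law)."""
    return interaction_interval_days * 2

# Customers who interact monthly -> survey roughly every two months
print(survey_interval_days(30))  # 60
# Customers who interact weekly -> survey roughly every two weeks
print(survey_interval_days(7))   # 14
```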

As with any rule, though, there's always an exception. For transactional surveys, such as after an eCommerce purchase, you can ask for feedback with every transaction. At that frequency, however, the survey needs to be ultra-low effort, so keep it to a maximum of four questions if you want to avoid survey fatigue creeping in.

Design your survey questions to avoid mid-survey fatigue

So that’s survey frequency covered, but what about the design of your survey — how can you ensure it’s low effort enough to reduce the risk of fatigue halfway through?

Here are five useful things to consider when designing your survey questions to help reduce effort and minimize the risk of fatigue:

1. Ask direct questions

If respondents don’t understand what you want from them, they have to think harder about their answers. So keep your questions direct and unambiguous.

2. Ask one question at a time

It may sound obvious, but it’s very easy to fall into the trap of conflating two questions into one.

For example, ‘What is the most stylish and affordable clothing brand?’ contains two questions. It’s better to ask two separate questions, ‘What is the most stylish clothing brand?’ followed by ‘What is the most affordable clothing brand?’

This ensures the respondent stays focused and doesn’t have to think too carefully about their answers.

3. Provide mutually exclusive choices

To reduce effort, respondents need to be able to make clear and obvious choices. When it comes to multiple choice responses, one of the common mistakes we see is not having mutually exclusive choices.

For example:

Please select your annual salary:

  • $15,000 – $25,000
  • $25,000 – $35,000
  • $35,000 – $45,000

In this example, which option would someone choose if they earned exactly $25,000? The overlapping ranges create confusion and ambiguity, and demand more mental effort from the respondent.

Instead, give them very clear choices:

  • $15,000 – $24,999
  • $25,000 – $34,999
  • $35,000 – $44,999

In the second example, regardless of their salary, the respondent will only ever fit into one group, making it a much simpler task and reducing effort.
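The difference between the two sets of bands can be made concrete with a small sketch. This is purely illustrative (the helper name and salary figures are made up); it just shows that an edge-case salary matches two overlapping bands but exactly one exclusive band.

```python
# Overlapping bands from the first example vs exclusive bands from the second
overlapping = [(15_000, 25_000), (25_000, 35_000), (35_000, 45_000)]
exclusive = [(15_000, 24_999), (25_000, 34_999), (35_000, 44_999)]

def matching_bands(salary, bands):
    """Return every band (inclusive on both ends) a salary falls into."""
    return [band for band in bands if band[0] <= salary <= band[1]]

# A respondent earning exactly $25,000:
print(matching_bands(25_000, overlapping))  # two matches -> ambiguous
print(matching_bands(25_000, exclusive))    # exactly one match -> clear
```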

4. Limit the number of open text fields

Open text fields are great for researchers: you get to hear, in customers' own words, how they feel about the experience.

But remember, filling in an open text field takes more effort than selecting a multiple choice answer. So only use them where open text is essential.

If you're using an open text question as a follow-up to another question (for example, to find out more about why someone chose a particular option), you can use display logic to present it only to the respondents it's most relevant to. Think carefully about your intention for each question: if you want to gather open text responses so your customer service team can follow up with unhappy customers, consider only requesting an answer from those who give a certain score (e.g. they select 'somewhat dissatisfied' or 'extremely dissatisfied').
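The display-logic idea above boils down to a simple condition. The sketch below is hypothetical, not a real Qualtrics API; it only illustrates gating an open-text follow-up on the previous answer.

```python
# Only show the open-text follow-up to respondents whose previous
# answer indicates dissatisfaction (illustrative names throughout).
LOW_SCORES = {"somewhat dissatisfied", "extremely dissatisfied"}

def show_followup(satisfaction_answer: str) -> bool:
    """True if the 'tell us what went wrong' open-text question
    should be displayed for this respondent."""
    return satisfaction_answer.lower() in LOW_SCORES

print(show_followup("Extremely dissatisfied"))  # True
print(show_followup("Somewhat satisfied"))      # False
```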

5. Use consistent scale points and structures

When asking respondents to rate something on a scale, whether a 5-point Likert scale or a numeric scale like 1–10, make sure you present these consistently. Avoid mixing 5-point and 7-point scales in the same survey, and try to stick to the same numeric scales throughout.

Another thing to remember is to keep the order the same. Say, for example, your first scale question was:

How satisfied were you with the order process?

  • Extremely satisfied
  • Somewhat satisfied
  • Neither satisfied nor dissatisfied
  • Somewhat dissatisfied
  • Extremely dissatisfied

And then your second question was:

How satisfied were you with the delivery times offered?

  • Extremely dissatisfied
  • Somewhat dissatisfied
  • Neither satisfied nor dissatisfied
  • Somewhat satisfied
  • Extremely satisfied

Mixing up the order like this creates confusion and forces the respondent to think harder about every answer they give.
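One practical way to guarantee consistent ordering is to define the scale once and reuse it for every question. A minimal sketch with made-up structures (not a Qualtrics feature) makes the point:

```python
# Define the satisfaction scale once, in one canonical order,
# and reuse it so every question presents identical options.
SATISFACTION_SCALE = [
    "Extremely satisfied",
    "Somewhat satisfied",
    "Neither satisfied nor dissatisfied",
    "Somewhat dissatisfied",
    "Extremely dissatisfied",
]

questions = [
    {"text": "How satisfied were you with the order process?",
     "choices": SATISFACTION_SCALE},
    {"text": "How satisfied were you with the delivery times offered?",
     "choices": SATISFACTION_SCALE},
]

# Both questions now share the same option order automatically.
print(questions[0]["choices"] == questions[1]["choices"])  # True
```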

Want more survey best practice tips? Check out our Question Design Handbook

Use targeting and segmentation to reduce survey fatigue

A great way to combat survey fatigue is personalization, which has been shown to increase response rates and reduce mid-survey drop-off.

It's easy to see why, too. By sending the right surveys, to the right people, at the right time, you're targeting the people most likely to respond. Just as importantly, you're not sending surveys to people who are unlikely to respond and who risk fatigue if you keep sending them.

The technology available today helps you deliver personalization at scale, from simple contact management software that lets you control frequency and opt-outs, to fully featured platforms like XM Directory that provide a detailed view of every customer and their interactions over time.

By building rich profiles and segments, you can start to understand customer preferences and interactions. For example, you could look at how often they usually respond to feedback requests, ensure you’re not asking them about the same topic multiple times, or you could even customize the questions they see based on their past activity to help reduce effort on their part.

Similarly, you could reduce effort on the respondent’s part by sending them multiple, shorter surveys about specific topics – each one is a small effort, and on the back end you can see all their responses together, so you still get a complete view of their experience.

This kind of approach, rather than one long 'catch-all' customer survey, can help you reduce fatigue and build up a better picture of your customers over time.

See How the XM Directory is Doubling Response Rates

Book a Demo