Creating online surveys is as much an art as it is a science.

It requires attention to detail in the design and flow of your survey, and creating one that yields actionable insights can be difficult.

Effective survey design and flow give power to your research. But great questions are the foundation for great research.

There are fundamental best practices for creating survey questions that all researchers must know.

Here are 4 tips for creating surveys that work.

1. Keep It Simple

Do you remember taking the SAT or ACT? It’s a long and boring process.

Your average survey respondent can start to feel that way about 15 minutes into a survey. Fifteen minutes is a good upper limit for most surveys.

When a survey is too long, three bad things can happen:

  1. Respondents drop out: They simply quit taking the survey. It costs money to find respondents, and a high drop-out rate not only costs a lot but can also hurt the quality of your results. Offering a reward for completion can reduce drop-outs, but you can’t eliminate them entirely.
  2. People stop paying attention: Remember your elementary-school classmate who just filled in random bubbles during a test? He grew up. If it takes too long to take your survey, he might do it again. We actually see this a lot, and encourage researchers to use attention filters.
  3. Clients get angry: The irony of upsetting customers with an overly long satisfaction survey is not lost on your respondents.

The best way to collect quality data is to keep your surveys short, simple, and well organized.

2. Use Scales Whenever Possible

Scales are more than a little important. They’re the subject of an entire chapter of my book, which you can download for free.

Rather than asking respondents a basic yes or no question, use scales that measure both the direction and the intensity of opinions.

This is critical for research.

Someone who “Strongly Supports” a decision is very different from someone who only “Slightly Supports” it.

Scales extend the power of analysis from basic percentages to higher-level analyses based on means and variance estimates (such as t-tests, ANOVA, and regression).
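As a quick illustration (the groups and response values below are invented), here is the kind of comparison a 5-point scale makes possible in Python with SciPy; a yes/no question would only let you compare percentages:

```python
# Hypothetical example: 5-point scale ratings from two customer groups.
# With scaled responses you can compare group means with a t-test; a yes/no
# question would only support comparing percentages.
from scipy import stats

group_a = [5, 4, 4, 5, 3, 4, 5, 4]  # e.g., customers on the new support plan (made up)
group_b = [3, 2, 4, 3, 3, 2, 4, 3]  # e.g., customers on the old support plan (made up)

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"Group A mean: {sum(group_a) / len(group_a):.2f}")
print(f"Group B mean: {sum(group_b) / len(group_b):.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```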

Use scales whenever you can. You will get more information from each question.

3. Keep Coded Values Consistent

Every survey answer option is coded as a numeric value, which is then reported as a percentage of responses or summarized as a mean, median, range, and so on.

These values are the basis for analysis.

  • Mean: Often referred to as an average, it is the sum of all the values divided by the number of values.
  • Median: The middle point in a data set. To determine the median, lay out a distribution from lowest to highest and select the middle value.
  • Range: The highest and lowest data points in a distribution form the range.
  • Variance: A dispersion measure of how far a set of numbers is spread out.
  • Example: For the data points 1, 2, and 6: Mean: 3 (= 9 / 3); Median: 2; Range: 1-6; Variance: 7
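If you want to verify that arithmetic, here is a quick check using Python’s built-in statistics module (note that statistics.variance() uses the sample formula with an n - 1 denominator, which is what yields 7 here):

```python
import statistics

data = [1, 2, 6]

print("Mean:    ", statistics.mean(data))       # 3 = 9 / 3
print("Median:  ", statistics.median(data))     # 2
print("Range:   ", f"{min(data)}-{max(data)}")  # 1-6
print("Variance:", statistics.variance(data))   # 7 (sample variance, n - 1 denominator)
```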

Values must be coded consistently. Generally, we assign the highest value to the best outcome (e.g., “Strongly Agree” that customer service is responsive) and then move down from there.

For simplicity, keep your scale direction consistent throughout your survey. This makes it easier for respondents to answer and for you as a researcher to conduct your analysis.

If questions use the same scale of points, you can quickly compare responses across them. For example, if a survey asks respondents to rate a series of statements from Strongly Disagree to Strongly Agree, the responses are coded from 1 (Strongly Disagree) up to 5 (Strongly Agree).

Standard scaling helps managers to quickly understand customer service ratings by simply looking at averages.

For example, once managers understand that a 5-point agreement scale is being used, they could be given the mean results for the following customer evaluation (agreement) statements:

  • I am completely satisfied with the customer service — 3.15
  • The customer service is prompt — 4.12
  • Customer service representatives are polite — 4.67
  • Customer service representatives are knowledgeable — 2.08

Since all the statements are positive and the values are scaled consistently, a higher mean reflects better results in that area. A manager can scan these means and immediately spot the 2.08 as the weak point.

We see that customer service representatives are prompt and polite, but they don’t seem to know what they’re talking about. As a result, overall satisfaction with customer service is perhaps much lower than it could be.
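As an illustrative sketch (the coded responses below are invented, not the data behind the figures above), this is all the comparison amounts to once every item is coded 1-5 in the same direction:

```python
# Hypothetical 1-5 coded responses for each statement; with consistent coding,
# the lowest mean surfaces the weak area at a glance.
responses = {
    "Completely satisfied with the customer service": [3, 4, 3, 2, 4],
    "The customer service is prompt":                 [4, 5, 4, 4, 3],
    "Representatives are polite":                     [5, 5, 4, 5, 4],
    "Representatives are knowledgeable":              [2, 1, 3, 2, 2],
}

means = {statement: sum(values) / len(values) for statement, values in responses.items()}
for statement, mean in sorted(means.items(), key=lambda item: item[1]):
    print(f"{mean:.2f}  {statement}")
```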

You can reverse scales (or word questions negatively) to encourage respondents to read more carefully.

However, if you use reversed scales or negative wording for some items, be sure to recode the scales so that all scales point in the same direction. This will allow you to quickly compare multiple areas of customer service. (You can do recodes easily in Qualtrics.)
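If you handle the data outside of Qualtrics, the recode itself is simple arithmetic; here is a minimal sketch for a 5-point scale (the item wording and values are hypothetical):

```python
SCALE_MAX = 5  # assuming a 5-point scale

def recode_reversed(value: int, scale_max: int = SCALE_MAX) -> int:
    """Flip a reverse-coded response so it points the same direction
    as the rest of the survey: 1 <-> 5, 2 <-> 4, 3 stays 3."""
    return scale_max + 1 - value

# e.g., responses to a negatively worded item such as "The wait time was too long"
reversed_item = [1, 2, 5, 4, 3]
recoded = [recode_reversed(v) for v in reversed_item]
print(recoded)  # [5, 4, 1, 2, 3]
```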

The simplest solution is just to keep all scales consistent throughout every survey.

4. Explain Why

Respondents are more likely to help you if they see something of positive value for them.

Value offerings can range from a very general altruistic appeal for their help to a very specific offer of an economic incentive. For instance, with a customer feedback survey, you can explain that feedback will help improve customer service.

Here are some quick examples:

  1. Make it specific to them: With employee evaluations, you can explain that feedback will be used to determine awards, promotions, and pay raises and will help management make organizational decisions that will affect them.
  2. Explain unexpected questions: For instance, if it’s important for you to ask toy store customers their preferred color of jeans, you might want to explain why that is relevant.
  3. Justify requests for sensitive information: For instance, you can explain that purchasing habits will only be analyzed in aggregate for benchmarking purposes or that results will not be shared outside your organization.

So what do you think? What are some tips you’d give to fellow researchers? Leave a comment below and let’s discuss.

This post is part of the Online Surveys 101 series put together by Scott Smith Ph.D. Click here to see more.