Written by Scott Smith, Ph.D. January 14, 2013
Details, details, details.
Creating surveys that yield actionable insights is about details.
And writing effective questions is the first step.
We regularly see the same mistakes keeping survey questions from being effective.
Here are the 7 most common.
Subtle wording differences can produce great differences in results. “Could,” “should,” and “might” all sound about the same, but they may produce a 20% difference in agreement with a question.
In addition, strong words such as “force” and “prohibit” represent control or action and can bias your results.
The government should force you to pay higher taxes.
No one likes to be forced, and no one likes higher taxes. This agreement scale question makes it sound doubly bad to raise taxes.
Wording alternatives can be developed. Consider simpler statements such as “The government should increase taxes” or “The government needs to increase taxes.”
How would you rate the career of legendary outfielder Joe DiMaggio?
This question tells the respondent that Joe DiMaggio is a legendary outfielder. This type of wording can bias respondents.
How about replacing the word “legendary” with “baseball,” as in: How would you rate the career of baseball outfielder Joe DiMaggio?
Multiple choice response options should be mutually exclusive so that respondents can make clear choices. Don’t create ambiguity for respondents.
Review your survey and identify ways respondents could get stuck with either too many or no correct answers.
What is your age?
0–10
10–20
20–30
30 or older
What answer would you select if you were 10, 20, or 30? Overlapping ranges like these will frustrate respondents and invalidate your results.
What type of vehicle do you own?
A multiple-choice version of this question has the same problem unless its options are exhaustive. What if the respondent owns a truck, hybrid, convertible, crossover, or motorcycle, or no vehicle at all?
Questions that are vague and do not communicate your intent can limit the usefulness of your results. Make sure respondents know what you’re asking.
What suggestions do you have for improving Tom’s Tomato Juice?
This question may be intended to obtain suggestions about improving taste, but respondents will also offer suggestions about texture, packaging, mixing juices, or even using tomato juice as a mixer or in recipes.
What do you like to do for fun?
Finding out that respondents like to play Scrabble isn’t what the researcher is looking for, but it may be the response received. If the researcher means movies or other forms of paid entertainment, the question doesn’t say so, and respondents can take it in many directions.
Sometimes respondents may not want or be able to provide the information requested.
Questions about income, occupation, finances, family life, personal hygiene, and personal, political, or religious beliefs can be too intrusive and be rejected by the respondent.
Privacy is an important issue to most people. Incentives and assurances of confidentiality can make it easier to obtain private information.
While current research does not show that PNA (Prefer Not to Answer) options increase data quality or response rates, many respondents appreciate this non-disclosure option.
Furthermore, different cultural groups may respond differently. One recent study found that while U.S. respondents skip sensitive questions, Asian respondents often discontinue the survey entirely.
What is your race?
What is your age?
Did you vote in the last election?
What are your religious beliefs?
What are your political beliefs?
What is your annual household income?
These questions should be asked only when absolutely necessary, and they should always include an option to not answer (e.g., “Prefer Not to Answer”).
Do you have all of the options covered? If you are unsure, conduct a pretest using “Other (please specify)” as an option.
If more than 10% of respondents (in a pretest or otherwise) select “other,” you are probably missing an answer. Review the “Other” text your test respondents have provided and add the most frequently mentioned new options to the list.
You indicated that you eat at Joe’s fast food once every 3 months. Why don’t you eat at Joe’s more often?
There isn’t a location near my house
I don’t like the taste of the food
Never heard of it
This question doesn’t include other likely options, such as the healthiness of the food, price/value, or an “Other (please specify)” choice. More than 10% of respondents would probably have a problem answering it.
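The 10% pretest rule described above is easy to automate. Here is a minimal sketch in Python; the response wordings and counts are made up for illustration, not taken from any real pretest:

```python
from collections import Counter

# Hypothetical pretest responses to one multiple-choice question.
responses = [
    "No location near my house", "Don't like the taste", "Too expensive",
    "Other", "Never heard of it", "Other", "Don't like the taste",
    "Other", "Too expensive", "No location near my house",
]

counts = Counter(responses)
other_share = counts["Other"] / len(responses)

# Flag the question if more than 10% of pretest respondents chose "Other".
if other_share > 0.10:
    print(f"{other_share:.0%} chose 'Other' -- review their write-ins "
          "and add the most frequent answers to the list.")
```

Here 3 of 10 respondents chose “Other” (30%), so the question would be flagged for revision.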
Unbalanced scales may be appropriate for some situations and promote bias in others.
For instance, a hospital might use an Excellent – Very Good – Good – Fair scale where “Fair” is the lowest customer satisfaction point because they believe “Fair” is absolutely unacceptable and requires correction.
The key is to correctly interpret the scale. If “Fair” is the lowest point on a scale, then a result slightly better than fair is probably not a good one.
Additionally, scale points should be equi-distant; that is, each point should be the same conceptual distance from the next.
For example, researchers have shown the points to be nearly equi-distant on the strongly disagree–disagree–neutral–agree–strongly agree scale.
Set your bottom point as the worst possible situation and top point as the best possible, then evenly spread the labels for your scale points in-between.
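When it comes time to analyze results, equi-distant points are what justify coding a scale as consecutive integers and averaging them. A minimal sketch, assuming the common 1–5 coding convention for the agreement scale mentioned above (the coding is a standard practice, not something the article specifies):

```python
# The 5-point agreement scale from the text, coded as equi-distant integers.
# The 1-5 coding is a common convention assumed here for illustration.
AGREEMENT_SCALE = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def mean_agreement(answers):
    """Average the coded responses; meaningful only if points are equi-distant."""
    codes = [AGREEMENT_SCALE[a] for a in answers]
    return sum(codes) / len(codes)

print(mean_agreement(["Agree", "Neutral", "Strongly agree"]))  # 4.0
```

If the labels were not conceptually evenly spaced, the same arithmetic would produce a misleading average.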
What is your opinion of Crazy Justin’s auto-repair?
The Best Ever
This question puts the center of the scale at a fantastic rating, with “Pretty Good” as the lowest possible option. It cannot collect respondents’ true opinions.
There is often a temptation to ask multiple questions at once. This can cause problems for respondents and influence their responses.
Review each question and make sure it asks only one clear question.
What is the fastest and most economical Internet service for you?
This is really asking two questions. The fastest is often not the most economical.
How likely are you to go out for dinner and a movie this weekend?
Even though “dinner and a movie” is a common phrase, this is two questions as well. It is best to separate the activities into different questions or give respondents options that cover the combinations:
Dinner and movie
Dinner only
Movie only
Neither
While not exhaustive, these seven mistakes are the most common offenders when building quality questions.
Focus on creating clear questions and having an understandable, appropriate, and complete set of answer choices. Great questions and great answer choices lead to great research success.
What are some mistakes you’ve made that you wish you hadn’t? Comment below and let’s discuss!