Are You Ready? Making Sure Online Survey Answer Options Are Complete

When designing online surveys, it is extremely important to ensure that the answer options we offer make sense. A common mistake is providing insufficient answer options. Let’s go through a few scenarios to demonstrate this pitfall.


Example 1:
Consider the question, “How many books do you typically read per month?” with the answer options “one”, “two”, “three to five”, and “over five”. Are those answer options complete? No: there is no option for “none.” And for an avid reader, “over five” is not an accurate gauge. The results might show a large share of respondents picking “over five”, leaving you no idea how much “more” that really is. It would be more useful to add categories at the top of the range, such as “six to nine” and “ten or more”.

Example 2:
Continuing with the book theme: “Which of the following is your favorite genre of book to read?” with the answer options “mystery”, “science fiction”, “romance”, and “horror”. Is this complete? No, many genres are missing, such as non-fiction, short stories, and poetry. It is important to remember that what seems like a complete list to you may not be complete to the people in your survey population.

Example 3:
Imagine we are in the CRM software business and our clients are IT managers. We may want to ask, “Which features of our CRM software do you feel need improvement?” and offer the options “Feature A”, “Feature B”, and “Feature C”. Only providing three features is likely not a complete list, especially for a product that is complex and multi-featured. Even if you have hypothesized that there are only three features they will want to improve, you should always at least offer an “other, please specify” option to catch anything else. Otherwise, you force people to select options that are not true, which affects the accuracy of your results. Limiting the options causes the respondent to think, “Oh, I have to pick something? Well, I’ll pick this one. It’s not really true, but it’s truer than the other ones.”

Faulty Assumptions

Another key aspect of scripting great answer options for our online surveys is avoiding assumptions. I recently saw a project where a survey question read, “Which of the following computer brands do you have in your home?”, with the answer options “Apple”, “Dell”, “Toshiba”, “Gateway”, “Hewlett Packard”, “Sony”, and “Other, please specify”. The problem is that it assumes everybody has a computer in their home. This is not universally true. So the option “None, I do not have a computer in my home” needs to be provided for the answer options to be complete.

This brings up another interesting point. Could respondents have given that answer in the “other, please specify” option? Well, yes, but that would create a bit of a nightmare for you during data analysis, because “other, please specify” would then include a mix of brands and “does not apply” responses. It is best to avoid that.
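To illustrate the analysis headache, here is a minimal sketch with entirely hypothetical response data: when non-ownership answers land in the free-text “other” field, every entry has to be inspected and recoded before brand counts are trustworthy, whereas a dedicated “None” option makes tallying trivial.

```python
# Hypothetical survey responses, for illustration only.
from collections import Counter

responses = [
    "Apple", "Dell", "Other: Lenovo", "Other: no computer",
    "Sony", "Other: don't own one", "Apple",
]

# Without a dedicated "None" option, every "Other" entry needs manual
# inspection: some are real brands, some mean "does not apply".
other_entries = [r for r in responses if r.startswith("Other:")]
print(other_entries)  # mix of a brand and two non-ownership answers

# With an explicit "None, I do not have a computer in my home" option,
# the tally is clean in one step:
clean = ["Apple", "Dell", "Other: Lenovo", "None", "Sony", "None", "Apple"]
print(Counter(clean))
```

The point of the sketch is simply that recoding free-text answers is manual work that a complete answer list avoids.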

Staying up to date on demographic trends is crucial as well. Some things apply these days that just wouldn’t have in the past. For example, I recently spoke to somebody who was doing research in the U.K. and found the feedback on their initial survey design was that their gender question was missing a category. Gender? Isn’t it male and female? Well, to be politically correct for this company’s target market, they needed to include a transgender option. Things change, so your old standard answer options need to as well.

Quick Tips

Easy ways to ensure your answer options are complete before proceeding:

1. Ask a few friends or colleagues to take the survey. Find out whether they were able to answer the questions honestly, or whether they sometimes selected an option because it was the best available, even if it was not completely accurate.

2. When necessary, add “not applicable” or “other, please specify”. They are great safety nets.

ABOUT THE AUTHOR: Kathryn Korostoff
Kathryn Korostoff teaches Market Research best practices at AYTM.com for Ask Your Target Market, and is the president of Research Rockstar, delivering Market Research training and support services. She can be reached at KKorostoff AT ResearchRockstar DOT com.
  • Dan Wardle

    Hi Kathryn,

    Good advice – I think quick tip #1 is a must for every survey.

    Finding a solution to “faulty assumptions” … we often use an auto-fill text box for this sort of question. Behind the scenes, we still define our list of answers, but the actual question will accept any answer that is entered. For example:

    What university did you attend?
    [text]

    I went to Nottingham, so when I start typing “Not” the form identifies this and presents “University of Nottingham” and “Nottingham Trent University”, which I can click so my answer is consistent with the majority of other participants (I could ignore this and still enter whatever I like). It’s not perfect, but it does make 90–95% of the answers consistent and a lot more manageable.
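    The matching behaviour described above can be sketched roughly like this (the university list and function name are my own illustrative choices, not any particular survey tool’s API): suggestions come from a predefined answer list, but any free text is still accepted when nothing matches.

    ```python
    # Hypothetical predefined answer list for the auto-fill box.
    UNIVERSITIES = [
        "University of Nottingham",
        "Nottingham Trent University",
        "University of Oxford",
    ]

    def suggest(typed, options=UNIVERSITIES):
        """Return predefined options containing what the respondent typed."""
        t = typed.lower()
        return [o for o in options if t in o.lower()]

    print(suggest("Not"))  # both Nottingham universities match
    print(suggest("Oxf"))  # ["University of Oxford"]
    print(suggest("zzz"))  # [] -> the respondent's free text stands as entered
    ```

    The case-insensitive substring match is one simple way to get the behaviour; real tools may rank or limit suggestions differently.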