The internet has made it easier than ever to conduct in-house market research, but that doesn’t mean you should.
It’s easy to understand the allure of do-it-yourself survey websites like SurveyMonkey and Google Forms. Hiring a marketing firm to conduct your research is expensive — even a simple readership survey will typically cost thousands of dollars. For small companies, it can be hard to justify those costs when the do-it-yourself alternative is essentially free.
Unfortunately, the apparent ease of do-it-yourself surveys is largely misleading. As more and more companies move their market research in-house, the quality of surveys is dropping rapidly, and the results of those surveys can end up distorted or even entirely meaningless.
Although online survey tools are easy to use, actually designing a valid survey can be very difficult. Users often struggle with important aspects of the market research process, such as finding a representative sample or preparing actionable reports. The most glaring errors of all come in the form of badly written questionnaires.
Having worked in market research for several years, I’m always curious to see new questionnaires, and I’ll gladly take a survey whenever I’m asked. It pains me to see, again and again, questionnaires whose questions contain simple wording mistakes that can lead to completely invalid results.
If you’re creating your own survey — whether it’s for in-house market research, a school project, or even just for fun — it’s essential to proofread for mistakes before you begin to collect responses. In addition to going over the questions on your own, have a few friends or colleagues take the survey to see if they catch any errors that you missed.
Below, I’ll explain and provide examples of some of the most common mistakes that I see in amateur questionnaires. This list isn’t exhaustive, but I hope it can help guide your understanding of how a poorly worded question can affect the validity of your survey’s results. The overall principle behind these examples is that a question should never force or encourage a reader to give any type of inaccurate response.
Providing overlapping answer choices
How old are you?
🔘Under 20
🔘20–30
🔘30–40
🔘40–50
🔘Over 50
Overlapping answer choices should be an easy-to-spot mistake, but I’m surprised by how regularly it appears in questionnaires. In the example above, the three answers in the middle each overlap by one year. A person who is exactly 30 or 40 would find two answer choices that both apply to them. When it comes time to tabulate the results, you’ll have no way to sort out these respondents.
A similar mistake is to leave gaps between the answer choices: for example, an answer that says “21–29” followed by one that says “31–39.” This leaves someone aged 30 with no applicable choice. Both versions of this mistake occur most frequently in the demographic section of surveys, and a careful proofread should easily catch them.
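If you’d like to automate part of that proofread, a few lines of code can check numeric brackets for both problems at once. This is only a sketch under my own assumptions: the function name is made up, brackets are written as inclusive (low, high) pairs, and answers are assumed to be whole numbers, as ages are.

```python
def find_bracket_errors(brackets):
    """Check inclusive (low, high) answer brackets for overlaps and gaps.

    Returns two lists: ranges of values that fit more than one choice,
    and ranges of values that fit no choice at all.
    """
    overlaps, gaps = [], []
    ordered = sorted(brackets)
    for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
        if lo2 <= hi1:
            # e.g. 20-30 followed by 30-40: age 30 fits both choices
            overlaps.append((lo2, hi1))
        elif lo2 > hi1 + 1:
            # e.g. 21-29 followed by 31-39: age 30 fits neither choice
            gaps.append((hi1 + 1, lo2 - 1))
    return overlaps, gaps

# The overlapping age brackets from the example above:
print(find_bracket_errors([(20, 30), (30, 40), (40, 50)]))
# → ([(30, 30), (40, 40)], [])

# The gapped brackets “21–29” and “31–39”:
print(find_bracket_errors([(21, 29), (31, 39)]))
# → ([], [(30, 30)])
```

A check like this won’t replace a human proofread, but it removes any chance of missing an off-by-one error in a long list of brackets.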
Forcing an opinion on the survey taker
Do you support issuing a new education bond?
🔘Yes, education is the most important issue facing our state.
🔘No, education is already overfunded.
Another thing to avoid is forcing opinions on your respondents. Each answer choice should be as simple as possible, and it should never bundle in additional opinions that the survey taker wouldn’t necessarily hold. In the example above, the survey forces the respondent to take an extreme stance on education. A respondent might agree with answering “yes” or “no,” but not for the reasons provided. By wording the answers in such a specific way, the survey could end up manipulating responses. Respondents also tend to dislike questions like this, so the survey’s overall response rate is likely to drop.
Not providing enough options
What type of cell phone do you use?
🔘iPhone (iOS)
🔘Android
It’s important to always provide enough options that anyone taking your survey will be able to accurately answer the question. In this example, the survey asks about the respondent’s cell phone OS, but only gives the two most popular systems as choices. To fix this, you don’t need to add every option in the world. Instead, you can simply add an “other” choice, typically accompanied by a text box for the user to fill in what they use.
Mixing up checkboxes and radio buttons
What is your annual household income?
☐ Under 20,000
☐ 20,000–100,000
☐ Over 100,000
Another common mistake in amateur surveys is to mix up checkboxes (☐) and radio buttons (🔘). Each of these symbols has a standard meaning in questionnaires: a checkbox allows users to choose multiple answers to a question, while a radio button limits users to a single answer. The example above should use radio buttons instead, because each respondent can fall into only one income bracket.
Some survey creation software allows you to use each of these symbols however you want (for example, you could use radio buttons for a check-all-that-apply question). Even if the software allows you to do this, you should still stick with the standard usage rules to avoid confusing your readers.
Misusing “required” questions
Why did you buy your most recent suit? (required)
🔘For a personal event
🔘For a work event
This final common mistake is subtler than the others. Most survey creation tools allow you to mark certain questions as “required,” meaning the software will not let a reader submit the survey until those questions are answered. Required questions should be used sparingly. Unless a question is absolutely central to your study, leave it up to your respondents whether or not to answer.
One problem with required questions is that they are likely to reduce the response rate of your surveys. If a user is unsure of an answer, they may decide to just stop taking the survey altogether. Required questions can also encourage users to randomly choose an answer just so that they can move along to the next question.
Lastly, required questions can sometimes force users into a choice that doesn’t make any sense for them. In the example above, the user is being forced to answer why they bought their most recent suit. If the respondent doesn’t own any suits, how could they possibly respond to this question accurately?
Web-based do-it-yourself survey software has led to a proliferation of bad surveys, but it doesn’t have to be that way. Not every company has the budget to hire a market research firm, but every company can find the time to carefully proofread and test its surveys. If you’re planning to conduct your own market research, make sure to think through the questions and check that you aren’t forcing nonsensical responses. A sloppily designed survey will inevitably lead to invalid results, which can do more harm than good.