In this blog post, I’m writing about survey design. In particular, I’m focusing on longer surveys we usually conduct when we try to:
- understand user attitudes towards the brand,
- stay aware of competitors,
- evaluate new products,
- understand customers better.
By reading this post you will learn:
- how to form questions,
- how many surveys you should do,
- what the common mistakes are when designing the survey,
- what the specifics are when you are conducting an in-house survey,
- cognitive biases you as a designer might have,
- and cognitive biases users might have.
A survey targets a sample, a subset or cross-section of the general population, selected according to the demographic criteria you are looking for.
Usually, we use both qualitative and quantitative questions. An example of a quantitative question is: »How often do you use our service?« Several options, like once per week, are offered. This question segments the users. With the next question we want to gather real user feedback, so we need a qualitative, open-ended question like: »Which service do you like most and why?« With a qualitative question, you want to capture users’ feedback, the real thoughts they have on the matter, and also the language they use.
How to form questions for a survey
Forming questions is a crucial part of survey design. When you are conducting a survey, you must have a very clear idea of what question you want to have answered. What specific actions are you going to take based on the answer? How will the results influence your campaigns? A well-written survey question is:
- not asked in a leading way,
- directly addressing the desired info.
*Most survey questions should be closed-ended. Use only a few (one or two) open-ended questions to gather feedback or customer opinions on the most important topic or aspect you are exploring.
How many surveys should you do?
This definitely depends on the target audience. Sometimes you are dealing with a very specific audience and you can’t even get one hundred people to take the survey. In this case, go with what you’ve got: any answers are better than no answers. But in general, we are huge believers in the power of numbers. The more answers you can collect, the more accurate results you will get in the end.
A rule of thumb is that you need at least two hundred answers before you start analyzing them and drawing any conclusions.
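One way to ground this rule of thumb (not from the post itself, but from the standard margin-of-error formula for a proportion) is to look at how precision improves with sample size. At roughly 95% confidence, two hundred responses pin an answer’s share down to about ±7 percentage points in the worst case; a quick Python sketch:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a proportion at ~95% confidence.

    p=0.5 is the most pessimistic split; z=1.96 is the 95% z-score.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Doubling responses shrinks the error noticeably:
for n in (50, 100, 200, 400):
    print(f"n={n}: about ±{margin_of_error(n):.1%}")
```

With 50 responses the worst-case error is close to ±14%, while 200 responses bring it down to roughly ±7%, which is why a couple of hundred answers is a sensible floor before drawing conclusions.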
Common mistakes when designing a survey
Scaling error. If you ask people to rate something from one to five, one should be the most negative answer and five the most positive, not the other way around. We are used to this scaling system, and if you turn it around, you create unnecessary confusion. Lots of people won’t read the instructions carefully enough, and that might skew the end result.
Questions that don’t communicate. These are questions that don’t relate to the target audience, often because companies use internal jargon or field-specific vocabulary and terms users simply don’t understand or relate to.
Overly long surveys. Marketers often have focus issues: they want to ask everything in one survey. But nobody likes taking long surveys, and if you ask too many questions, you get the so-called error of central tendency. For example, if you have several questions on a 5-point scale, from a certain question onward all answers average out to three. When the answers start closing in around the midpoint of the scale, you have lost your audience, and all answers from that point on are useless.
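This fatigue pattern is easy to screen for when analyzing results. A minimal, hypothetical sketch (the function name and threshold are illustrative, not from the post) that flags a respondent whose later 5-point answers collapse toward the midpoint:

```python
def shows_central_tendency(answers, midpoint=3, threshold=0.8):
    """Flag a respondent whose second-half answers pile up at the midpoint.

    `answers` is this respondent's ratings in survey order on a 5-point scale.
    Returns True when at least `threshold` of the later answers equal `midpoint`.
    """
    half = len(answers) // 2
    later = answers[half:]
    at_midpoint = sum(1 for a in later if a == midpoint)
    return at_midpoint / len(later) >= threshold

engaged = [5, 4, 2, 5, 1, 4, 3, 5]   # varied answers throughout
fatigued = [5, 4, 2, 1, 3, 3, 3, 3]  # straight threes in the second half
print(shows_central_tendency(engaged))   # False
print(shows_central_tendency(fatigued))  # True
```

Answers from flagged respondents past the point where they settled on the midpoint are the ones the post suggests treating as useless.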
Not keeping a neutral learning curve. The average survey respondent is learning from the survey. People answering surveys want to help you; often they are also paid to do so. While answering questions, they are developing an idea of what you are trying to achieve with the survey, and they become biased in favour of that idea when answering your questions.
Conducting an in-house survey
If you are doing a survey with your existing customer base, acknowledge that these are people who, at least once, committed to some kind of relationship with you.
Existing customers will always be biased towards your company, positively or negatively, depending on the experiences they have had with your product or service.
Keep it short. You don’t want to annoy your customers. A survey that really works is ten to fifteen questions long.
Don’t assume product knowledge. Do not assume your customers know you just because they once bought something from you. How much do you know about the brand that made your toaster? It is good practice to give a short introduction about who you are and what you do.
Cognitive Biases in Survey Design
Reading the room. This happens when the researcher tells the client what they believe the client wants to hear, for example in the introduction, and by doing so puts the whole survey in the wrong context.
Order bias. Items listed higher in the survey receive better scores. If you have a question where the user has to choose one out of twenty options, the options closer to the top will be chosen more often. You can easily counter that bias if you choose a provider that lets you randomize the order of the options.
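The post doesn’t prescribe a tool, but the randomization itself is a one-liner in most languages. A small Python sketch (option labels and the function name are illustrative) that gives each respondent their own shuffled order while leaving the master list intact:

```python
import random

# Twenty options, as in the example above; labels are hypothetical.
options = [f"Brand {i}" for i in range(1, 21)]

def randomized_options(options):
    """Return a per-respondent shuffled copy; the master list is untouched."""
    shuffled = options[:]          # copy so the canonical order survives
    random.shuffle(shuffled)       # in-place Fisher-Yates shuffle of the copy
    return shuffled

# Each respondent sees the same twenty options in a different order.
per_respondent = randomized_options(options)
```

Shuffling per respondent spreads the top-of-list advantage evenly across all options, so it averages out of the aggregate results.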
Cognitive Biases on the User Side
It is not just the survey architect who can become biased; users can be biased too.
Use simple language. Avoid technical and statistical jargon. Explain everything like you would to a fifteen-year-old. You don’t want to put users in a position where they feel they don’t have enough knowledge to answer.
Brief customers ahead of time. Brief customers before conducting the survey: tell them why you are doing it, why their answers are important to you, and let them know about the biases they might have.
Selective perception. Customers will pay more attention to what they already agree with.