
HOW TO

When you have your heart set on quantitative research, it can feel like a bit of an uphill battle. And once you have your survey ready, the most important factor in the success of the research is getting the right people to fill out your questionnaire, as well as keeping them engaged.

The challenge? There are no quick fixes to maximise survey engagement. There are, however, multiple ways to improve surveys depending on your audience and the type of survey you are running. That’s what we’re covering in this blog.

You can scroll through this mammoth piece or use the links below to quickly access best practice in specific areas that you want to learn more about:

+ Question essentials

+ Survey structure

+ A typical survey

+ Getting quality results

+ Giving the right information

+ Survey retention

+ Shortening questionnaires

+ Device compatibility

+ Participant understanding

+ Question types

+ Top testing tips

+ Gamification in surveys

Surveys aren’t considered a “sexy” type of research, but we use the results of unmoderated research all the time because it is so effective at quickly getting the right information from the right people. Let’s get going!

Types of questions in every survey

We’ll kick things off with the types of questions typically asked in a survey. There are four types:

1️⃣ Demographic information | Who is the participant?
Here, we split people by information that we want to filter down on later. Common starting questions are age, gender, household income, household type and job industry.

2️⃣ Behavioural questions | What are their behaviours?
These questions focus on tangible previous experiences, such as product usage, purchase history, healthcare experiences and definite future plans (and the decisions taken to make this purchase).

3️⃣ Attitudes and opinions | How do they feel?
These questions typically focus on thinking and feelings, attitudes and awareness of information.

4️⃣ Previous assessments | What is their assessment?
These questions go a little deeper than behavioural questions and cover both behaviour and attitudes. Participants are typically asked further questions about experiences, loyalty and reactions.

These question types alone do not directly affect response rates, but participants do expect them to appear in a specific order, and getting that order wrong can. From our experience, the obvious rule is that demographics come first in a survey.

Just like we (should) warm up before exercise, participants need to be introduced to questions to get them used to the survey format. Demographic questions help give participants the right blend of an introduction to the survey and an understanding of its layout and instructions.

In terms of the remaining order, this depends on your objectives. If you want to learn more about how to order the most important questions in a survey, we’ll touch on this subject a little further on.

How to structure a survey

Structure is the key to designing surveys. The creative process starts with a plan and a structure, and to ensure direction and proper “flow” in a survey, there are some top considerations.

1️⃣ Prioritise information against objectives
By all means, communicate your objectives internally, but if the information isn’t essential to the participant, there is no point clouding their minds with something they do not need to know. Front-load the survey with this kind of background and participants will be bored before they even reach the first question!

2️⃣ Treat it like a structured conversation
Conversational language works well in surveys. The less someone has to think about the language used when completing a questionnaire, the lower your drop-out rate will be.

3️⃣ Minimise questions, wording and information
If you think a question is too long, it probably is. There is no definite sweet spot with question length, but that doesn’t mean it isn’t important. Proofread content and cut out unnecessary words.

4️⃣ Be aware of order effect
There is bias you can add without noticing just by structuring your survey incorrectly. Order effect is where previous questions reveal to the participant what you are looking to discover. Structure and question order play a big part in combating this type of bias.

What does a typical survey look like?

There is no such thing as a typical survey, although there are similarities and features that define the different types of unmoderated research, and that’s what we’re going to identify. The list below doesn’t cover every question you are likely to have, but this is how user researchers tend to build surveys.

1️⃣ Screening questions
Starting at… the start. Make sure people fit your criteria and anticipate what filters would be useful (much like demographic questions in the previous point).

2️⃣ Preliminary questions
Initially categorising past behaviours relating to the survey.

3️⃣ Key questions
The sweet spot! Recommendations, previous purchase history or advice.

4️⃣ Usage repertoire
Detailed questions off the back of previous answers, expanding on key questions or preliminary questions.

5️⃣ Purchase decision behaviours
This is a bit more subjective: what do participants think is important to them when making purchase decisions?

6️⃣ Brand perceptions
These are questions about the things that matter to people. For example, “why do you go to that coffee shop?”.

7️⃣ Attitudes towards specific topics
“What do you think about the company’s environmental policy?” is an example of a question to fit here.

Maximise the quality of the data you get out

With unmoderated research, the most important element is the data you get out. We have spoken before about the ways you can maximise the data by minimising biases in your surveys.

There are still big questions about how to get the best data out of a survey, and avoiding some common pitfalls creates real opportunities to improve results.

1️⃣ Put the most important questions at the start
The golden zone of surveys is the first 3-5 minutes, so put the most important questions at the start. You read that right: put the questions that matter the most as close to the start of your survey as possible.

2️⃣ Keep things short, keep things simple
One of the ways to ensure the data you get out of a survey is as accurate and close to objectives as possible is to limit the amount of information given to participants throughout the survey. Every time a participant gets a new piece of information in a survey their mindset changes.

3️⃣ Start with broad questions, then get more specific
Lead people through the research with generic questions before moving on to more specific probing. This means participants will only answer questions relevant to their previous experiences and are more likely to stay engaged throughout the survey. You can achieve this with conditional logic or by disqualifying participants who are not relevant to the research.
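In practice, the broad-to-specific funnel is usually built with the kind of conditional logic most survey tools offer. As a minimal sketch of the idea (the question names and routing function below are hypothetical, not tied to any particular survey platform):

```python
from typing import Optional

def next_question(answers: dict) -> Optional[str]:
    """Route participants from a broad opener to specific probes.

    Hypothetical routing: only participants who say they drink coffee
    see the specific follow-up; everyone else skips straight ahead.
    """
    if "drinks_coffee" not in answers:
        return "drinks_coffee"        # broad opening question
    if answers["drinks_coffee"] == "no":
        return "final_feedback"       # skip the irrelevant probes
    if "coffee_frequency" not in answers:
        return "coffee_frequency"     # specific follow-up
    return None                       # no more questions to show
```

A participant who answers “no” to the opener jumps straight to the closing question and never sees the coffee-specific probe, so nobody wastes time on questions that don’t apply to them.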

Information, context and the effect that it has

We’ve already covered giving individuals the essential information they need at an important time, but now we’re going to expand on it. Here are some of the ways that giving the right information can maximise survey engagement.

1️⃣ Beware the context you give participants at the start
Not only are you overloading participants with information at this point, but you are also preloading the participants’ brains with unconscious bias. By all means, give people enough information to get them engaged and interested, but telling them exactly what you want to discover from the research is bad practice.

2️⃣ Being vague can help
As long as you dig deeper at some point, you can afford to be relatively vague at the beginning, then follow up with open-ended questions.

3️⃣ Don’t name your company
Revealing who is running the research can prime participants and skew their answers. Maximising survey engagement isn’t just about getting people to your survey; it’s more important to get the right results at the end of the research. Still, there are times when you can bypass this rule, as it is by no means a law.

4️⃣ Think beyond your limited options
“Do you like surveys? Yes or no?” – what may seem like a simple question requires more context in a lot of cases. To answer the question posed above, surveys are great for some things, but terrible for others. Giving participants the right context is key to maximising survey engagement, speeding up completion times and getting clearer results.

Keeping people in the survey

Getting participants to your survey is one problem, getting them to stay until the end is another. If your survey completion rate is lower than 95%, there are problems. Here are some of the ways surveys could be improved to combat poor completion rates.

1️⃣ Keep it short
Questions, intros and answers – give the information you need, lose what you don’t need.

2️⃣ Think like a participant
Some key questions the participants are likely to ask include why they should complete your survey or what the task means for them. It’s important to make it clear that upon completion, the participants receive an incentive or are entered into a prize draw (if this is the case with your survey), but also remind them this is their chance to share feedback, that they are being heard and their opinion is essential.

3️⃣ Visually appealing
Design and aesthetics are important in surveys, but keep things simple. The purpose of good design is to make the survey easy to understand and, more importantly, easy to complete. An important note here: just because something is visually appealing does not mean it is accessible.

4️⃣ Interactive response techniques
Give opportunities for participants to take part regardless of the device they are using, upload photos and videos (if they are comfortable doing so) and edit existing content to remove errors. Little things that increase interactivity can be a positive in surveys (as long as the task remains inclusive and accessible).
 
5️⃣ Engaging questions and techniques
In the last few blogs I have not stopped talking about the need for objective truths, but there is also room for subjectivity and engaging questions about users’ experiences.

How can we make questionnaires shorter?

Yes, we’re on to the shortening questionnaires round! If you have ever worked on a survey with 70+ questions that takes participants over 20 minutes to complete, you’ll know that shortening a survey can be a challenging task.

1️⃣ Can we use conditional logic?
Logic might not make the questionnaire itself shorter, but it makes sure that specific questions are only shown to the relevant participants. Rather than telling participants they can skip certain sections, do this automatically with logic.

2️⃣ Is there overlap on questions and how can you combine them?
Questions might be answered twice in similar ways. In an interview, or in qualitative research more broadly, this repetition can be used to explore an idea further, but in quantitative research we don’t have the luxury of asking the same things multiple times.

3️⃣ Surveys are not the place for ‘nice to have’ questions
Are there questions you keep adding to your unmoderated research tasks that you end up never using in tracking or repeat studies? Do you actually analyse this data? Does it tell you anything relevant? The conclusion is clear: if you’re not going to use it, don’t ask about it.

4️⃣ Can the information be sourced from somewhere else?
If the information you are getting from a quantitative survey can be obtained in a different way, then you don’t need to ask it. For example, if this information has already been collected via a pre-qualifying screener that can be matched against this new data (or is available from another source), avoid asking the questions and save this precious resource for information you don’t already have.

5️⃣ Chunking
Chunking or segmenting is essentially giving participants multiple logic paths that are followed based on the responses they give at each point. For example, you may have questions relating to beauty products that are tailored to participants based on their previous responses. You can “chunk” these participants into groups and only ask them questions about their product preferences.
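The beauty-products example above can be sketched as a simple lookup: participants are grouped by an early answer and only ever receive their group’s question block. (The chunk names and questions below are hypothetical, purely to illustrate the idea.)

```python
# Hypothetical chunked question blocks, keyed by an early screening answer.
CHUNKS = {
    "skincare": ["skincare_routine", "skincare_brands"],
    "haircare": ["haircare_routine", "haircare_brands"],
    "makeup":   ["makeup_routine", "makeup_brands"],
}

def questions_for(product_interest: str) -> list:
    """Return only the question block relevant to this participant."""
    return CHUNKS.get(product_interest, [])
```

Someone who says they are interested in haircare sees only the haircare block, and an answer outside the expected groups simply produces no extra questions.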

Device agnostic surveys

Surveys, card sorts and tree tests should be able to be completed on a range of devices. A lot of promotion for unmoderated research will be done via social media, email and advertising; all of which are accessible on a multitude of devices, with a focus on smartphones. If a survey isn’t optimised for mobile, response rates will decrease dramatically.

Keep an eye out for:

1️⃣ The length of the intro text
If this looks long on desktop, it’s going to look even longer on mobile.

2️⃣ Device orientation
Be aware of question types like sliders – you might have to sacrifice these non-mobile-friendly questions for options that work better on smaller screens.

3️⃣ Scrolling
On mobiles, users are likely to scroll more between questions and answers.

4️⃣ Matrix grids
This complex-looking question type is a great way to combine a lot of data into one question, but not great to display on mobile. Be sure to check these before publishing the research.

Participant understanding

When it comes to unmoderated research, your participants must be able to understand it, willing to answer it and able to submit it. Meet those three conditions and you’re golden, but there are additional parameters to each of them.

How can we do that? Look at improving the following:

1️⃣ Imprecise answers
Giving options like ‘frequently’ or ‘occasionally’ is meaningless, as these words mean different things to different people. Ensure answers are quantifiable, like ‘once a week’.

2️⃣ Unfamiliar wording
You will not have the same experience as your audience, so using internal language and abbreviations may confuse participants.

3️⃣ Double meaning
Don’t ask Scottish people ‘how much squash did you consume in the last week?’. For most Scottish people, squash is a sport and not their beloved diluting juice. Think about the participants’ language and remember that, depending on your audience, words might have a double meaning.

4️⃣ Avoid disagreements with negatives
This point refers to opposite ends of your scales. For example, if you are asking how people feel about something, don’t provide options like ‘happy’ and ‘not happy’, as they are not antonyms. A term like ‘unhappy’ will make communication clearer.

5️⃣ Leading questions
They’re bad, right? That’s because they plant an idea in the participant’s head as to what is expected from them. There are some examples of leading questions in our previous blog post about bias in quantitative research.

6️⃣ Ask one question at a time
Questions like “was the service you received polite and efficient?” might have two different answers. They are not the same things and should be split into separate questions.

7️⃣ Appropriate time frame to recall accurately
We go over subjectivity vs. objectivity in questions quite often in our content and advice, but giving participants actual timeframes is key to getting reliable data. Quantifiable responses require more thought and generally produce better answers.

8️⃣ Awkward generalisations
Sweeping generalisations are a lot like biases and may compromise your unmoderated research results.

9️⃣ Avoid maths and approximations with open questions
Dyscalculia and dyspraxia are real problems that a lot of people around the world deal with in different degrees, and you don’t want to exclude people with this type of impairment from your research. Also, as a general rule, a participant shouldn’t need to do calculations or go through complex thought processes to complete your survey.

Vary question styles

Keeping questions fresh is the best way to keep people in the survey and to maximise engagement, so here are the top two things to look out for.

1️⃣ Beware of over-using question types
Avoid, for example, constantly using radio button scales from ‘A lot’ to ‘Not at all’ – the participants will get bored! Even varying the options or splitting questions with open text boxes or sorting exercises can reduce survey fatigue.

2️⃣ Try a different unmoderated research task type
If surveys are getting a bit too complex, maybe what you need is to swap your questionnaire for a card sort or a tree test. These unmoderated tasks can be the solution you are looking for. Likewise, if you’re running a series, video, audio or visual styles can help with your response rates.

Test different features across multiple devices

We have previously touched on mobile responsivity being important to survey responses. Here is the checklist of what to look out for when designing surveys or unmoderated tasks on a mobile:

1️⃣ Interactive response techniques
Signatures, circling diagrams and spot-the-difference exercises are great ways to vary response techniques and maximise survey engagement.

2️⃣ Single choice, click on the brand, check the responsivity and accessibility
Using photos and images may make brands easier to recognise. However, these images sometimes stack in unexpected ways on mobile. Testing this is a must before sending the survey to your audience.

However, if your participants are not familiar with relevant images used in your survey, this can cause frustration and skew the data. So, when should you use images in surveys? Here are three scenarios where the right image or video will help:

+ To give visual instructions: for example, a short video showing people how to upload a photo from their device to the survey.
+ To generate a connection: using your company’s logo will help with your engagement levels, especially if you are reaching out to your customers.
+ To evaluate a visual product: if you are looking for quick feedback on a design such as a logo or new packaging, providing images to the participants is essential.

3️⃣ Non-directional graphics with icons
Iconography and emojis can suffer from the same problem when used as part of your questions or options: graphics may not scale properly from laptop to mobile.

Gamifying surveys

One of the elements clients frequently ask us about is gamification as a tool to increase engagement, but what does it actually mean?

Yes, gamifying surveys makes participants care more about your research, but it’s not always easy to implement. One of the simplest ways of doing this is with a progress bar, allowing participants to see how far they have come. By breaking the progress bar into sections, participants will give more accurate answers and are less likely to rush to complete your survey.

If you have the option to add interactive text to your progress bar such as ‘only three more questions to get to your incentive’, go for it, as it reminds the participant of their end goal and the reward on the other side of the submit button.

Alternatively, giving your audience a suggested time, such as saying ‘look at this design for three minutes’ or ‘this section typically takes five minutes’, may mean participants stick with the task longer.
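Sectioned progress and an end-of-survey countdown can be combined in one small helper. This is a rough sketch of the idea; the thresholds, wording and four-section split are assumptions for illustration, not rules:

```python
def progress_message(answered: int, total: int, sections: int = 4) -> str:
    """Sectioned progress bar text: report the current section rather
    than a raw question count, and switch to a countdown near the end
    to remind participants the reward is close."""
    remaining = total - answered
    if remaining <= 3:                  # assumed countdown threshold
        return f"Only {remaining} more questions to go!"
    fraction = answered / total
    section = min(int(fraction * sections) + 1, sections)
    return f"Section {section} of {sections} ({fraction:.0%} complete)"

print(progress_message(2, 20))   # Section 1 of 4 (10% complete)
print(progress_message(18, 20))  # Only 2 more questions to go!
```

Showing “Section 1 of 4” rather than “question 2 of 20” frames the survey as a few short stages instead of one long slog, which is the whole point of breaking the bar into sections.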

One of the easiest ways to bring gamification into surveys is by setting a scenario for the participant. There are a few ways of doing this:

1️⃣ Decision-based scenarios
Create a scenario in which they need to make a decision and see which path they choose. An example of this would be ‘If you had to choose a free packet of crisps, which one would you choose and why?’.

2️⃣ Projective scenarios
Get another perspective from participants regarding potential future decisions. An example here would be ‘If you had to choose a book to turn into a TV series, which book would you choose?’. This will generate an open response and requires thought from the participant.

3️⃣ Make it competitive
If you can, and only once the participant has already answered a question, share with your audience how other people feel about that topic. This allows participants to see where they sit compared to their fellow survey participants and makes them curious to answer the next question. However, this is best for non-controversial projects and should only be done when asking about non-sensitive information.

What tips do you have for survey engagement? We’d love to hear from you about your experiences from getting participants to your survey to analysing the data.


Jason Stockwell, Digital Insight Lead

If you would like to find out more about our in-house participant recruitment service for user research or usability testing get in touch on 0117 921 0008 or info@peopleforresearch.co.uk.

At People for Research, we recruit participants for UX and usability testing and market research. We work with award-winning UX agencies across the UK and partner up with many end clients who are leading the way with in-house user experience and insight.