
It’s that time of year again, when the heating is on full blast and we’re at the end of the Christmas cake: this is when trend articles and lists start flooding in, so we thought we should join the wave. Here is our survey trend forecast, a list of what we believe is going to change within quantitative research in 2021.

It’s fair to say that in January 2020 nobody could have predicted we would be where we are now, so don’t hold me to this. We don’t plan on reinventing radio buttons here, but here are six ways surveys are likely to change in the next few months.

🔮 More interactive surveys

With remote research being the primary type of user research happening right now, the focus for the first half of the year (at least) is to make surveys and other unmoderated research tasks more engaging. Tools enabling video questions (and potentially video answers) like VideoAsk help with this.

By recording audio or video questions, quantitative research now has a face and a level of personality to it. There are steps you need to take here to ensure your survey remains accessible: the questions should also be provided in writing and the videos should be subtitled. On the whole, though, this will make surveys more engaging.

Not every participant will be ready to record themselves responding just yet. Having done it myself, I can confirm it is quite uncomfortable to see your face staring back at you as you give a response (especially if participants feel there could be a right or wrong answer, which is a common assumption in research). So, for now, it’s best to give individuals the choice of how they want to respond: via audio recording, video or text.

Of course, this might mean that the analysis of the responses becomes more challenging for researchers, but it also provides a whole new layer of data – participants’ body language and facial expressions – that could be game-changing for some research topics.

🔮 Accessibility takes centre stage

Accessibility should never be considered a trend; the point we need to get across is the importance of accessibility in survey design and communication.

2021 is the year to put accessibility front and centre of all UX projects. With the UK government pushing to make online public services more accessible (mobile apps need to meet these regulations by June), it’s time for survey design to become fully accessible.

The three core areas of focus are the overall design of the survey, the content (including the instructions), and the development of question and conditional logic.
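To make that last area a little more concrete, here is a minimal sketch of how conditional (skip) logic could be represented declaratively – the questions, IDs and options are entirely hypothetical. Keeping the logic in one reviewable structure makes it easier to check every path a participant might take, including how each question behaves with assistive technology.

```python
# A minimal, hypothetical sketch of survey questions with conditional (skip) logic.
# The question IDs, wording and options below are made up for illustration only.

questions = [
    {"id": "q1", "text": "Do you use our mobile app?", "options": ["Yes", "No"]},
    {"id": "q2", "text": "How easy is the app to use?", "options": ["Easy", "OK", "Hard"],
     "show_if": ("q1", "Yes")},  # only shown to participants who answered "Yes" to q1
    {"id": "q3", "text": "Any other comments?", "options": None},  # free text, always shown
]

def visible_questions(answers: dict) -> list:
    """Return the questions a participant should see, given their answers so far."""
    shown = []
    for question in questions:
        condition = question.get("show_if")
        if condition is None or answers.get(condition[0]) == condition[1]:
            shown.append(question)
    return shown

# A participant who answered "No" to q1 skips q2 entirely.
print([q["id"] for q in visible_questions({"q1": "No"})])  # -> ['q1', 'q3']
```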

🔮 Investing in the right tools

I recently spoke to a UXer about how they were getting on with user research for a project, as they were trialling quantitative methods for the first time and wanted to use a tool that would be able to handle large amounts of data. The tool they chose was Qualtrics, a very comprehensive – and more expensive than average – platform.

They described the process as “using a sledgehammer to crack a nut”: a bit of a funny analogy that actually highlighted an excellent point. Selecting the right tool for the job doesn’t necessarily mean choosing the most expensive or most comprehensive option if that won’t tackle your specific challenges or help you achieve your goals. This can be especially confusing when you are not sure what insights you are looking for and what trends you are likely to see within your quantitative research.

If you are at the beginning of your quant path, it might be useful to team up with a third-party agency like People for Research. Our specialised insights team has combed through thousands and thousands of data entries as part of our work with well-known clients and we can not only find the right participants for you, but also help you with the analysis of the data.

🔮 Data analysis for long text answers

This has been a hot topic for a few years now in the user research industry, and I’m sure some user researchers have perfected methods to extract statistical data from free text answers, but it is by no means something that is being done properly across the board.

When you give your participants the option to freely comment on a survey or unmoderated task, gauging your audience’s sentiment can be a struggle. Doing this manually is incredibly time-consuming, and using an Excel formula relies on participants:

+ Using the same phrases or phrase structure
+ Spelling words correctly
+ Giving words the same meaning (for example, words like cheap, challenging, simple or interesting have very different meanings for different people)
+ Sharing common experiences and context
+ And more…

It’s a difficult choice to make. On one hand, you need to include these open-ended questions because they provide deeper and richer insights; on the other, they can make it impossible to fully analyse the data collected.
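To illustrate why formula-driven matching is so fragile, here is a tiny Python sketch – the responses are made up, and the matching is only similar in spirit to an Excel COUNTIF. Exact keyword matching misses misspellings and alternative phrasings, even when the underlying sentiment is the same.

```python
# A rough sketch of formula-style keyword matching over free-text answers.
# All responses below are invented for illustration; none are real study data.

responses = [
    "The checkout was really simple",
    "Checkout felt quite simpel",              # misspelling: missed by exact matching
    "Not hard at all, very straightforward",   # different phrasing: also missed
    "The checkout process was simple enough",
]

keyword = "simple"
matches = [r for r in responses if keyword in r.lower()]

print(f"{len(matches)} of {len(responses)} responses mention '{keyword}'")
# -> 2 of 4, even though all four express broadly the same sentiment
```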

There are a few things you can do to make this easier for yourself and your team or stakeholders:

+ Make sure your audience is consistent (similar demographics, motivations, experiences, etc.) or, if you have a mixed bag of participants, organise them into segments and collect the data separately.

+ Create response categories ahead of the analysis. For example, if your audience includes a mix of people with different levels of tech-savviness and you have asked them to comment on the ease of use of your app in an open-ended question, organise them into categories from tech novice to expert.

+ You can also go for something a bit more technical and create a coding frame. Here is a good article to help you learn more about this option.
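For anyone wondering what a coding frame can look like in practice, here is a rough Python sketch. The codes, keywords, segments and responses below are all hypothetical, and in a real project the frame would be agreed and refined by the research team rather than hard-coded like this.

```python
from collections import Counter

# Hypothetical coding frame: each code is defined ahead of the analysis,
# with a handful of indicative keywords agreed by the research team.
coding_frame = {
    "ease_of_use": ["easy", "simple", "straightforward", "intuitive"],
    "performance": ["slow", "fast", "lag", "crash"],
    "trust":       ["secure", "privacy", "safe", "scam"],
}

def code_response(text: str) -> list:
    """Return every code whose keywords appear in a free-text response."""
    text = text.lower()
    return [code for code, keywords in coding_frame.items()
            if any(keyword in text for keyword in keywords)]

# Made-up responses, tagged with a participant segment (see the first bullet above).
responses = [
    ("tech novice", "It was simple to set up but felt a bit slow"),
    ("tech expert", "Fast and intuitive, though I worry about privacy"),
]

tallies = Counter()
for segment, text in responses:
    for code in code_response(text):
        tallies[(segment, code)] += 1

print(tallies)  # counts per (segment, code) pair, ready to compare across segments
```

The value here is less in the code and more in the discipline: the categories are fixed before the analysis starts, so every response is judged against the same frame and the counts can be compared across segments.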

🔮 Face-to-face will be back, but surveys are here to stay

Sure, face-to-face is great, but have you tried surveys?

I’m drawn to the sound of a radio button and the smell of a spreadsheet, but I understand a lot of people don’t share this feeling. Surveys have had a bad reputation for a long time, mainly because they have been widely misused and their data poorly analysed.

With more and more researchers starting to see the value of properly planned unmoderated research and high-volume surveys, it’s important to remember that quantitative research can be extremely useful if done at the right time in your product’s life and with the right people (both partners and participants).

🔮 A split in the survey market

When deciding to run a survey, you have three options:

👉 The DIY method
Self-explanatory really. Do it yourself: build the survey, recruit the participants and analyse the data.

👉 Self-serve platforms
Submit your questions to a self-serve database, where participants are rewarded with points or a prize draw.

👉 Working with a partner
You decide to work with a partner like People for Research – one that can provide assistance with set-up, full user recruitment services, and GDPR and admin support – and remain fully in charge of your research, without the time-consuming hassle of sourcing participants and dealing with incentive payments and other details.

These options won’t change much in themselves, but we will see them become more siloed. In 2021, surveys will become even more focused on finding the right people before the project starts, to ensure participants don’t rush through tasks just to collect their points, prize draw entries or incentives.

Each of these three options has its place and market. DIY is great for university projects, product reviews and referrals, but you’ll struggle to reach people outside of your circle (whether that is friends, fellow students or users of your product).

Self-serve is a good option for high-volume surveys, but data validity is limited and you get little information about your audience. Unfortunately, on these platforms, participants tend to rush through the tasks on offer to earn points or prizes.

Working with a partner is the most comprehensive option and one that works for everyone, but it comes with additional costs. However, unless you spend a lot of time growing your own database of people and keeping them engaged, achieving the same results via the DIY option will be extremely challenging. On this note, click here to find out more about how People for Research fits into this category and what we can offer.

As surveys develop further into an increasingly useful tool for user researchers, more trends and lists like this one will surely pop up around the web. What about you: what do you think is going to change in the world of surveys in 2021?

Jason Stockwell, Digital Insight Lead

If you would like to find out more about our in-house participant recruitment service for user research or usability testing, get in touch on 0117 921 0008 or info@peopleforresearch.co.uk.

At People for Research, we recruit participants for UX and usability testing and market research. We work with award-winning UX agencies across the UK and partner up with a number of end clients who are leading the way with in-house user experience and insight.