18th March 2020
The idea behind accessibility in surveys is a simple one: the easier your survey is to perceive, understand, navigate and interact with, the more likely the user is to contribute to it. However, accessibility and survey design don’t always work in tandem: question design options like sliders and drag-and-drop features can translate into style over substance.
Accessibility and UX design should work closely together, with UX and UI designers ensuring all users have consistent digital experiences that meet their needs. The fundamentals laid out in the WCAG (Web Content Accessibility Guidelines) have helped make the digital sphere into a more inclusive place, but there is still work to be done. This post is about how those guidelines relate to unmoderated online surveys and tasks and how to apply best practice recommendations.
Let’s start with this. When considering accessibility in the digital world, there is an important distinction to draw: permanent disabilities versus temporary limitations.
Categories of permanent disability include visual, auditory, motor and cognitive impairments.
Examples of temporary limitations include a broken arm, an eye infection, a noisy or brightly lit environment, or a slow internet connection.
There are ways of tailoring surveys around each of these factors to make them more inclusive. We can break down elements and improvements to surveys into three simple categories, following the advice and recommendations of the W3C Web Accessibility Initiative: design, content and development.
Disclaimer! There are more design considerations than the ones we are listing in this post, but these recommendations are a good place to start your accessibility journey.
▪️ Ensure there is enough contrast between background and font
So, this may be an extreme example, but reading grey on white is a challenge for a lot of people. Some campaigns may have the option to run on a different background to improve readability for individuals with dyslexia. When this is the case, it is worth testing the contrast between the background and the font for all reading abilities.
The WCAG AA recommendation for normal-size text is a contrast ratio of at least 4.5:1 between text and background. If you’re anything like me, judging that ratio by eye is a challenge, so a tool like Contrast Ratio helps hugely: enter the HEX values of the background and font colours and the tool calculates the contrast for you.
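For reference, WCAG 2.x defines the contrast ratio as (L1 + 0.05) / (L2 + 0.05), where L1 and L2 are the relative luminances of the lighter and darker colours. Here is a small JavaScript sketch of that calculation, using the standard sRGB luminance formula:

```javascript
// WCAG 2.x relative luminance for an sRGB hex colour like "#767676"
function luminance(hex) {
  const [r, g, b] = [1, 3, 5]
    .map(i => parseInt(hex.slice(i, i + 2), 16) / 255)
    .map(c => (c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4)));
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio between two colours: (lighter + 0.05) / (darker + 0.05)
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

console.log(contrastRatio("#000000", "#ffffff").toFixed(2)); // 21.00
console.log(contrastRatio("#767676", "#ffffff") >= 4.5);     // true (passes AA)
```

Grey #767676 on white just clears the 4.5:1 threshold, while lighter greys fail, which is exactly why grey-on-white body text so often causes problems.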
▪️ Add extra information that doesn’t rely on colour
This example highlights the required fields in red. By adding a symbol to the required fields (in this case, an asterisk) you give the content more context.
This not only makes it accessible for individuals with colourblindness, but also helps individuals with different screen brightness levels and other visual impairments.
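As a minimal sketch, a required field can carry both the colour cue and a text symbol in the markup (the field name here is just an example):

```html
<!-- Required field marked with a text symbol as well as colour -->
<label for="email">Email address <span aria-hidden="true">*</span></label>
<input id="email" type="email" name="email" required aria-required="true">

<p>Fields marked with an asterisk (*) are required.</p>
```

The aria-hidden attribute stops screen readers announcing the asterisk as "star", while the required and aria-required attributes convey the same information programmatically.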
▪️ Make links and next steps for participants obvious
This tip will make your surveys easier to complete and ultimately mean participants are less likely to drop out. If there is extra information required or an external link, include the link text in another colour with an underline.
This point, however, doesn’t just apply to links. If you have interactive content like videos or audio files in your surveys, you will need to consider the ease of use of these elements.
Finally, display ‘next’ and ‘submit’ buttons at the bottom of the page in a contrasting colour, with a progress bar if appropriate.
▪️ Make each survey page consistent
Consistency is key! Each page of your survey will need to be familiar to your participant, and we are not just talking about colour; this goes right the way through to page layout, headings, semantics and image placement on the survey.
▪️ Design for every device (kind of…)
OK, maybe not every device. Participants won’t fill your survey out on their smart fridge, but with wearables and smart TVs becoming more popular, it’s worth testing on some devices you might not have previously considered. Of course, designing for different mobile devices is now an absolute minimum requirement.
One top consideration is where you are distributing your surveys. If you’re sending mobile notifications, you will want to tailor your surveys for a mobile audience: Google Analytics and reporting software will show you which devices people use to access your content, so you can shape the survey design accordingly.
One device that always seems to get missed is my Chromebook; I’m frequently served mobile content when using it. With the growing popularity of smaller screens and touch interfaces, designing in a responsive way is something you need to focus on.
▪️ Give users control over auto-playing multimedia files
Probably my biggest bugbear on this list is videos and sound files that auto-play in browsers. Thankfully, W3C recommends adding controls for content that plays automatically. This is relevant because a lot of qualitative user research may involve showing screenshots and instructional videos to the users, so it’s essential to give them the option to play it in their own time – also, don’t forget to add transcripts or captions to the video.
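In markup terms, that means omitting the autoplay attribute and exposing controls and captions. A minimal sketch (the file names are placeholders):

```html
<!-- No autoplay: the participant starts playback in their own time -->
<video controls preload="metadata">
  <source src="instructions.mp4" type="video/mp4">
  <!-- Captions in WebVTT format for participants who can't hear the audio -->
  <track kind="captions" src="instructions.en.vtt" srclang="en" label="English">
  Your browser does not support embedded video.
</video>
```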
The copy you write for the survey is as important as the survey itself. Promotional literature, instructions for completion and the questions themselves need to be based on clear communication and consistent language. Here are six important tips.
▪️ Make the purpose of separate sections clear
So, this example may be from a form, but the same principle still applies. Label the progress an individual makes through a survey and add subheadings for sections to break up questions and make the survey easy to follow.
▪️ Link the text that contains relevant information
Write the link text so that it describes the content of the participant’s next step. You can also add more context to the link, e.g. ‘(this will open a new tab)’ or ‘(PDF, 5MB)’.
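As a sketch (the file name and URL are made up), that might look like:

```html
<!-- Link text describes the destination, with format and behaviour context -->
<a href="participant-guide.pdf">Read the participant guide (PDF, 5MB)</a>

<a href="https://example.com/consent" target="_blank" rel="noopener">
  View the consent form (this will open a new tab)
</a>
```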
▪️ Make image ALT text relevant to images
ALT text has been around long enough that most people know it matters. Screen reader users, however, often need more context, especially with instructional images. Rather than describing what the image is, focus on the information you want the image to communicate.
ALT text example:
ALT text for the image on the right might be ‘sitting at a table’. However, this is not enough context if the question references the image and you need someone with a screen reader to be able to fully understand what is depicted.
A stronger example would be ‘A male and a female figure sitting at a table for coffee’. This gives the user more of an indication around what the image is displaying and ensures the data you collect is more accurate and complete.
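In the markup, the difference is just the alt attribute (the file name here is hypothetical):

```html
<!-- Weak: describes the file, not the information it carries -->
<img src="coffee-scene.jpg" alt="sitting at a table">

<!-- Stronger: conveys what the question actually relies on -->
<img src="coffee-scene.jpg"
     alt="A male and a female figure sitting at a table for coffee">
```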
▪️ Add captions to multimedia elements
When displaying video or audio, ensure the captions used to communicate voices, actions and relevant background sounds work properly. You may want to layer different colours if multiple people are communicating and add emotional context to speech if it is relevant to the survey.
▪️ Make instructions short and clear
And don’t forget to avoid jargon/technical expressions and complex sentence structures.
▪️ Use lists and iconography to strengthen the message
People take in lists and bullet points quicker than paragraphs of text, and icons give your words more context while serving as a visual aid to break down text. Work with both to transform the way your surveys communicate the overall message to the participants.
Finally, we go beyond the content and the design. The final stage of creating your survey is building it, so here are our top tips to help you create an accessible and inclusive product.
▪️ Use labels and input tags
Use the <label> element and link it to the input’s id attribute, as this gives screen readers context about each form field. Here is an example of how you can do it:
<label for="firstname">First Name</label>
<input id="firstname" type="text" name="firstname">
▪️ Error messages and guidance text
The lack of proper guidance and error messages is another infuriating element of form design that has made its way over to surveys. To avoid this, pair each error message with guidance text that tells the participant exactly what is required. Take your time to get the placement, language and clarity right.
Here is a great article by HubSpot on why error messages are important.
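One way to wire guidance text to a field, sketched here with a hypothetical postcode question, is aria-describedby plus role="alert", so assistive technology reads the message out:

```html
<label for="postcode">Postcode</label>
<input id="postcode" name="postcode" type="text"
       aria-describedby="postcode-error" aria-invalid="true">
<!-- role="alert" makes screen readers announce the message when it appears -->
<p id="postcode-error" role="alert">
  Please enter a valid UK postcode, for example BS1 4DJ.
</p>
```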
▪️ Code for multiple devices
There is a reason responsive design is mentioned twice in this article: because it’s so important. The first step refers to designing the survey to work across multiple devices and the second serves to remind you to build your online task with this goal in mind.
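At a minimum, that means a viewport meta tag and media queries. A minimal sketch (the class name is made up):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .survey-question { max-width: 40rem; margin: 0 auto; }
  /* Larger text and touch targets on small screens */
  @media (max-width: 480px) {
    .survey-question { font-size: 1.125rem; }
    .survey-question button { min-height: 44px; }
  }
</style>
```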
▪️ Avoid CAPTCHA if possible
This may seem a little rogue, but CAPTCHAs cause real accessibility problems in surveys. Depending on the platform you are using to run your task, research pre-qualification options that don’t involve identifying traffic lights. This is especially relevant if you’ve invited participants with access needs to fill out a survey, or if this is the audience you are looking to target.
▪️ Avoid common tools that don’t meet compliance
Some question types appeal to user researchers from a design perspective, but that doesn’t mean they’re easy for disabled participants to use, whether or not they rely on assistive technology.
Sliders, rank order questions and drag-and-drop matrix tables don’t meet the standards set by the WAI; however superbly they are designed and built, not all users will be able to operate them or understand what they are meant to do.
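A plain radio group is one accessible alternative to a rating slider; here is a sketch with an invented question:

```html
<fieldset>
  <legend>How satisfied are you with the checkout process?</legend>
  <label><input type="radio" name="satisfaction" value="1"> 1 – Very dissatisfied</label>
  <label><input type="radio" name="satisfaction" value="2"> 2</label>
  <label><input type="radio" name="satisfaction" value="3"> 3 – Neutral</label>
  <label><input type="radio" name="satisfaction" value="4"> 4</label>
  <label><input type="radio" name="satisfaction" value="5"> 5 – Very satisfied</label>
</fieldset>
```

Native radio buttons are keyboard-navigable and announced correctly by screen readers out of the box, which custom slider widgets rarely are.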
There are a couple of fantastic tools out there to help you test the accessibility of an online survey or task.
🛠 WAVE Web Accessibility Evaluation Tool
WAVE helps authors make their content more accessible to individuals with disabilities.
🛠 W3C Markup Validation Service
The W3C tool checks the markup behind your survey to ensure the underlying code is valid.
When testing for accessibility, it’s worth continuously tracking how your surveys perform: look at drop-off rates, heatmaps and the questions individuals struggle to answer. If you have the time and the resources, send your survey to a small group of participants and analyse the results and behaviour captured before distributing it to a bigger audience.
When running a survey, card sort or tree test for our clients as part of our user recruitment service for remote unmoderated projects, we find that testing an online task with at least three participants generates a lot of useful feedback and improves the final product before it is sent to thousands of people. Once it’s out there, there is little you can do; often the only option is to re-recruit, something you should definitely try to avoid for a long list of reasons. If you are at the initial stages of designing your remote unmoderated research tasks, we have a blog on five important things to consider during the planning stage that could be useful.
What about you: what’s your experience with accessibility in surveys? What have you found works and doesn’t work for your audiences?
Jason Stockwell, Digital Insight Lead
If you would like to find out more about our in-house participant recruitment service for user research or usability testing get in touch on 0117 921 0008 or firstname.lastname@example.org.
At People for Research, we recruit participants for UX and usability testing and market research. We work with award winning UX agencies across the UK and partner up with a number of end clients who are leading the way with in-house user experience and insight.