Automated QA for accessibility is not enough

Accessibility testing is roughly 30% automated and 70% manual. There is a common myth that once a site has been checked with automated tools, accessibility testing is complete.

You can build a website that shows zero accessibility errors and warnings and it can still be very hard to read, even for people without disabilities. Take the CSS Zen Garden website as an example. When you test its code with an automated tool such as wave.webaim.org, the tool reports zero validation errors and zero alerts. However, if you actually view the website, you will find it hard to read. So don't rely on automated tools alone; always do a manual QA pass as well.

Report showing zero accessibility errors and alerts

Screenshot of visually inaccessible page
This website has zero accessibility errors or alerts as far as code structure is concerned. However, the contrast ratio is very poor, making the page hard to read. Cases like this need manual human evaluation during QA.
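
Contrast is something a machine can compute when it knows the exact foreground and background colors; what it often cannot judge is the rendered result, for example text over background images or layered designs like the one above. For reference, here is a minimal sketch of the WCAG 2.x contrast-ratio formula in TypeScript (the helper names are mine, not taken from any particular tool):

```typescript
// Minimal sketch of the WCAG 2.x contrast-ratio calculation (helper names are illustrative).

// Convert an 8-bit sRGB channel to linear light, per the WCAG relative-luminance definition.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an [r, g, b] color (each channel 0-255).
function relativeLuminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter luminance over darker.
function contrastRatio(fg: [number, number, number], bg: [number, number, number]): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

// Pale grey text on white: roughly 1.67:1, far below the 4.5:1 WCAG AA minimum for normal text,
// even if the markup itself produces zero structural errors.
console.log(contrastRatio([200, 200, 200], [255, 255, 255]).toFixed(2));
```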

Recommended read:
Accessibility QA best practices (coming soon)

Another example is reading level. According to the WCAG Reading Level criterion, content should be written as clearly and simply as possible. Tools such as Yoast SEO on WordPress websites can help you determine a readability score, but manual checks are necessary here as well, because the machine still doesn't really understand the actual intent of the content.
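
As a rough illustration of what such tools compute, here is a minimal readability sketch based on the Flesch reading-ease formula (an assumption for illustration; Yoast and similar tools may use different or additional metrics). A score like this measures sentence and word length, not whether the content actually says what you intend:

```typescript
// Minimal Flesch reading-ease sketch (hypothetical helper, not Yoast's implementation).
// Higher scores mean "easier to read"; the formula knows nothing about meaning or intent.

// Very rough syllable estimate: count groups of vowels in a word.
function countSyllables(word: string): number {
  const groups = word.toLowerCase().match(/[aeiouy]+/g);
  return Math.max(1, groups ? groups.length : 0);
}

function fleschReadingEase(text: string): number {
  const sentences = text.split(/[.!?]+/).filter((s) => s.trim().length > 0);
  const words = text.split(/\s+/).filter((w) => w.length > 0);
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);

  const wordsPerSentence = words.length / Math.max(1, sentences.length);
  const syllablesPerWord = syllables / Math.max(1, words.length);

  // Standard Flesch reading-ease formula.
  return 206.835 - 1.015 * wordsPerSentence - 84.6 * syllablesPerWord;
}

// Two instructions with very different clarity can still score similarly:
console.log(fleschReadingEase("Click the blue button to save your work."));
console.log(fleschReadingEase("Press the azure widget to persist your labor."));
```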

Syrian refugee girl playing jump rope in playground
Yet another example is alt text. An automated tool will tell you that an alt text is present, but it cannot tell you whether the text correctly describes the image, or whether the alt text should be empty, as for a decorative image. I have seen developers simply copy the title of the section where the image is used as the alt text, which makes assistive technologies read the same text twice and is annoying to users of ATs. For example, for the accompanying image, a good alt text would be "Syrian refugee girl playing jump rope in playground". I would not be surprised to see developers use alt text such as "Syrian Refugee Girl", which is not incorrect but does not give people using assistive technologies a clear picture of the photo. Our goal here is to unify the experience for all users and make it better.
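
To see why this is inherently a manual judgment, here is a minimal browser-side sketch (hypothetical, not any real tool's code) of the kind of check automation can actually perform on alt text. It can flag a missing or empty alt attribute, but it has no way to tell a meaningful description from a copied section title:

```typescript
// Sketch of what automation can and cannot verify about alt text.
// Runs in the browser against the current document.

document.querySelectorAll("img").forEach((img) => {
  const alt = img.getAttribute("alt");

  if (alt === null) {
    // Detectable by a machine: the attribute is missing entirely.
    console.warn("Missing alt attribute:", img.src);
  } else if (alt.trim() === "") {
    // Also detectable: empty alt, which is correct only if the image is decorative.
    // Whether the image really is decorative is a human judgment.
    console.info("Empty alt (decorative?):", img.src);
  } else {
    // Not decidable by a machine: is "Syrian Refugee Girl" a good description,
    // or just the nearby section title copied into the attribute?
    console.info("Alt present, quality unknown:", alt);
  }
});
```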

Automation should be a part of the overall accessibility QA process

You can and should use automated tools to expedite the accessibility QA process and catch errors such as broken code structure, missing alt text, insufficient color contrast, missing ARIA labels, and so on, but please do not rely on automated tools alone for testing. Maybe automated QA tools will become the expert in the future, but as of today you have to rely on manual testing to ensure that your website is accessible.
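
As one example of how automation can fit into the process, here is a minimal sketch using the open-source axe-core engine (the library and its axe.run API are real; the surrounding wiring is just an illustrative assumption, not a prescribed setup). It reports detectable violations such as missing alt text or insufficient contrast, while everything it cannot detect still needs a human:

```typescript
// Minimal sketch: run axe-core in the browser and log any violations.
import axe from "axe-core"; // assumes a bundler with esModuleInterop; axe is also a global when loaded via <script>

async function runAccessibilityScan(): Promise<void> {
  // Analyze the current document against axe-core's default rule set.
  const results = await axe.run(document);

  for (const violation of results.violations) {
    console.warn(`${violation.id}: ${violation.help}`);
    for (const node of violation.nodes) {
      console.warn("  affected element:", node.html);
    }
  }

  // Zero violations does NOT mean the page is accessible;
  // readability, alt text quality, and overall usability still need manual QA.
  console.log(`Automated scan found ${results.violations.length} violation(s).`);
}

runAccessibilityScan().catch(console.error);
```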

You need to remind the teams involved in producing designs, content, and code about accessibility best practices every now and then, for example through regular training, events, and fun-based emails. I will be discussing some ideas in my upcoming post on continuous accessibility education.

If you have any thoughts or comments, I would love to discuss!
