This post is the first of a series focused on our web accessibility initiative. As a company that develops software, including websites, KPR has a responsibility to make sure that those sites and applications are accessible and usable by all users, regardless of disability. And, given the focus of our work on enhancing accessibility for computer users with disabilities, we have an extra imperative to get our web accessibility house in order. We’ve learned a lot through this process and hope that sharing some of those lessons here might help others who want to do something similar.
The first step in the process was to define a procedure, a recipe, that we could use to test the current state of accessibility for all of our websites. This needed to be something we could do ourselves (DIY) in a reasonable amount of time while still yielding good information about accessibility problems. I’ll share our DIY accessibility testing recipe in this post, and we’ll look at the accessibility problems we found and how we fixed them in future posts.
Why accessibility testing now?
I wish I could say that this was part of a long-standing and well-organized process, something like an annual checkup to monitor and enhance accessibility each year. But the truth is that KPR really had never done a thorough accessibility evaluation of our websites before. I’d checked a few of the basic issues now and then (like alt text, contrast ratio, or keyboard access), but largely trusted that simple coding practices and limited use of third-party markup would be “good enough.” It’s embarrassing to admit that, and I sincerely apologize to any users who had difficulties using our websites because of inattention to accessibility details.
One reason why I finally made this a priority may have been attending a TeachAccess “accessibility bootcamp” a few months ago. This summer, when I had the opportunity to hire a student intern for a few weeks, the accessibility topics from the workshop were still fairly fresh in my mind. I decided to use the additional help to tackle accessibility in a more systematic and thorough way, at long last.
The DIY Accessibility Testing Recipe
This recipe evolved from the following goals:
- Test for compliance with WCAG 2.0 Level A. The WCAG (Web Content Accessibility Guidelines) are published by the W3C, the main international standards organization for the Web. So it makes sense to start there.
- Cover all content in the website.
- Be simple enough for an undergraduate summer intern with no previous accessibility experience to follow, given some training and supervision.
We based the recipe on some of the suggestions in the TeachAccess workshop I mentioned above, as well as checklists like this one from the Illinois Dept of Human Services and especially the “wuhcag” website, which presents a very practical approach to meeting the WCAG 2.0 guidelines.
We used the checklist from the wuhcag site to determine testable criteria for each of the 25 Level A guidelines. Then we listed all those criteria in a Google spreadsheet. We’ve shared the spreadsheet, so you can see exactly how it’s set up. You may want to refer to it while you read about the recipe below. (Spoiler alert: the spreadsheet also shows the accessibility testing results for two of our websites.)
Gather testing tools
This recipe relies on some automated testing tools as well as human (manual) testing. For automated testing, we used the following tools:
- W3C unified validator: This combines HTML checking and CSS validation.
- W3C HTML Checker: This is an HTML checker only. We use this when the unified validator gives errors for SVG properties within the CSS files.
- WAVE web accessibility evaluation tool: An automated testing tool that includes, in the words of its developer WebAIM: “as many tests for true accessibility as we can think of, including many checks for compliance issues found in the Section 508 and WCAG 2.1 guidelines.” Available in several forms; we installed it as a Chrome extension for our testing.
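If your site has many pages, the validation step can be scripted rather than run by hand. Here’s a minimal Python sketch that queries the Nu HTML Checker’s public web service, assuming its documented `doc` and `out=json` query parameters; the `summarize` helper is our own addition, not part of the checker:

```python
import json
from urllib.request import Request, urlopen

NU_CHECKER = "https://validator.w3.org/nu/"  # public Nu HTML Checker endpoint

def check_url(page_url):
    """Ask the Nu HTML Checker to validate a page and return its JSON messages."""
    req = Request(f"{NU_CHECKER}?doc={page_url}&out=json",
                  headers={"User-Agent": "diy-a11y-recipe"})
    with urlopen(req) as resp:
        return json.load(resp)["messages"]

def summarize(messages):
    """Tally checker messages by type ('error', 'info', ...) for the spreadsheet."""
    counts = {}
    for msg in messages:
        counts[msg["type"]] = counts.get(msg["type"], 0) + 1
    return counts
```

The per-type counts from `summarize` map directly onto the numbers you’d otherwise copy into the results spreadsheet by hand.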
For the manual testing, you’ll need a screen reader. We performed our testing on a MacBook Pro, which has the VoiceOver screen reader built in. For Windows, you will have to decide which screen reader to use: Narrator, which is built into Windows 10, or NVDA, which is also free. Jamie Pauls wrote a review that may be helpful in making this decision. We’ll refer to “VoiceOver” in our recipe below, but substitute whatever screen reader you are using.
If you aren’t already experienced using a screen reader, you’ll need to get up to speed with the basics of operation. We relied heavily on WebAIM’s article on VoiceOver to learn the basics and as a reference to keyboard commands. Spend a bit of time using the screen reader to read a straightforward website before beginning your testing.
Quick Check on the Home page
As the first part of the accessibility testing, conduct a quick check on the Home page. If the page can’t pass these screening steps, you may want to address the issues identified before proceeding with more detailed testing. The quick check involves four steps:
- Run WAVE on the Home page to determine a baseline of errors, alerts, contrast errors, and other issues.
- Use WAVE to view the Home page without CSS styles and ensure that there is still a logical order to the page content.
- View the Home page in grayscale to check that the content is still distinguishable without relying on color alone.
- Read the Home page using VoiceOver to ensure that it starts talking at a sensible place and covers everything on the page.
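WAVE reports contrast errors automatically, but it helps to understand the number behind them. WCAG 2.0 defines a contrast ratio based on the relative luminance of the two colors; here’s a short Python sketch of that formula (the function names are ours):

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB color (0-255 channels), per WCAG 2.0."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, ranging from 1:1 up to 21:1."""
    lighter, darker = sorted((relative_luminance(fg),
                              relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum ratio of 21:1;
# WCAG 2.0 Level AA requires at least 4.5:1 for normal-size text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A quick calculation like this is handy when WAVE flags a contrast error and you want to check how far a proposed replacement color is from passing.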
Detailed testing for every page
After the quick check, you’re ready to do more detailed testing. You’ll repeat these tests for every page of the website. Start with automated testing using the W3C validator to check for things like correct HTML markup.
Then run the WAVE extension on the page and record the number of errors, alerts, and contrast errors. Use the detailed tab of the WAVE window to get information on the types of errors, if any, and suggestions for how to fix them. (We’ll have more about fixing accessibility issues in future posts.)
The final stage is to perform manual (human) testing on every page for all of the WCAG Level A guidelines. Here’s where the spreadsheet with testable criteria comes in handy. If web accessibility is new to you (or the intern running the tests), the information on the wuhcag site is very helpful for understanding the meaning of each guideline. Once you’ve learned some basics, it’s quite straightforward to repeat the process for all the remaining pages on your website.
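If you’d rather keep the results in code than in a spreadsheet, the same pages-by-criteria grid is easy to represent. A hypothetical Python sketch (the page names and the handful of criteria shown are placeholders; the full Level A list has 25):

```python
import csv

# Hypothetical page names and a few real WCAG 2.0 Level A success criteria,
# mirroring the rows and columns of the testing spreadsheet.
pages = ["Home", "About", "Contact"]
criteria = ["1.1.1 Non-text Content", "1.3.1 Info and Relationships",
            "2.1.1 Keyboard", "2.4.2 Page Titled"]

# results[criterion][page] holds "pass", "fail", or a free-form note.
results = {c: {p: "" for p in pages} for c in criteria}
results["2.1.1 Keyboard"]["Home"] = "pass"

def write_results(path):
    """Write the grid as CSV: one row per criterion, one column per page."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Criterion"] + pages)
        for c in criteria:
            writer.writerow([c] + [results[c][p] for p in pages])
```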
Use this handy template
Our test results spreadsheet shows how we set up our tests and recorded their results. The tabs labeled ‘KPR site’ and ‘AT-node’ show you the results for two of our websites, with problems that need further attention generally highlighted in yellow.
We’ve shared a blank template as well, which you can use as the basis for your own tests. To use it, you’ll need an editable copy: use File > Download to save an .xlsx file, then either open it in Excel or import the .xlsx file into a new Google Sheet that you create. (If you have any questions, or if anyone has a better way to share templates like this, please let me know!)
Once you have an editable template sheet, just replace the column headings in the Detailed Testing section with the pages from your own site. Work your way through the tests in the template, recording the results as you go. And that’s it!
Strengths and limitations
This is a fairly informal test procedure, but it did help us efficiently identify a number of important accessibility issues. Equally important is the fact that a novice intern could run the testing in a repeatable, accurate fashion after an hour or so of introduction.
Achieving WCAG 2.0 Level A is only a starting point, and certainly the next steps would be to add the criteria for Levels AA and AAA to the recipe. The current template supports that, and the wuhcag site provides similar guidance for those higher levels. Some Level AA and AAA items are tested for with WAVE, so an absence of WAVE errors means that we also have a degree of Level AA and AAA compliance. But we didn’t do any specific human testing for those higher levels.
Another limitation in our human testing is that we did not involve anyone who actually relies on accessibility features like VoiceOver or keyboard-only navigation. Particularly for screen reader use, it can be hard to know for sure whether the screen reader is saying the “right” thing. Our human tests are a good start, but not as good as if we had experienced users do the same tests.
Overall this recipe moved us further down the road in the quest for full web accessibility for all KPR websites. It’s not perfect, and it’s not a substitute for a more thorough audit by an outside accessibility team such as Deque Systems, which would be a great next step. However, it’s a strong beginning that proved to be practical to implement. If you have suggestions for improving this recipe, please let us know!
Other resources for accessibility testing
In addition to the resources already mentioned, there are other checklists that we found later in the process, which you might find useful. These cover a lot of the same material, as you’d expect, but do have some differences. Take a look at the a11yproject checklist and A List Apart’s web accessibility blueprint.
Do you have an accessibility testing approach that works for you? Does this recipe seem like it might be useful?