Accessibility Audit

Anticipated Duration (First Time)
2-8 hours
Anticipated Duration (Ongoing)
1-4 hours
🚧
This is a new procedure and is subject to change and interpretation. Use your best judgment when applying it, keeping the end goals in mind.

Overview

While accessibility is an amorphous concept, the Web Content Accessibility Guidelines (WCAG) provide a rubric of commonly accepted accessibility measures and a good starting point. If a site complies with WCAG's “AA” standard, it is probably reasonably accessible for most users. It is important for us to verify that this is the case on our own sites, and to be able to tell other people what a site's accessibility level is.

In some cases, the client may wish to deviate from the AA standard to achieve a specific design goal. At the very least, we should consistently know which parts of the standard a site does not meet.

We use the axe Chrome extension for automated testing. The axe extension catches the low-hanging fruit: problems that can be automatically detected. Manual testing follows up on axe's reports and does the hard work of making sure that a real user's experience will be positive.

For manual testing, we have a customized checklist for checking WCAG AA compliance, heavily modified from earlier work by Yale University. Criteria are phrased as pass/fail questions, with the intention of guiding testing and being easy to understand for inexperienced auditors.

Who can do this

You need some degree of technical knowledge to check the more nuanced rules. Many rules are also best checked through the codebase, not from the frontend. There are plenty of rules that can be checked without that knowledge, though. If you are a developer tasked with this procedure, you may be able to delegate some of the less technical checks to a colleague.

This procedure involves use of a screen reader, specifically NonVisual Desktop Access (NVDA) on Windows PCs. NVDA shortcuts are provided in the procedures when necessary. However, regular auditors are expected and encouraged to become comfortable with screen readers through use and practice. A cheat sheet of NVDA shortcuts can be found here.

What you need

  • Knowledge of which pages should be focused on during browser testing. This will be referred to as the "Focus Set".
  • A task in Asana to log content & design problems for your PM.
  • The WCAG AA compliance checklist.
  • Use of BrowserStack may be necessary for touch & mobile testing.
  • NVDA screen reader installed on PCs.
  • ⚠️
    This procedure and keyboard shortcuts described assume a Windows user, using the NVDA screen reader with Google Chrome
  • Extensions:
    1. ⚠️
      These extensions can see your browsing data, so consider setting up a new user profile in your Chrome installation in order to add them, or install them just for the test and uninstall it quickly after.
    2. axe Chrome extension installed.
    3. Web Developer Chrome extension installed.

Steps

Setup

💡
Since many site elements are shared between pages (referred to as "Global Components" in site documentation), you may notice "repeat offenders" - a particular element for a particular issue appearing over and over in axe's reports. You only need to report an element once per issue.

Before you start, create a new audit report in the table, using the template.

Automated Testing

⚠️
This section was written for axe Chrome extension version 4.6.2.
  1. Ensure "Best Practices" are enabled in axe's settings.
  2. For each page in the Focus Set, run the axe extension, using "scan all of my page".
  3. Axe sorts violations into five categories: Critical, Serious, Moderate, Minor, and Review.
  4. Report all violations in Critical, Serious, Moderate, and Minor.
  5. Manually inspect each issue in Review, reporting or dismissing as appropriate. For instance, color contrast violations may have been deemed acceptable in an earlier audit. If you are unsure, err on the side of reporting.
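
If clicking through every Focus Set page becomes tedious, the same checks can be scripted with the axe-core library that powers the extension. A minimal sketch, assuming axe-core 4.6.2 is loaded from a CDN on the page under test:

```html
<script src="https://cdn.jsdelivr.net/npm/axe-core@4.6.2/axe.min.js"></script>
<script>
  // Mirror the extension settings: WCAG 2 A/AA rules plus "Best Practices"
  axe.run(document, {
    runOnly: { type: 'tag', values: ['wcag2a', 'wcag2aa', 'best-practice'] }
  }).then(function (results) {
    // results.violations corresponds to Critical/Serious/Moderate/Minor;
    // results.incomplete corresponds to the "Review" category
    console.table(results.violations.map(function (v) {
      return { rule: v.id, impact: v.impact, elements: v.nodes.length };
    }));
  });
</script>
```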

Manual Testing

📖
Use the checklist's questions to further guide your testing. As you confirm standards are being met, check each checkbox on the checklist. Report any violations.
💡
Be willing to report frustrating experiences, even if the content technically passes scrutiny.

Testing Keyboard Operation

  1. Start tests with your keyboard's focus on the address bar/URL bar of your browser.
  2. From there, make your way forward through the page by pressing only the Tab key, and backwards using Shift + Tab.
  3. Note the appearance of the keyboard focus highlight. Note which elements receive keyboard focus, and in which order. Check that every element that should be focusable receives focus.
  4. Check all interactive elements. Check that elements with items inside of them, like drop-downs, can be operated by keyboard: arrow keys to cycle through the items inside, Enter or Space to confirm an item.
  5. Check elements that produce new content, like dialogs or modals. Ensure focus moves to the new content, that focus remains strictly within dialogs, and that the content can be dismissed.
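
A common source of keyboard failures is an interactive control built from a generic element. A minimal before/after sketch (openMenu() is a hypothetical handler):

```html
<!-- Fails: never receives Tab focus and ignores Enter/Space -->
<div class="menu-toggle" onclick="openMenu()">Menu</div>

<!-- Passes: a native button is focusable and keyboard-operable by default -->
<button type="button" class="menu-toggle" onclick="openMenu()">Menu</button>
```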

Testing Accessible Text

Testing Image Alt Text Quality:
  1. Open the Web Developer extension
  2. Select Images -> View Image Information
  3. Using the questions in the checklist as a guide, verify each image has appropriate text alternatives as necessary.
  4. Select Images -> Outline Background Images
  5. Locate CSS background images. Verify each is purely decorative or has an appropriate text alternative.
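
For reference, the patterns being verified look roughly like this (file and class names are hypothetical):

```html
<!-- Informative image: the alt text conveys the image's content -->
<img src="ceo-portrait.jpg" alt="Portrait of CEO Jane Doe">

<!-- Decorative image: an empty alt tells screen readers to skip it -->
<img src="divider-flourish.png" alt="">

<!-- CSS background images are invisible to screen readers, so they should
     be purely decorative; meaningful ones belong in an <img> instead -->
<div class="hero" style="background-image: url('hero.jpg');"></div>
```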

Testing Link Text Quality:
  1. Open NVDA
  2. Press Insert + F7 to generate a list of elements. View all links.
  3. Check if the link text adequately describes its purpose out-of-context.
    1. If a link's text by itself does not convey purpose, check if tabbing to it announces context.
    2. If tabbing does not announce contextual information, check the nearby context of the link, such as sentence or paragraph text, list item, or table header.
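
As a quick reference, the failing and passing patterns look like this (URLs and ids are hypothetical):

```html
<!-- Fails out of context: NVDA's links list will read only "Click here" -->
<a href="/annual-report.pdf">Click here</a>

<!-- Passes: the purpose is clear from the link text alone -->
<a href="/annual-report.pdf">Download the annual report (PDF)</a>

<!-- Acceptable fallback: context is announced when tabbing to the link -->
<p id="report-blurb">Our annual report covers the full fiscal year.</p>
<a href="/annual-report.pdf" aria-describedby="report-blurb">Download (PDF)</a>
```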

Testing Presentation of Content

Common Component Testing:
💡
While templating makes inconsistencies across pages unlikely, it is still good practice to closely examine common components. If a common component is frustrating to use accessibly, that experience is worth reporting.
  1. Check for a "skip to main content" link on each page. It should be the first focusable item on the page. You may have noticed it while testing keyboard operation.
  2. Look for components that are repeated across pages: page headers and footers, navigation menus, search fields, and so on. Access to the site templates can help with this.
  3. Confirm navigation consistency. Navigation should be in consistent places, and link lists in the same order, on every page.
  4. Verify that each component, and the elements that comprise it, have a consistent function and consistent label, name, or text alternative.
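
For comparison while testing, a typical skip link implementation looks like this (the class name and CSS are illustrative):

```html
<style>
  /* Hidden off-screen until focused, so it appears for keyboard users first */
  .skip-link { position: absolute; left: -9999px; }
  .skip-link:focus { left: 0; }
</style>
<body>
  <a class="skip-link" href="#main">Skip to main content</a>
  <header>…</header>
  <main id="main">…</main>
</body>
```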
Testing Specific Content, Graphics:
  1. Using the Web Developer extension, select Images -> Outline All Images
  2. Using the image outline as a guide, carefully examine the page.
  3. Check that none of the images contain text as content.
  4. Examine any graphics and widgets for automatic moving, blinking, scrolling, or other animation. Verify that automatically animating items have a way to pause, stop, or hide the animation.
  5. Examine pages for flashing content. Count flashes per second. Verify there are fewer than three flashes per second.
  6. Confirm that, in general, no information is conveyed solely through icons or symbols and that text alternatives are available. Confirm screen reader accessibility with NVDA if necessary.
Testing Specific Content, Text:
  1. Carefully read through the page.
  2. Look for places on the page that use sensory characteristics: describing colors, shapes, sizes, or other visual characteristics of an item.
  3. Confirm that the information is also available in text.
Testing Specific Content, Hover and Focus:
💡
Note that any content that appears only on hover (and not on focus) is a failure of Keyboard Operations and should be reported as such.
  1. Look for all instances where content appears on focus or on hover.
  2. Confirm that hover content can be dismissed with the Escape key without moving the pointer or focus.
  3. Confirm that the pointer can be moved over the hover content without the content disappearing.
  4. Confirm the content remains until dismissed, focus changes, or the content loses relevance.
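
A sketch of a tooltip pattern that satisfies these checks; the show-on-hover/focus logic is omitted, and the ids are hypothetical:

```html
<button type="button" aria-describedby="tip">Info</button>
<div role="tooltip" id="tip" hidden>Extra detail about this control.</div>
<script>
  // Escape dismisses the tooltip without moving pointer or focus (check 2);
  // because the tooltip sits adjacent to its trigger, hovering the tooltip
  // itself can keep it open (check 3)
  document.addEventListener('keydown', function (e) {
    if (e.key === 'Escape') document.getElementById('tip').hidden = true;
  });
</script>
```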

Testing Landmarks, Headings, Semantics, and Source

  1. Open NVDA.
  2. Generate a list of elements with Insert + F7. Switch to the list of landmarks.
  3. Landmarks should be present, descriptive, and cover all regions of the page. Axe's automated scan should flag content outside of landmarks.
  4. The elements list can also show the list of headings.
  5. Confirm headings are present, descriptive, and form a general outline of the page content.
  6. Search the page for any data laid out as a table. Confirm use of HTML table elements.
  7. Open the page's source code. You can also use Inspect Element for the following steps, or view templates in a repository.
  8. Skim through the source code. It should read and be ordered in a logical manner.
  9. Confirm navigation links are marked up semantically using a <nav> element or role="navigation" attribute.
  10. Search for frames and iframes and check for descriptive titles. Keep in mind that iframe titles are often beyond our control to properly address.
  11. Search for article and section elements and confirm heading elements in them.
  12. Search for div elements and confirm they are used only as generic containers with no semantic meaning.
  13. Inspect links and buttons; confirm use of real <a> and <button> elements as appropriate.
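
The semantic skeleton being checked for looks roughly like this (headings and content are illustrative):

```html
<header>…</header>
<nav aria-label="Primary">…</nav>
<main>
  <section aria-labelledby="news-heading">
    <h2 id="news-heading">News</h2>
    …
  </section>
  <!-- Data laid out as a table uses real table semantics -->
  <table>
    <caption>Office hours</caption>
    <tr><th scope="col">Day</th><th scope="col">Hours</th></tr>
    <tr><td>Monday</td><td>9-5</td></tr>
  </table>
</main>
<footer>…</footer>
```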

Testing Forms

  1. Check the label for each input of the form.
    1. Labels must have a visual component. An icon with alternative text is acceptable.
    2. Placeholder text in the form is, by itself, insufficient labeling and should be reported.
    3. Groups of labels need an additional programmatic association for the entire group.
  2. Check the form for required fields.
    1. An asterisk (*) alone is barely passing.
    2. It is okay if the form marks optional fields instead. The same standards apply.
  3. Check form instructions. Forms should describe required input, and any inline help should be programmatically associated with fields using aria-describedby.
  4. Open NVDA. Browse through the form. Ensure you hear all instructions and labels for the form. You should be able to understand and fill it out with only the information provided by NVDA.
  5. Intentionally submit the form with invalid data. Check the form's error handling.
    1. You may want to do this several times to check various errors.
    2. Screen readers should immediately read error feedback as well. Focus may be shifted to the first erroneous field, or a link to said field provided.
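
A form sketch that passes the label, grouping, and instruction checks above (field names and help text are hypothetical):

```html
<form>
  <!-- Visible label programmatically tied to the input; inline help is
       associated via aria-describedby -->
  <label for="email">Email (required)</label>
  <input id="email" name="email" type="email" required
         aria-describedby="email-help">
  <p id="email-help">We only use this to send your receipt.</p>

  <!-- A group of related inputs gets a group-level association -->
  <fieldset>
    <legend>Shipping method</legend>
    <label><input type="radio" name="shipping" value="standard"> Standard</label>
    <label><input type="radio" name="shipping" value="express"> Express</label>
  </fieldset>
</form>
```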

Testing Widgets and Operation

💡
These criteria are more technical and implementation-focused. Less technically-knowledgeable auditors may need assistance here.
  1. Identify if the page has any content with a time-out function. Check the time limit and the possibility of extending it.
  2. Check page controls that cause irreversible actions using your mouse. Check for at least one of the following:
    1. If you click the control, it only triggers on mouse release - the up-event.
    2. You can reverse or abort the trigger by moving the mouse outside of the control before releasing.
    3. There is a prompt allowing you to confirm or abort the control.
  3. Using your keyboard, tab through the page. Check for any involuntary change of context, like a major change to the page or changing your focus.
  4. Check the page for status messages. A status message is a message that informs of the success or failure of an action. See here for examples of status messages.
  5. Open NVDA.
  6. Activate the status message and confirm NVDA reads it on change.
  7. Check each UI component with NVDA. Confirm NVDA reads the name and role of each component, and its current value if any. Validate that any custom components are fully compatible with NVDA.
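
Status messages are usually implemented as live regions. A minimal sketch (the element id and handler are hypothetical):

```html
<div role="status" id="cart-status"></div>
<script>
  // role="status" implies aria-live="polite": NVDA announces the new text
  // without moving the user's focus
  function announceAdded(itemName) {
    document.getElementById('cart-status').textContent =
      itemName + ' was added to your cart.';
  }
</script>
```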

Testing Dynamic Content (TBD)

Testing CSS & Styling Practices

⚠️
While manually inspecting the CSS itself is beyond the scope of this test (it should likely happen during the design phase), some tests for zoom and responsiveness can point to potential CSS problems:
  1. Set the browser's viewport to 320 pixels wide. Confirm the site is responsive and shows all content and functionality at this width without horizontal scrolling.
    1. Some content, such as maps or tables, may acceptably require horizontal scrolling.
  2. Resize the browser's viewport to a desktop resolution. 1366x768 is recommended.
  3. Use the browser's zoom function to zoom the page to 200% size.
  4. Comparing the page to the default 100% zoom, confirm that all content and functionality remain.
    1. If content or functionality is lost, confirm that it can be accessed with one click.
    2. Horizontal scrolling is acceptable, but not desired.
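
When these tests fail, fixed pixel widths are a common culprit. A hypothetical rule set that reflows cleanly at 320 pixels and 200% zoom:

```html
<style>
  /* Flexible units and wrapping instead of fixed pixel widths */
  .layout { display: flex; flex-wrap: wrap; gap: 1rem; }
  .layout > * { flex: 1 1 20rem; min-width: 0; }
  img { max-width: 100%; height: auto; }
</style>
```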

Testing Mobile & Touch

💡
'Controls' here refers specifically to controls provided by the site, not by your browser or your phone's OS. Violations here are expected to be rare.
  1. Open the web page on a mobile device.
  2. Ensure the page functions properly in both portrait and landscape orientations.
  3. Confirm all functionality that requires a swiping or multi-point gesture has a single-point alternative.
  4. Confirm either the lack of motion-based triggers, or that such triggers can be disabled or have an alternate control method.
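
For step 3, a swipeable widget passes as long as single-point controls exist alongside the gesture. An illustrative carousel sketch:

```html
<div class="carousel">
  <!-- Swiping between slides is acceptable because these buttons provide
       an equivalent single-point alternative -->
  <button type="button" aria-label="Previous slide">‹</button>
  <ul class="slides">…</ul>
  <button type="button" aria-label="Next slide">›</button>
</div>
```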

Afterword & Miscellaneous

The WCAG, and this procedure, are fairly involved, and implementing accessibility can be tricky in the best of times. We are here to audit and report, but we should also advise and help the rest of the team by suggesting accessible solutions when possible.

It is understandable that time, budget, and other factors can make correcting every violation impractical or impossible. At the same time, violations should be acknowledged at a minimum, even if ultimately not corrected. If a violation cannot be corrected, it is worth noting that down for review. Perhaps a new practice can prevent it from occurring in the future.

Some criteria, especially with multimedia, may be violations that we have little to no control over. Some major examples here would be iframes, embedded videos lacking captions, or a podcast site lacking transcripts. While violations in these areas can be noted and reported, inability to correct them is understandable.

The CMS should be developed with adding accessible content in mind. Auditing the CMS itself for accessible content creation is beyond the scope of this procedure, but is not necessarily off the table. Particular violations occurring repeatedly, such as images lacking alt text, may be a sign of the control panel lacking accessible content fields.

Creating Tasks

Once you create a WCAG audit, the resulting spreadsheet of issues is sufficient to pass along to a project manager and development/design team for remediation. However, it can be helpful for the auditor to provide clarity on the ideal next steps for the remediation. To do this, the auditor can also create tasks in our PM tools to drive the team towards specific actions.

We like to end up with 10-15 tasks per round of remediation. A task can cover multiple WCAG violations (for example, all link text can receive the same contrast fix at once, even if the report lists multiple instances of the violation), and a single violation may require multiple tasks. You can assign all the tasks to the project Strategist in the "To Clarify" status.

The tasks should be titled as a "problem to be solved", e.g. "Contrast on X page hero is not conformant". They should include a "Problem Statement" that describes what is wrong, and any "Solution Ideas" you may have. The Strategist will fill in the rest of each task.

All the tasks you make can go into "On Deck". The strategist will review them and pick some to move into "Current Sprint".

Please choose the work that you think:

  • Has major impact for any individual user (anything that would utterly break a page for a given individual) or has the broadest impact (something that affects user experience for a wide swathe of people)
  • Can be done within the design constraints we have. If we have full control over the design, there are no limits here. However, if we are working with a client design with a lot of stakeholders, we might only be able to accomplish limited changes to the design.