Perform a Live Accessibility Test

This procedure is adapted from:

Perform a Live Usability Test

Please read through that before performing this test, unless you are already familiar with our method for running usability tests.

Accessibility is a special category of usability that covers cases where the user has a condition that affects their ability to use a computer. This obviously includes permanent disabilities like blindness or motor impairments, but it also includes temporary disabilities like a broken hand, and “situational” disabilities such as holding a baby and therefore having only one hand free, or sitting in a sunny room that reduces the visibility of the monitor’s display. For more on the categories and prevalence of disabilities among internet users, see this report from the W3C.

Cantilever is focused on delivering Digital Hospitality for every user regardless of their situation, so we are beginning to test explicitly for accessibility. This guide covers how to undertake such tests. Accommodating people who use keyboard navigation is not too tricky, since we can and do test keyboard navigation pathways ourselves, but when it comes to users who rely on screen readers or alternate input devices, it is harder for us to truly understand their experience. While we can use a screen reader ourselves, we are not accustomed to using one daily, so our usage patterns and expectations differ from those of an actual daily user. This makes accessibility a great use of live human testing: we get direct feedback based not on a ruleset or "best practices" but on the tastes of the actual human beings we want to serve.
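Live testing can be supplemented by a quick automated pre-check before a session. As a minimal sketch (the class and function names here are hypothetical, not part of any existing Cantilever tooling), a script can scan a page's HTML for images with no `alt` attribute, one of the most common screen-reader blockers. This kind of check supplements, and never replaces, feedback from real users:

```python
# Minimal sketch of an automated accessibility pre-check.
# It flags <img> tags that have no alt attribute at all; decorative
# images may legitimately carry an empty alt="", so those are not flagged.
from html.parser import HTMLParser


class AltTextCheck(HTMLParser):
    """Collects warnings for images that a screen reader cannot describe."""

    def __init__(self):
        super().__init__()
        self.warnings = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) tuples
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            src = attrs.get("src", "(no src)")
            self.warnings.append(f"img missing alt attribute: {src}")


def precheck(html: str) -> list[str]:
    """Return a list of human-readable warnings for the given HTML."""
    checker = AltTextCheck()
    checker.feed(html)
    return checker.warnings
```

For example, `precheck('<img src="hero.png">')` returns one warning, while markup whose images all carry alt text returns an empty list.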

What You Need

  • Information on how many tests we need, and what kinds of people we are looking for. We should aim to have people with a variety of levels of visual impairment, and ideally at least one user who uses an alternate input device, but just testing with visually impaired users will expose a lot of problems.
  • Something to test. We test everything from finished, established sites to early prototypes and sketches. Modify the steps of the test to suit the specific situation.
  • Test Prompts. These should be written by the PM or design team in advance of testing.
Orientation Script

Hi, [Subject].

My name is [Tester], and I’m going to be walking you through this session today.

Before we begin, I have some information for you, and I’m going to read it to make sure that I cover everything. You probably already have a good idea of why we asked you here, but let me go over it again briefly. We’re asking people to try using a Web site that we’re working on so we can see whether it works as intended, especially for people who [use screen readers/use alternate input devices/etc]. We want to see you use the site like you would use any website, try to perform some tasks, and give us your feedback. The session should take about forty-five minutes.

[If this is a prototype, not a real site, explain that now. Tell the user how "finished" the prototype is (does it have real content, etc.?). Tell them that you can guide them in the right direction.]

The first thing I want to make clear right away is that we’re testing the site, not you. You can’t do anything wrong here. In fact, this is probably the one place today where you don’t have to worry about making mistakes! As you use the site, I’m going to ask you as much as possible to try to think out loud: to say what you perceive the page to be saying, what you’re trying to do, and what you’re thinking. This will be a big help to us.

Also, please don’t worry that you’re going to hurt our feelings. We’re doing this to improve the site, so we need to hear your honest reactions. We are especially interested in getting your personal opinions about how you prefer websites to be organized and what makes you feel comfortable when you are using a site.

If you have any questions as we go along, just ask them. I may not be able to answer them right away, since we’re interested in how people do when they don’t have someone sitting next to them to help. But if you still have any questions when we’re done, I’ll try to answer them then. And if you need to take a break at any point, just let me know.

With your permission, we’re going to record what happens on the screen and our conversation. The recording will only be used to help us figure out how to improve the site, and it won’t be seen by anyone except the people working on this project. It also helps me, because I don’t have to take as many notes. Do we have your permission to record?

Steps

Construct a list of participants, working with the client or project manager, depending on who we are testing with.

Work with the client to determine what kind of reward to provide to testers. Sometimes we provide a $25 Amazon gift card. Sometimes the client thanks participants in another way.

For each participant, set up a test appointment as a Zoom call. Give them an overview of what the test will entail, and information on the reward. The appointment should be 45 minutes.
Run the live test (during the Zoom call).
  • Greet the tester and make sure they are ready to begin.
  • Make sure to record the session in Zoom.
  • Ask the tester to share their screen and navigate to google.com in a web browser.
  • Read the tester the Orientation Script (above).
  • Run through each step in the test script.
    • If they get stuck on a step and can’t figure out how to continue, give them some time to figure it out. If they really get stuck, help them find the right link/button to continue the exercise.
    • If they start doing lots of steps without talking to you about it, remind them to tell you what they perceive each time they use the page or hit a new page. Keep them verbalizing their actions.
  • If you start to run out of time, that’s fine, but don’t go over 45 minutes, in order to be respectful of the tester’s time.
Review the recording and take notes in the relevant team drive, separated by test prompt. Try to summarize what the user told you, and your sense of what they were thinking, into clear unified statements about how the user did during each prompt and what they perceived. Your notes should be, at most, two pages of text (in Google Drive), around 500 words.

  • Distribute the notes to the team.
  • Issue the appropriate reward to the user.