Perform a Live Usability Test

Overview

User testing is a practice that should be at the heart of any Cantilever design process. In the modern era we can reach testers from around the world instantly. It doesn’t have to cost much money or time, and it can reveal critical problems before our work sees the light of day. In UX one can never truly know that a certain solution will work. Hard quantitative evidence is critical, but so is subjective, high-touch feedback from real users. We are as interested in what people think of the page – or what they think the page actually says – as what they do on the page.

We use two modes of user testing. One is Live User Testing, which is what this procedure covers. The other is Service-Based User Testing, in which we use a service like UserTesting.com to run and record the tests. Live User Testing has the advantage that we actually meet the person and get to hear more about their inner thoughts while using the site, but it takes longer to coordinate and run the tests.

Our user testing approach is based on the work of Steve Krug, a usability expert, author, and speaker. He wrote the seminal Don’t Make Me Think, which we suggest anyone involved in Cantilever design take the time to read.

His approach to user testing is decidedly non-scientific:

  • Get random people to look at your site (or app, or comps, or wireframes, or even pencil sketches).
  • Ask them what they see. When they respond, keep asking for clarification and insight into how they perceive the page.
  • Ask them to perform certain actions (or tell you what they would click to perform a given action). Ask them what they expect will happen.

Read the book. While the book is in the mail, watch these videos. 😀 Here is a demo test:

Here is a lecture by Steve in which he does a demo test, but the video doesn’t work out. Skip that part, but watch the rest:

Since the tests are recorded, you don’t need to take notes during them and can focus on getting the user talking as much as possible. After the test, you can take notes on the recording. From those notes, construct a set of findings for the design team to consider. Don’t worry about making specific recommendations; focus on outlining the current state of things as clearly as you can.

During the test, focus on keeping the user talking. The more they tell us about the experience from their perspective, the more we will learn. This is a great rundown of questions that are helpful to ask: sensible.com/downloads/test-script-web.pdf

What You Need

  • Information on how many tests we need, and what kinds of people we are looking for. We like to use a blend of people who understand the subject matter of the website well, and total strangers with no specific knowledge. We do not like to use company employees or people who understand web design principles in general.
  • Something to test. We test everything from finished, established sites to early prototypes and sketches. Modify the steps of the test to suit the specific situation.
  • Test Prompts. These should be written by the PM or design team in advance of testing.

Orientation Script

Hi, [Subject].

My name is [Tester], and I’m going to be walking you through this session today.

Before we begin, I have some information for you, and I’m going to read it to make sure that I cover everything. You probably already have a good idea of why we asked you here, but let me go over it again briefly. We’re asking people to try using a Web site that we’re working on so we can see whether it works as intended. The session should take about half an hour.

[If this is a prototype, not a real site, explain that now. Tell the user how "finished" the prototype is (does it have real colors, imagery, fonts, etc.?). Tell them that they might try to click parts of the prototype which are not yet linked, and that’s OK. You can guide them in the right direction.]

The first thing I want to make clear right away is that we’re testing the site, not you. You can’t do anything wrong here. In fact, this is probably the one place today where you don’t have to worry about making mistakes! As you use the site, I’m going to ask you as much as possible to try to think out loud: to say what you’re looking at, what you’re trying to do, and what you’re thinking. This will be a big help to us.

Also, please don’t worry that you’re going to hurt our feelings. We’re doing this to improve the site, so we need to hear your honest reactions.

If you have any questions as we go along, just ask them. I may not be able to answer them right away, since we’re interested in how people do when they don’t have someone sitting next to them to help. But if you still have any questions when we’re done I’ll try to answer them then. And if you need to take a break at any point, just let me know.

You may have noticed we are recording this call. With your permission, we’re going to record what happens on the screen and our conversation. The recording will only be used to help us figure out how to improve the site, and it won’t be seen by anyone except the people working on this project. And it helps me, because I don’t have to take as many notes. Do we have your permission to record?

Steps

  • Construct a list of participants, working with the client or project manager, depending on who we are testing with.
  • Work with the client to determine what kind of reward to provide to testers. Sometimes we provide a $25 Amazon gift card. Sometimes the client thanks participants in another way.
  • For each participant, set up a test appointment as a Zoom call. Give them an overview of what the test will entail, and information on the reward. The appointment should be 30 minutes.
  • Run the live test (during the Zoom call).
    • Greet the tester and make sure they are ready to begin.
    • Make sure to record the session in Zoom.
    • Ask the tester to share their screen and navigate to google.com in a web browser.
    • Read the tester the Orientation Script (above).
    • Run through each step in the test script.
      • If they get stuck on a step and can’t figure out how to continue, give them some time to work it out. If they are truly stuck, help them find the right link/button to continue the exercise.
      • If they start moving through lots of steps without talking to you, remind them to tell you what they see each time they scroll around the page or land on a new page. Keep them verbalizing their actions.
    • If you start to run out of time, no problem. Don’t go over 30 minutes, in order to be respectful of the tester’s time.
  • Review the recording and take notes in the relevant team drive, separated by test prompt. Try to summarize what the user told you, and your overall sense of what they were thinking, into clear, unified statements about how the user did during each prompt and what they perceived. Your notes should be at most two pages of text (in Google Drive), around 500 words.
  • Distribute the notes to the team.
  • Issue the appropriate reward to the user.