Usability Testing: How to Get Useful Results

I recently posted on the difference between Focus Groups and Usability Tests. Today, I’ll outline some tips to ensure your Usability Testing sessions reveal valuable insights.

Put users at ease

It’s simply not natural to work at a computer while a stranger sits beside you asking questions. So making subjects feel comfortable is challenging. But it’s critical: Flustered or self-conscious users don’t make good subjects!

Open by explaining how the tests will work. Steve Krug has a very good introductory script you can use.* The most important points to communicate are:

  • We’re testing a website to see if it works as intended.
  • We’re not testing you. There are no wrong answers.
  • Try to think out loud as you work through the tasks.
  • We need your honest responses. Don’t worry about hurting our feelings.
You’ll also want to explain some logistics, like what’s being recorded, who’ll see the recordings, and what they’ll be used for.

Other pointers include:

  • Dress and act casually.
  • Be friendly and relaxed, not authoritarian or robotic.
  • A little levity can cut the tension. Crack the odd joke if it’s in your nature.
  • Start with some easy warm-up questions, to get subjects talking. Then once the actual tests begin, start with an easy task to build subject confidence.
  • Unless the client insists on video of the subject’s face, eliminate the camera. It can make users nervous.
  • If a user can’t perform a task, ensure they know it’s the website’s fault, not theirs.
  • Remain positive throughout. Don’t show frustration if users can’t complete simple tasks or if they ask “stupid” questions. Always act like everything’s going well.

Remain unbiased

Your goal should be to find usability problems, not to prove that a site has poor (or great) usability.

Be careful how you word your tasks. Don’t give subtle hints on how to complete them, or put words in the subjects’ mouths.

If subjects ask questions, the correct response is usually “What do you think?” or “Just do what you’d do if you were at home, on your own.”

For more, see my previous post, “User Testing of Websites: Leave Your Agenda at Home”.

Go through your pre-defined tasks

Usually, you’ll start with the Home page. Remember that you’re not looking for subjects’ opinions.

For example, it’s tempting to ask about “first impressions”. But this question usually elicits comments about fonts, colors or the company logo… and this isn’t terribly useful information.

A better question is, “What does this company do?” Then you’ll find out whether the site quickly communicates its purpose. That’s useful information.

Next, move on to your specific, pre-written tasks. Read each task to the subject, and give them a printed copy. Proceed to the next task when:

  • They’re done (or they’ve clearly failed)
  • They’re getting uncomfortably anxious, nervous, or embarrassed
  • They’ve spent too much time on the task, or
  • There’s nothing further to be learned

Be prepared to offer the occasional, polite “course correction” if subjects stray off topic or get hopelessly lost. Testing time is valuable; don’t waste it on irrelevant matters.

Watch carefully, ask questions

What users DO is much more important than what they say. (Be skeptical when users tell you they’d like new features, etc.)

If subjects aren’t thinking out loud, remind them to do so. Continually ask them what they’re thinking, how they’re feeling, what they’re looking for or expecting.

Now and then, it’s okay to stop the user and ask about their expectations. “What do you think will happen next?” But don’t interrupt the flow. If it’s a detailed question, jot it down and ask it after they’ve done the task.

At the end of each task, follow up if necessary. Figure out why users did what they did, and where the confusion lies.

Be flexible

Though you should plot out tasks in advance, don’t obsess over getting through all of them. If an unexpected opportunity presents itself, follow it.

Remember, this is qualitative research: You’re looking for insights, you’re not out to gather performance statistics or to prove anything. So don’t worry about scientific accuracy.

If it feels right, go ahead and change test protocols midstream. As you proceed from subject to subject, alter tasks, skip tasks, add new ones… be engaged in the process. You’ll learn more.

* Steve Krug recommends actually reading the instructions word-for-word to ensure nothing gets missed or mis-communicated. But I find it awkward reading from a script. (It feels like I’m “reading subjects their rights” when I should in fact be trying to make them feel relaxed.) I use bullet points to ensure I cover everything, but stay away from a script. Your mileage may vary.

CP Marketing
