Traditionally, one of the big issues people have with usability testing is the large investment of time and money to conduct a proper study. Unmoderated (or remote) usability testing offers an alternative method that is cheaper and easier to run.
Jeff Sauro over at MeasuringUsability.com outlines 10 important things to know about remote usability testing in his most recent blog post.
Here’s a quick breakdown of his post:
1. It’s growing in popularity. In a survey of user experience professionals, 23% of respondents now use unmoderated testing, up 28% from 2009.
2. Recruiting is much easier. Panel companies (like Cint!) make it simpler for companies to find qualified panelists.
3. It combines a survey and a usability study. Tasks and traditional survey questions help confirm or reject our hypotheses about our customers.
4. Many more metrics. Enough usability testing metrics are available now to make you the Nate Silver of your industry.
5. User video simulates the lab pretty well. You can observe panelists much as you would in a lab.
6. Setup is much faster. In one comparative usability evaluation, setting up unmoderated sessions took about half as long as setting up moderated ones.
7. It’s more efficient than being in the lab. In the same evaluation, the unmoderated testing team was able to collect data from 26 times as many users as the lab-based team.
8. The data is very comparable to lab data. MeasuringUsability.com found that overall ease, task completion, and task-level difficulty were similar to lab results. It will never be exactly the same as face-to-face testing, but it gets pretty close.
9. Task completion can (and needs to) be verified. You can validate whether a user completed a task by a) asking them a question that can only be answered if the task was completed, or b) setting up a trackable URL that shows the user completed the task.
10. More users = more statistical precision. Since it’s easier and faster to test more users, the larger sample size can help you detect smaller differences and get more statistically significant results.
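The trackable-URL idea from point 9 can be sketched in a few lines. This is a hypothetical illustration, not Sauro's implementation: give each participant a per-session link to the page that only appears on task completion, so a hit on that URL confirms success. The `example.com` address and `pid` parameter are made-up names.

```python
import secrets
from urllib.parse import urlencode

# Hypothetical confirmation page shown only after the task is completed
COMPLETION_URL = "https://example.com/order-confirmed"

def completion_link(participant_id: str) -> str:
    """Build a per-participant URL whose visit verifies task completion."""
    token = secrets.token_urlsafe(8)  # unguessable marker tied to this session
    query = urlencode({"pid": participant_id, "token": token})
    return f"{COMPLETION_URL}?{query}"

print(completion_link("P017"))
```

Logging requests to that URL (and matching the token back to the participant) then gives you a completion record that doesn't rely on self-report.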
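The precision claim in point 10 follows from basic sampling math: for a completion-rate estimate, the margin of error shrinks with the square root of the sample size. A minimal sketch, assuming a simple normal-approximation interval and a hypothetical 80% completion rate:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for an observed completion rate p with n users."""
    return z * math.sqrt(p * (1 - p) / n)

# Quadrupling the sample size halves the margin of error
for n in (10, 40, 160):
    print(f"n={n:3d}: completion rate 80% +/- {margin_of_error(0.8, n):.3f}")
```

So a difference between two designs that is invisible with 10 users per condition can become clearly detectable with a remote panel of a few hundred.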
Anything to add? Comment below!