Usability Case Study: iPad vs PC

Wondering how your website performs for users on their latest gadget? Try split-testing your site with participants on different platforms to find out. Testing can help you uncover how the user experience on a desktop or laptop might differ from the user experience on a smartphone or iPad.

To demonstrate this, here’s a little test we ran on 100 participants.

Some participants completed our test using a PC and others an iPad. The results help us see how well the site supported their online experience and just how doable these tasks are on different platforms.

We didn’t have any pre-conceived ideas as to how browsing and completing tasks might differ on the two platforms. We did, however, think it’d be really interesting to see the results, so that in turn we could demonstrate a great way to test, analyse and understand user behaviour. All this, so that you can tune, tweak, and better the online experience you offer.

Our Study

We ran identical unmoderated online studies on the Apple website for participants using either an iPad or a PC. In each study there were three tasks and a few follow-up questions.

Loop11 also allows us to validate whether iPad participants were actually using iPads or not. Were there any who weren’t? Well, you might be surprised to hear that a handful of PC participants signed up for the iPad test! Yes, you know who you are.

Don’t worry – you won’t be finding tarnished results in this case study. We threw out the invalid tests.

You can run through the test yourself, here, before reviewing the results. But, of course, our study has since been closed, so your results won’t be counted.

In total, 100 participants completed the test, 50 in each group. Here are the results.

Task 1: Free Surf

“You are thinking about purchasing an iPad. You arrive on the Apple website and want to learn about what the iPad can do and whether it’s a product you might buy. You’re not interested in watching videos, but freely surf the Apple website.”

The first task simply required participants to freely surf the Apple website. Participants were allowed to check out what they could about the iPad. When they felt they’d found sufficient information, they simply had to let us know by marking the task as complete.

This kind of task helps us understand how a browsing experience might differ on these two devices.
The results are rather interesting: iPad users spent longer than PC users freely surfing the Apple website. On average, iPad participants clocked in at 98 seconds, while PC participants came in ahead at 86 seconds.
Remember, iPad participants already own the iPad. The longer time spent on browsing for iPad users does hint at fairly significant browsing differences on the two devices.

Task 2: Shop!

“Fast forward 4 weeks and you’ve gone and bought an iPad! But you now realise you’re going to need a case to protect it. Find a case and put it in your shopping cart. You’re not going to buy it right now, though. Hit ‘Task Complete’ when it’s in your shopping cart.”

The second task required participants to achieve a defined goal: Find a protective case for an iPad and add it to the shopping cart. This task helps us understand the usability and presentation of functions on a website as it’s translated onto the different devices.

Task completion results for iPad and PC users were identical, with both groups boasting a 100% completion rate. A success! iPad participants, however, took 53% longer to complete the task: at an average of 135 seconds, they were well behind PC participants, who hit the mark at 89 seconds.
What’s more, PC users even had to visit one extra page on average – 5 pages on the PC versus 4 on the iPad – to complete the task.

Task 3: iPad battery life

“For future reference, you want to know how long the battery will last before you need to re-charge it. Where would you find this information?”

The final task also presented a clearly defined goal: find details on the iPad’s battery life. We knew on which pages the information was available – there were at least two – and we wanted to see how users would go about finding it and how long it would take them.

Given that this task was a little more involved than the previous ones, completion rates were consequently lower. Still, we admit, they remain impressive by any standard.

iPad users had greater success, with 92% completing the task correctly, while PC participants dropped to a 90% completion rate. Once again iPad users took significantly longer, averaging 56 seconds while PC users averaged 38 seconds. That’s 47% longer for the iPad users, despite the same average number of page views.

Overview

Adapting websites to fit the needs of users who come from different machines can be a head-walloping task.

Our software aims to help you realise small steps towards great improvements.

To improve it, though, you must first document it. And Loop11 documents user-experience so that it can be improved.

Simplified designs, variations in page layout, consistencies of functionality across all elements, or adaptations on page flow and size—there are tons of ways web professionals work to iron out user-experience on all devices.

But don’t just guess how user-experience might differ. When transitioning your website for a gadget-friendly version, find out where and how users are stumbling so that you can understand why. In this process, usability testing is often the missing lynchpin.

New Website, New Pricing

Welcome to the new Loop11 website. As part of recent and on-going improvements to Loop11, we’ve given the Loop11 website a complete re-design. The new website looks better than ever and is easier to use. It also includes a new introduction video on our homepage. If you haven’t already seen it, check it out.

As part of the new website launch, we have also added a new “Bulk Buy” pricing model. We have received many requests from our members about bulk buying and licensing options, so we’ve listened and created additional pricing options.

Here they are:

If you buy this many credits…      You’ll get this many FREE
Up to 5                            None
5 to 10                            1
10 to 20                           3
20 to 30                           8
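For the curious, the tiers above can be sketched as a small function. Note that the published tiers overlap at their boundaries (exactly 5, 10 or 20 credits could fall in either tier), so the boundary handling here is our assumption, not Loop11’s official rule:

```python
def free_credits(purchased: int) -> int:
    """Free credits earned for a bulk purchase, per the table above.

    How exact boundary values (5, 10, 20) are resolved is our guess,
    since the published tiers overlap at the edges.
    """
    if purchased <= 5:
        return 0
    elif purchased <= 10:
        return 1
    elif purchased <= 20:
        return 3
    else:
        return 8

# e.g. buying 15 credits would earn 3 free ones
```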

If you think you’ll be running more than 30 projects any time soon – particularly usability benchmarking and tracking studies – you can get an annual licence for just $14,900.

Single credits still cost US$350. The new pricing info page can be found here: http://www.loop11.com/pricing/

Keep checking back for new Loop11 updates and news.

Happy Testing!

Which City Council has the best website?

Council websites provide important information on an extensive range of services and information for residents, from pet information to waste collection times and many other community service programs. It is important for council websites to be user friendly and easy to use to enable residents to find the information they need easily.

In this independent study, 600 random participants were asked to complete a simple task on 6 council websites, 100 participants per website:

“You are a new resident in [City Name] and need to find out what day your household waste will be collected for disposal. Find the page of the website with this information.”

Overall, the Darwin City Council website had the highest usability score with Perth City Council scoring the lowest.

View the full detailed report for task times, clicks, completion rates, satisfaction and more. This case study was conducted using Loop11, and participants were sourced from Mechanical Turk.

Recruiting Participants for Unmoderated, Remote User Testing

Unmoderated, remote user testing tools such as Loop11 provide the ability to target a large number of participants (up to 1,000 with Loop11). However, recruiting and incentivizing participants to take part in remote, unmoderated testing can be more challenging than traditional moderated, lab-based testing, particularly if one hasn’t done it before.

A recent article on UXmatters.com breaks down and assesses the pros and cons of each method for recruiting participants for remote and unmoderated testing. The full article can be viewed here.

Happy Testing!

New Feature: Extra Participant Filtering

Last month we added participant filtering to reporting which specifically allowed you to include (or exclude) participants who had not fully completed your user tests.  Today we’ve added to the filtering options to allow you to exclude participants who don’t meet certain quality thresholds.

You can now create thresholds based on the number of seconds they spend and/or the number of clicks they take to complete individual tasks.

Many of you have noted that when you go through your reports there are often numerous participants who do not even leave the home page when attempting a task, but still claim to have ‘completed’ it.

Now, it’s easy to exclude from your reporting these participants who don’t make a proper attempt at completing your user tests.

Similarly, participants who go off to make a cup of tea in the middle of your evaluation can be excluded just as easily so you don’t end up with skewed results.

These filtering options will now give your results much greater accuracy and reliability.
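As a rough illustration of the idea, here is a minimal sketch of this kind of quality-threshold filtering. The record structure and the threshold values are our own illustrative assumptions, not Loop11’s actual data model or defaults:

```python
# Keep a participant only if every task attempt falls inside a
# plausible time window and clears a minimum click count.
def meets_thresholds(participant, min_seconds=5, max_seconds=600, min_clicks=1):
    return all(
        min_seconds <= task["seconds"] <= max_seconds
        and task["clicks"] >= min_clicks
        for task in participant["tasks"]
    )

participants = [
    {"id": 1, "tasks": [{"seconds": 42, "clicks": 6}]},
    {"id": 2, "tasks": [{"seconds": 2, "clicks": 0}]},     # never left the home page
    {"id": 3, "tasks": [{"seconds": 1800, "clicks": 3}]},  # went off to make a cup of tea
]
kept = [p for p in participants if meets_thresholds(p)]
# only participant 1 survives the filter
```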

Stay tuned…and happy testing!

Don’t forget to become a Loop11 fan on Facebook and Twitter

Usability Testing On The iPad

There has been a lot of negative commentary on the usability of the iPad since it launched, including Jakob Nielsen’s critique of its usability failings. However, what’s even more important than the usability of the iPad is the usability of YOUR website or app ON the iPad.

There are 5 million iPad users worldwide…and counting. Many of these people browse the internet on their iPads as well as use apps. It’s also worth mentioning that there are another 50 million iPhone users around the world. So the question is: how usable is your website or app on the iPad?

You can test the usability of any website on the iPad because Loop11 works beautifully with the iPad – it always has, actually. Loop11 can be used on an iPad the same way it works on a computer. Since there is no software for participants to download and no JavaScript code for you to insert, a usability project can be created for the iPad in just a few minutes. We even made a video to prove it!

…and just so you know, you can do it all on an iPhone, as well.

Want to be part of a world first usability study? Here’s your chance.

We created a usability study to see whether there are any differences in behaviour when using the internet on an iPad versus a normal computer. We’d love you to take part. We’ll be publishing the results on our blog soon.

If you have an iPad and would like to be a part of the evaluation, use this link:

iPad users only

If you don’t have an iPad but would like to prove that using a normal computer is waaaay better for surfing the internet than an iPad, go here:

Computer Users Only

Happy testing!

BECU (Boeing Employees’) Credit Union Website

We often sneak a quick look at the projects our members are running and we came across this beauty by the clever guys at the interactive agency Possible, who did some re-design work on the BECU (Boeing Employees’ Credit Union) website (www.becu.org).

Their project was designed to give them the answer to just one question, “What do we call the place where our website visitors go to do their…you know…thingamybob banking that you do when you don’t do it at one of the BECU Neighborhood Financial Centers?”

A very simple project was constructed to get them precisely that one answer. Here’s what they did…

Five different hi-fidelity wireframe designs of the BECU homepage were created and hosted on a staging server. Three designs were identical except for the label to this unnamed section of the website. One placed the link in the primary navigation, rather than the utility navigation at the top, and the final design placed the link in the footer.

The alternative labels for the section were:

  • Remote Account
  • Access Mobile & Online Banking
  • Remote Banking
  • Online Banking (for the link in the footer)

Five separate projects were set up so there was no task order bias and the task was worded the same way for each project:

“Use the site to find how to view your banking information using your internet-enabled cell phone.”

During the evaluation, when participants selected a link that did not direct them to the ‘correct’ location, they were presented with the screen below. This was particularly handy, as navigating through a wireframed website where only the home page had any functional purpose could have been disastrously confusing for participants.

Four follow-up questions were asked, again always worded the same way for each project. They were:

  1. How difficult was it to complete this task?
  2. How certain did you feel that the “[Name of the section]” link would take you to the information you were looking for?
  3. What specific information would you expect to see when you click on “[Name of the section]”?
  4. What, if anything, would be a better name or label for this information?

All in all, a quick, simple and inexpensive way of pinning down a precise link label for a section of their website that is growing in importance.

As for the results of the testing…well, go to www.becu.org to find out which option provided the best experience.

New Feature: Participant Filtering

Today we’re unveiling an exciting new feature to help add more value to your project reporting – participant filtering.

When we first launched Loop11 we made the decision to only show the results of participants who fully completed user tests in the reporting. We had a strong belief (and still do) that participants who do not fully complete user tests are not giving your tasks and questions their best efforts. So we excluded them from reporting altogether.

Since launching we’ve had many, many, many requests from our members to make the results of partially completed user tests available. We listened, and so that’s what we’ve done.

Here’s a bit of a tutorial of how it works:

  •   When you log into your account you will now find in your list of Launched Projects a column titled ‘Participants Start/Finish’ (as shown below). This indicates the total number of participants who commenced your user test, followed by the total number who completed it.

  •   By default, your reporting will only show you the results of participants who fully complete your user tests. But if you want to see the results of all participants up until the point they dropped out of your user test, you can now go to Settings, tick the box under ‘Include participants in the reporting that have partially completed the user test’, then click Save Settings.

  •   In your reporting you’ll now see an indication of the total number of participants who completed each task. The earlier tasks will always have more, as participants drop out of your project.
  •   On the right hand side, we’ve included a Settings Indicator that will always tell you whether you are looking at all participants or completed participants so you don’t have to remember.
  •   NOTE: If you set a quota of 100 participants, for example, your project will still only close after 100 FULLY COMPLETED user tests have been collected.

  •   When you’re analysing participants individually, you’ll also be told which participants were guilty of not fully completing your user test.

In the next couple of weeks we’ll be adding a few more useful features to Settings so you can clean up your data by excluding the results of participants who fall outside certain time-based and click-based thresholds that you can customise.

Stay tuned…and happy testing!
