Usability Case Study: iPad vs PC

Wondering how your website performs for users on their latest gadget? Try split-testing your site on participants who use different web-worthy platforms to find out. Testing can help you uncover how user-experience on a desktop or laptop might differ from user-experience on a smartphone or iPad.

To demonstrate this, here’s a little test we ran on 100 participants.

Some participants completed our test using a PC and others with an iPad. The results show how the site facilitated each group’s online experience, and just how doable these tasks are on different platforms.

We didn’t have any pre-conceived ideas as to how browsing and completing tasks might differ on the two platforms. We did, however, think it’d be really interesting to see the results, so that in turn we could demonstrate a great way to test, analyse and understand user behaviour. All this, so that you can tune, tweak, and improve the online experience you offer.

Our Study

We ran identical unmoderated online studies on the Apple website for participants using either an iPad or a PC. In each study there were three tasks and a few follow-up questions.

Loop11 also allows us to validate whether iPad participants were actually using iPads. Were there any who weren’t? Well, you might be surprised to hear that a handful of PC participants signed up for the iPad test! Yes, you know who you are.

Don’t worry – you won’t be finding tarnished results in this case study. We threw out the invalid tests.

You can run through the test yourself, here, before reviewing the results. But, of course, our study has since been closed, so your results won’t be counted.

In total, 100 participants completed the test, 50 in each group. Here are the results.

Task 1: Free Surf

“You are thinking about purchasing an iPad. You arrive on the Apple website and want to learn about what the iPad can do and whether it’s a product you might buy. You’re not interested in watching videos, but freely surf the Apple website.”

The first task simply required participants to freely surf the Apple website. Participants could explore whatever they liked about the iPad. When they felt they’d found sufficient information, they simply had to let us know by marking the task as complete.

This kind of task helps us understand how a browsing experience might differ on these two devices.
The results are rather interesting: iPad users spent longer than PC users on the free-surf task. On average, iPad participants clocked in at 98 seconds, with PC participants coming in ahead at 86 seconds.
Remember, the iPad participants already owned iPads. The longer time they spent browsing hints at fairly significant browsing differences between the two devices.

Task 2: Shop!

“Fast forward 4 weeks and you’ve gone and bought an iPad! But you now realise you’re going to need a case to protect it. Find a case and put it in your shopping cart. You’re not going to buy it right now, though. Hit ‘Task Complete’ when it’s in your shopping cart.”

The second task required participants to achieve a defined goal: find a protective case for an iPad and add it to the shopping cart. This task helps us understand how the usability and presentation of a website’s functions translate onto different devices.

Task completion results for iPad and PC users were identical, with both groups boasting a 100% completion rate. A success! iPad participants, however, took 53% longer to complete the task: an average of 135 seconds, against PC participants who hit the mark at 89 seconds.
What’s more, PC users actually visited one extra page on average – 5 pages on the PC versus 4 on the iPad – yet still finished faster.

Task 3: iPad battery life

“For future reference, you want to know how long the battery will last before you need to re-charge it. Where would you find this information?”

The final task also presented a clearly defined goal: find details on the iPad’s battery life. We knew which pages held the information – there were at least two – and we wanted to see how users would go about finding it and how long it would take them.

Given that this task was a little more involved than the previous ones, completion rates were consequently lower. But we admit, the completion rates are still impressive by any standard.

iPad users had greater success, with 92% completing the task correctly, while PC participants dropped to a 90% completion rate. Once again iPad users took significantly longer, averaging 56 seconds while PC users averaged 38 seconds. That’s 47% longer for the iPad users, despite the same average number of page views.
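For the curious, here is a quick sketch of how the percentage differences quoted above can be derived from the average task times. It uses the rounded per-task averages from this post (the figures quoted in the article may reflect unrounded underlying data, so results can differ by a point or so):

```python
# Rounded average task times (seconds) as reported in the case study.
tasks = {
    "free surf": {"ipad": 98, "pc": 86},
    "shop":      {"ipad": 135, "pc": 89},
    "battery":   {"ipad": 56, "pc": 38},
}

def pct_longer(ipad_secs, pc_secs):
    """How much longer (in percent) iPad users took relative to PC users."""
    return round(100 * (ipad_secs - pc_secs) / pc_secs)

for name, t in tasks.items():
    print(f"{name}: iPad users took ~{pct_longer(t['ipad'], t['pc'])}% longer")
```

Running this on the rounded averages gives roughly 14%, 52% and 47% for the three tasks.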

Overview

Adapting websites to fit the needs of users on different devices can be a head-walloping task.

Our software aims to help you realise small steps towards great improvements.

To improve it, though, you must first document it. And Loop11 documents user-experience so that it can be improved.

Simplified designs, variations in page layout, consistent functionality across all elements, or adaptations to page flow and size – there are tons of ways web professionals work to iron out user experience across devices.

But don’t just guess how the user experience might differ. When adapting your website into a gadget-friendly version, find out where and how users are stumbling so that you can understand why. In this process, usability testing is often the missing lynchpin.

5 Responses to “Usability Case Study: iPad vs PC”

  1. Two criticisms of the methodology:
    1) you have used the Apple website for the study, even though one group is self-selected as interested in Apple, so there is a strong possibility that their browsing time will be affected by their existing interest or knowledge (for example, they might hang around and read more about Apple stuff). A neutral website would have been more appropriate.
    2) as you have only tested this on one website, your test may be affected by the design of that particular site. To ensure that you are actually testing the device, a range of different websites should be used.

  2. Interesting. Did you take into account differences in the level of experience of the participants? I can imagine people owning an iPad are on top of the technology food chain, and maybe this group was more experienced than the PC users.

  3. Mark says:

    Just a thought – Did you take into account the different processor speeds on a PC vs the iPad? A fast PC will load and render a page quicker, which could add up over a few pages during a task.

    If this wasn’t a factor, then it would be interesting to know where the iPad users slowed down in comparison to their PC counterparts? What navigation/page elements took longer to use? Or was the screen size a factor when looking for links/information?

  4. David Travis says:

    I think that @allthebuttons criticism is spot on. What if you’d asked participants to visit microsoft.com to troubleshoot problems with Windows 7? (I’m not saying you should, of course, as this task is equally biased — just making the point).

  5. tbiddle says:

    Allthebuttons/Stefan Wobben/David Travis, your criticisms are completely valid.

    To run a scientifically valid and robust research exercise a more rigorous selection process for the participants would most certainly need to be done. We simply posted an article on our blog and people kindly responded. It was never our intention to put together a rock-solid methodology. Nevertheless, the exercise highlights some of the issues people might need to consider when developing websites that they expect to be used on different platforms and devices.

    This was really the primary intention of the study, as well as to demonstrate that online usability testing on an iPad is something that can now easily be done and should be considered, where appropriate, by those doing usability testing. No longer is website usability testing on a PC the only thing to be concerned about.