When two great minds get together!

Wireframe design and testing just got a whole lot easier…

Testing the usability of wireframes has always been a great Loop11 feature. However, Loop11 is not a wireframe design tool. Justinmind, on the other hand, is a great wireframe design tool, but it didn’t offer comprehensive wireframe usability testing. So we put two and two together and integrated Justinmind with Loop11.

Now you can create your own wireframes with Justinmind and test their usability with Loop11. It’s as simple as a click of the mouse!

How does it work?

Justinmind is a wireframe creation and design tool. It also allows you to export your wireframes to HTML so they can be tested.

This is where Loop11 comes in.

Once you have created your wireframes with the Justinmind wireframe creation tool, simply use Justinmind’s Usernote feature, where you can export your wireframes to HTML so that they can be tested.

This is where you will find the Loop11 integration… and we all know how simple Loop11 is to use.

So if you’re looking for a simple way to design and test wireframes, look no further.

Happy testing!

Airline Website Usability: British Airways Soars Ahead!

We thought we would have a look at how user friendly 10 of the world’s leading airline websites are. On a recent overseas trip, I was astonished to see how many people continue to take dangerous or banned items, such as scissors and cigarette lighters through the check-in gates at airports. Since security has become radically tougher in recent years we thought we’d explore how easy (or difficult!) it is to find information about the items you’re not supposed to have in your luggage… So we tested the usability of the ten following airline websites:

The following task was asked of 1,000 participants (100 per website):

“You are taking an overseas holiday next month. Before you go you want to check whether certain items are considered by the airline to be dangerous or banned. Using the website, how can you do this?”

Our participants were sourced from a number of channels, including our Twitter and Facebook accounts, but the vast majority came from Mechanical Turk, where we paid a nominal sum of $30 for the bulk of the participants. Thanks to all those who got involved.

Task Completion Rates:

In general, each website had one page dedicated to banned or restricted items, such as these pages from American Airlines and British Airways.

American Airlines Restricted Items Page.

British Airways Banned Items Page

If participants found the appropriate page, they were deemed to have completed the task successfully; otherwise they either failed the task or abandoned it when it all became too hard.
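The completion rates reported below are simply each outcome’s share of the participant pool. As a minimal sketch (the sample counts here are illustrative, not the study’s raw data):

```python
from collections import Counter

def outcome_rates(outcomes):
    """Return each outcome's share of participants as a whole-number percentage."""
    counts = Counter(outcomes)
    total = len(outcomes)
    return {outcome: round(100 * n / total) for outcome, n in counts.items()}

# Hypothetical sample of 100 participants: 71 successes, 18 fails, 11 abandonments
sample = ["success"] * 71 + ["fail"] * 18 + ["abandon"] * 11
print(outcome_rates(sample))  # {'success': 71, 'fail': 18, 'abandon': 11}
```

With 100 participants per website, each participant conveniently moves the rate by one percentage point.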

The results indicate that finding information on dangerous and banned items is rather difficult.  This perhaps provides some clues as to why so many people on my recent trip were still packing them in their luggage.

Chart showing the task completion rates

The British Airways website was the standout performer with 71% of participants completing the task successfully.  Of most concern were Virgin Atlantic and Malaysia Airlines where less than half of participants were able to locate the information.  In the case of Malaysia Airlines, just 31% of participants were able to complete the task.

Additionally, a total of 39% of participants abandoned the task on the Malaysia Airlines website even though more than half went directly from the home page to the Baggage Information landing page where they should have easily found the information.  It would seem the call to action to “Download now” is not sufficient to indicate the PDF document on the Baggage Information landing page is the place to go for this information.

Malaysia Airlines Webpage.

Average Time to Complete Task:

The average time taken to complete the task on each of the ten websites again shows that the British Airways website was the standout performer, with participants completing the task in an average time of 87 seconds.  Malaysia Airlines and Virgin Atlantic once again performed poorly, with the average time for Virgin Atlantic (199 seconds) being more than twice the time taken for those using the British Airways website.

Chart showing the average time to complete the task.

The study also revealed that only Virgin Atlantic and Lufthansa did not have fly-out menus in their main navigation.  Fly-out menus, such as those shown on the American Airlines website below, often result in faster navigation, since users can see at least the second-level navigation links without having to click.

American Airlines Webpage With Fly-out Menu.

A deeper look at the path analysis for Virgin Atlantic shows that only a quarter (24%) of participants went to the correct section of the website, the Passenger Information landing page, in the first instance.  This is a substantially lower result than for British Airways and even Malaysia Airlines, where more than half navigated to the correct section first.

Ease of Use Rating:

One of the follow-up questions asked participants, after completing the task, to rate on a 5-point scale how easy the website was to use.  There was much less variation in these results, which we don’t find surprising.  In face-to-face, lab-based user testing we frequently encounter participants who have a terrible time navigating a website but still comment on how easy the website was to use!  We always felt this was the moderator effect, but perhaps it extends to unmoderated user testing too!

Chart showing the ease of use rating

Overall Usability Score:

To directly compare the usability of one website to another we decided to follow the ISO definition of usability.  ISO 9241-11 defines usability as the “Extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.”  This gives us three areas to focus on: effectiveness, efficiency and satisfaction.

Combining the scores for the task completion rate (effectiveness), the average time taken to complete the task (efficiency) and the ease of use rating (satisfaction) we can establish an overall score for each of the ten airline websites, which are shown below.
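The article doesn’t publish the exact formula used to combine the three components, but one simple approach is to normalize each to a 0–1 scale and average them. In this sketch, the 300-second time cap and the 4.1 ease rating are illustrative assumptions; only the 71% completion rate and 87-second average come from the study:

```python
def usability_score(completion_rate, avg_seconds, ease_rating,
                    max_seconds=300, rating_scale=5):
    """Combine the three ISO 9241-11 components into a single 0-100 score.

    completion_rate: fraction of participants who completed the task (0-1)
    avg_seconds: average task time; faster is better, capped at max_seconds
    ease_rating: mean ease-of-use rating on a 1..rating_scale scale
    """
    effectiveness = completion_rate
    efficiency = max(0.0, 1 - avg_seconds / max_seconds)
    satisfaction = (ease_rating - 1) / (rating_scale - 1)
    return round(100 * (effectiveness + efficiency + satisfaction) / 3)

# British Airways figures from the study (71% completion, 87 seconds),
# with an assumed 4.1 mean ease-of-use rating.
print(usability_score(0.71, 87, 4.1))  # 73
```

Any weighting of the three components is a judgment call; equal weighting is just the simplest starting point.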

Chart showing the overall usability score.

Not surprisingly, British Airways was head and shoulders above the rest, while Malaysia Airlines and Virgin Atlantic were well behind.  There was little difference between the remaining seven websites.  But clearly there’s a lot more work to be done by airline websites to help people avoid packing those banned and dangerous items.

Tested with Loop11

Usability Case Study: iPad vs PC

Wondering how your website performs for users on their latest gadget? Try split-testing your site on participants who use different web-worthy platforms to find out. Testing can help you uncover how user-experience on a desktop or laptop might differ from user-experience on a smartphone or iPad.

To demonstrate this, here’s a little test we ran on 100 participants.

Some participants completed our test using a PC and others with an iPad. Our results help us see how the site facilitated their online experience, and just how doable these tasks are on different platforms.

We didn’t have any pre-conceived ideas as to how browsing and completing tasks might differ on the two platforms. We did, however, think it’d be really interesting to see the results, so that in turn we could demonstrate a great way to test, analyse and understand user behaviour. All this, so that you can tune, tweak, and better the online experience you offer.

Our Study

We ran identical unmoderated online studies on the Apple website for participants using either an iPad or a PC. In each study there were three tasks and a few follow-up questions.

Loop11 also allows us to validate whether iPad participants were actually using iPads. Were there any who weren’t? Well, you might be surprised to hear that a handful of PC participants signed up for the iPad test! Yes, you know who you are.

Don’t worry – you won’t find tarnished results in this case study. We threw out the invalid tests.

You can run through the test yourself, here, before reviewing the results. But, of course, our study has since been closed, so your results won’t be counted.

In total, 100 participants completed the test, 50 in each group. Here are the results.

Task 1: Free Surf

“You are thinking about purchasing an iPad. You arrive on the Apple website and want to learn about what the iPad can do and whether it’s a product you might buy. You’re not interested in watching videos, but freely surf the Apple website.”

The first task simply required participants to freely surf the Apple website. Participants were allowed to check out what they could about the iPad. When they felt they’d found sufficient information, they simply had to let us know by marking the task as complete.

This kind of task helps us understand how the browsing experience might differ on the two devices.

The results are rather interesting: iPad users spent longer than PC users while freely surfing the Apple website. On average, iPad participants clocked in at 98 seconds, with PC participants coming in ahead at 86 seconds.

Remember, the iPad participants already own an iPad. The longer browsing time for iPad users does hint at fairly significant browsing differences between the two devices.

Task 2: Shop!

“Fast forward 4 weeks and you’ve gone and bought an iPad! But you now realise you’re going to need a case to protect it. Find a case and put it in your shopping cart. You’re not going to buy it right now, though. Hit ‘Task Complete’ when it’s in your shopping cart.”

The second task required participants to achieve a defined goal: Find a protective case for an iPad and add it to the shopping cart. This task helps us understand the usability and presentation of functions on a website as it’s translated onto the different devices.

Task completion results for iPad and PC users were identical, with both groups boasting a 100% completion rate. A success! iPad participants, however, took 53% longer to complete the task: at an average of 135 seconds, they were well behind PC participants, who hit the mark at 89 seconds.

What’s more, PC users visited one extra page on average – 5 pages on the PC versus 4 on the iPad – in completing the task.

Task 3: iPad battery life

“For future reference, you want to know how long the battery will last before you need to re-charge it. Where would you find this information?”

The final task also presented a clearly defined goal: find details on the iPad’s battery life. We knew which pages contained the information – there were at least two – and we wanted to see how users would go about finding it and how long it would take them.

Given that this task was a little more involved than the previous ones, completion rates were consequently lower. But we admit, the task completion rates are still impressive by any standard.

iPad users had greater success, with 92% completing the task correctly, while PC participants dropped to a 90% completion rate. Once again iPad users took significantly longer, averaging 56 seconds while PC users averaged 38 seconds. That’s 47% longer for iPad users, despite the same average number of page views.


Adapting websites to fit the needs of users who come from different machines can be a head-walloping task.

Our software aims to help you realise small steps towards great improvements.

To improve it, though, you must first document it. And Loop11 documents user-experience so that it can be improved.

Simplified designs, variations in page layout, consistent functionality across all elements, or adaptations to page flow and size—there are tons of ways web professionals work to iron out user-experience on all devices.

But don’t just guess how user-experience might differ. When transitioning your website for a gadget-friendly version, find out where and how users are stumbling so that you can understand why. In this process, usability testing is often the missing lynchpin.

New Website, New Pricing

Welcome to the new Loop11 website. As part of recent and on-going improvements to Loop11, we’ve given the Loop11 website a complete re-design. The new website looks better than ever and is easier to use. It also includes a new introduction video on our homepage. If you haven’t already seen it, check it out.

As part of the new website launch, we have also added a new “Bulk Buy” pricing model. We have received many requests from our members about bulk buying and licensing options, so we’ve listened and created additional pricing options.

Here they are:

If you buy this many credits…    You’ll get this many FREE
Up to 5                          None
5 to 10                          1
10 to 20                         3
20 to 30                         8

If you think you’ll be running more than 30 projects any time soon, particularly usability benchmarking and tracking studies, you can get an annual licence for just $14,900.

Single credits still cost USD $350. The new pricing info page can be found here: http://www.loop11.com/pricing/
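As a rough sketch of how the bulk-buy tiers might be applied in code (the published table’s tier boundaries overlap, so treating each range as exclusive of its lower bound is an assumption here, and the function names are illustrative):

```python
def free_credits(purchased):
    """Free credits per the bulk-buy tiers.

    Boundary handling is an assumption: the published table reads
    'Up to 5', '5 to 10', '10 to 20', '20 to 30', so each upper
    bound is treated as inclusive.
    """
    if purchased <= 5:
        return 0
    if purchased <= 10:
        return 1
    if purchased <= 20:
        return 3
    if purchased <= 30:
        return 8
    raise ValueError("Above 30 credits, consider the annual licence instead")

CREDIT_PRICE_USD = 350  # single-credit price from the pricing page

def order_summary(purchased):
    """Total credits received and cost for a given purchase size."""
    bonus = free_credits(purchased)
    return {"paid": purchased, "free": bonus,
            "total": purchased + bonus,
            "cost_usd": purchased * CREDIT_PRICE_USD}

print(order_summary(20))  # {'paid': 20, 'free': 3, 'total': 23, 'cost_usd': 7000}
```

So, for example, buying 20 credits at $350 each would cost $7,000 and yield 23 usable credits under this reading of the table.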

Keep checking back for new Loop11 updates and news.

Happy Testing!

Which City Council has the best website?

Council websites cover an extensive range of services and information for residents, from pet information to waste collection times and many other community service programs. It is important for these websites to be user friendly so that residents can easily find the information they need.

In this independent study, 600 randomly selected participants were asked to complete some simple tasks on the following 6 council websites:

The following task was asked of 600 participants (100 per website):

“You are a new resident in [City Name] and need to find out what day your household waste will be collected for disposal. Find the page of the website with this information.”

Overall, the Darwin City Council website had the highest usability score with Perth City Council scoring the lowest.

View the full detailed report for task times, clicks, completion rates, satisfaction and more. This case study was conducted using Loop11, and participants were sourced from Mechanical Turk.

Recruiting Participants for Unmoderated, Remote User Testing

Unmoderated, remote user testing tools, such as Loop11, provide the ability to target a large number of participants (up to 1,000 with Loop11). However, recruiting and incentivizing participants for remote, unmoderated testing can be more challenging than for traditional moderated, lab-based testing, particularly if one hasn’t done it before.

A recent article on UXmatters.com breaks down and assesses the pros and cons of each method for recruiting participants for remote and unmoderated testing. The full article can be viewed here.

Happy Testing!

New Feature: Extra Participant Filtering

Last month we added participant filtering to reporting which specifically allowed you to include (or exclude) participants who had not fully completed your user tests.  Today we’ve added to the filtering options to allow you to exclude participants who don’t meet certain quality thresholds.

You can now create thresholds based on the number of seconds they spend and/or the number of clicks they take to complete individual tasks.
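As a rough sketch of what such quality filtering looks like (the field names and threshold values here are illustrative, not Loop11’s actual data model):

```python
def passes_thresholds(participant, min_seconds=10, min_clicks=2):
    """Return True if every task in a participant's session meets the
    minimum time and click thresholds (values are illustrative)."""
    return all(task["seconds"] >= min_seconds and task["clicks"] >= min_clicks
               for task in participant["tasks"])

participants = [
    {"id": 1, "tasks": [{"seconds": 45, "clicks": 6}]},  # genuine attempt
    {"id": 2, "tasks": [{"seconds": 3, "clicks": 0}]},   # never left the home page
]

# Keep only participants who made a proper attempt at the tasks.
valid = [p for p in participants if passes_thresholds(p)]
print([p["id"] for p in valid])  # [1]
```

A maximum-time threshold could be added the same way to catch the tea-break participants mentioned below.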

Many of you have noted that when you go through your reports there are often numerous participants who do not even leave the home page when attempting a task, but still claim to have ‘completed’ it.

Now it’s easy to exclude participants who don’t make a proper attempt at completing your user tests from your reporting.

Similarly, participants who go off to make a cup of tea in the middle of your evaluation can be excluded just as easily so you don’t end up with skewed results.

These filtering options will now give your results much greater accuracy and reliability.

Stay tuned…and happy testing!

Don’t forget to become a Loop11 fan on Facebook and Twitter

Usability Testing On The iPad

There has been a lot of negative commentary on the usability of the iPad since it launched, including Jakob Nielsen’s critique of its usability failings. However, what’s even more important than the usability of the iPad is the usability of YOUR website or app ON the iPad.

There are 5 million iPad users worldwide…and counting. Many of them browse the internet on their iPads as well as using apps. It’s also worth mentioning that there are another 50 million iPhone users around the world. So the question is: how usable is your website or app on the iPad?

You can test the usability of any website on the iPad because Loop11 works beautifully with the iPad. It always has, actually. Loop11 can be used on an iPad the same way it works on a computer. Since there is no software for participants to download and no JavaScript code for you to insert, a usability project can be created for the iPad in just a few minutes. We even made a video to prove it!

…and just so you know, you can do it all on an iPhone, as well.

Want to be part of a world-first usability study? Here’s your chance.

We created a usability study to see whether there are any differences in behaviour when using the internet on an iPad versus a normal computer. We’d love you to take part. We’ll be publishing the results on our blog soon.

If you have an iPad and would like to be a part of the evaluation, use this link:

iPad users only

If you don’t have an iPad but would like to prove that using a normal computer is waaaay better for surfing the internet than an iPad, go here:

Computer Users Only

Happy testing!
