I don’t need to tell you how important wireframing is for rapidly developing a great site. It’s become standard practice because it enables teams to visualize and validate their concept before diving into development.
The renowned architect Frank Lloyd Wright once said, “You can use an eraser on the drafting table or a sledgehammer on the construction site”. Which option do you think is easier on timelines, budget, and momentum?
Wireframing saves time and money, period.
But… there’s something even more effective than creating wireframes. It’s creating wireframes and then running usability tests on the wireframes to get solid feedback from “new eyes”.
We talked recently about why usability testing is so important. It enables you to get inside your users’ heads so you can make a site that they find easy to use and want to stick around on.
The beauty of running usability tests on wireframes is, you get all of that valuable feedback before you spend a minute or a penny on development. Your usability testers can interact with your wireframes as if they were a fully developed site. You can then integrate the insight you get from the usability tests into your designs before you start coding.
Usability testing will save you time and money. Who wants to find out, after they spent resources on development, that the design is intuitive to the designers but not to the users?
So, don’t stop at wireframing. To get a huge head start on development, take the next step and run a usability test on your wireframes. Go ahead and create your first test now – it’s both easy and free.
Want to learn more about how to get the most out of your wireframe and usability testing efforts? Join us for a free webinar on Thursday, July 12 with three user experience veterans: Loop11 CEO Toby Biddle, Balsamiq CEO Peldi Guilizzoni, and Balsamiq Head of User Experience Mike Angeles. UPDATE: You can find a recording of the webinar here.
If you have a new website or web app, you probably want to gauge the performance quality of your site. And you probably know that this will involve some sort of usability testing.
But which is the best method for usability testing? A team of researchers studied this very topic. Below is a summary of their study – and their results.
The team compared two methods of usability testing:
1. Traditional lab-based testing method – pick a group of people and have them go through the usability test in a controlled, in-house environment
2. Remote web-based testing method – pick users at random and have them go through the usability test remotely
Their experiments and results were published in a paper titled An Empirical Comparison of Lab and Remote Usability Testing of Web Sites [1]. Let me give you the highlights.
Two user groups tested the usability of a set of two websites: 8 users participated in the experiments in a traditional lab-based environment, and 38 users participated remotely over the web.
Experiment #1 consisted of 17 tasks to be completed on a website that was meant to be of both informational and transactional value to end-users. It provided information about retirement savings, pension, medical and dental coverage, payroll deduction, direct deposit and financial planning. It also enabled users to set their own payroll deduction.
Experiment #2 consisted of 13 tasks to be completed on a second website that was meant to be of purely informational value; it provided details about stock quotes, company news, research, and investment strategies.
The tasks in both experiments were designed to judge whether:
• The website was visually appealing
• The menus and links were easy to navigate
• The information was arranged in a logical, easy-to-access manner
• Individual page formatting was good
• Content used appropriate terminologies
• The quality of the web content met the expectations of the target users
• The site was easy to use overall
• The user was able to complete the desired tasks within a reasonable time span
• The user would be interested in returning to the site in the future
The conclusion of the study was based on the following criteria:
• Task completion – percentage of users who successfully completed the given task
• Average time spent to complete each task
• Subjective rating quality
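For those curious how these criteria translate into practice, here's a minimal sketch (in Python, using entirely hypothetical session data, not figures from the study) of how each one can be computed from raw test logs:

```python
# Sketch: computing the study's three criteria from hypothetical session logs.
# Each record: (participant_id, task_id, completed, seconds, rating 1-5).
sessions = [
    ("p1", "t1", True, 42.0, 4),
    ("p2", "t1", False, 95.0, 2),
    ("p3", "t1", True, 51.0, 5),
]

task = "t1"
records = [s for s in sessions if s[1] == task]

completion_rate = sum(1 for s in records if s[2]) / len(records)  # task completion
avg_time = sum(s[3] for s in records) / len(records)              # avg seconds per task
avg_rating = sum(s[4] for s in records) / len(records)            # subjective rating quality

print(f"Task {task}: {completion_rate:.0%} completed, "
      f"{avg_time:.1f}s average, rating {avg_rating:.1f}/5")
```

The same three numbers can be computed per task for both the lab group and the remote group, which is exactly what makes the two methods directly comparable.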
A high correlation was noted in the following conclusions:
• The time spent to complete the tasks and the difficulty experienced in completing tasks were strikingly similar in both user groups – indicating similar behavior irrespective of environment.
• The quality of typed comments and the kind of data extractable from remote users were as rich as those that could be obtained in direct laboratory conditions.
If click streams and screenshots were enabled in remote test conditions, the quality of data obtained would likely be even richer. Remote testing is also more cost-effective and makes it easier to include diverse user groups, which can uncover unique usability issues. At the same time, each testing condition reveals certain usability parameters the other does not.
The study concludes: The behavior of test users is strikingly similar in lab and remote usability tests. This is reassuring, and indicates that the different environments do not lead to different kinds of behavior.
In other words, take your pick – both methods of usability testing work, and each has its own advantages and disadvantages. In-house testing may ensure that you gather more detailed input, while remote testing is usually less expensive. The most important thing is that you do some form of usability testing.
[1] An Empirical Comparison of Lab and Remote Usability Testing of Web Sites, by Tom Tullis, Stan Fleischman, Michelle McNulty, Carrie Cianchette, and Marguerite Bergel.
We have some exciting news. We’ll soon be on the road in the US and are fired up to meet you!
We will be at the Big Design Conference from May 31 to June 2 just outside of Dallas, Texas, where the biggest ideas in strategy, UX, design, gaming, mobile, usability and development will be discussed. Stop by and say hello at Booth #12! We would love to chat.
Right afterwards, we will be in Henderson, NV from June 5-7 at the UPA International Conference. Come visit us at Booth #20 and be a part of the most interactive usability & UX conference in the world!
Hope to see y’all there! Don’t be shy.
Toby Biddle and the Loop11 Team
How many millions of people in the 80s and 90s had trouble figuring out how to program their VCRs? My older sister was the only one in our house who knew how to do it, and that’s because she was a little nerdy and read the manual.
This is one of the most classic examples of a usability problem. No one wants to read a manual or call a support line – or even spend more than five minutes trying to figure things out on their own; people want and expect to be able to use products out of the box. This is the essence of effective usability design, and the same principle applies to website usability.
What is website usability?
Website usability has two aspects:
1. The primary aspect is about meeting your users’ goals and delivering a satisfying user experience. Is your site clear, concise, and intuitive to them? Can they quickly and easily find what they’re looking for? Are the consequences of pressing buttons and clicking links unambiguous to them?
The email management tool Mailchimp.com is one of my favorite examples, because usability is one of its main selling points. And indeed all of the most common things that you’d want to do with it are laid out clearly on the front page: Create a Campaign, Manage a List, View a Report. The button for the most common action – creating a campaign – is distinguished by its orange color and large label. It’s hard to miss.
Ideally your site is so intuitively laid out that the question of usability never enters your users’ minds; it simply works the way they expect it to. (In fact, users generally only think about usability when they’re frustrated by something that is not usable to them.)
2. The secondary aspect of website usability is more subtle; it’s about fulfilling the goals your company has for the site. Does your site’s design nudge visitors in the direction you want them to go? Are the features that are most important to you front and center?
Here’s a quick example: Laura Roeder Studios (lauraroeder.com) offers social media training and tips to small business owners. Building its list of weekly newsletter subscribers is important to them – as shown by the fact that the email newsletter sign-up is given prime real estate on their site:
The value of user testing
Of course we designed our site to be usable, you might be thinking. Why wouldn’t we? Here’s the thing: you will never know how usable your site truly is until you test it with people outside of your organization.
Is usability subjective? Could something be intuitive to one person and not another? Absolutely, and therein lies one of the two main values of user testing: testing across groups reveals quantifiable trends. If 7 out of 10 people can’t figure out how to navigate to checkout on your website, that tells you something very valuable which you will want to address.
The other main value of user testing is that it’s unambiguous. Ask a user an open-ended question – Is our website usable to you? – and you will likely get a general reply. But give someone a set of specific tasks to execute on your website, and look at the results, and you will see unambiguously where the stumbling blocks are, if any. And that means you can fix them.
User testing enables you to get inside your users’ heads and create a site that will truly be easy and pleasant for them to use. And that, of course, is win-win for both them and you.
Go ahead and create your first user test to get a quantifiable, unambiguous handle on your site’s usability (it’s both easy and free).
We marketing geeks have a certain way of trawling through an analytics account, but there's an executive-level look, too. Business owners who prefer to leave this in-the-weeds analytics stuff to their underlings are missing several things critical to the way they understand their business.
In this post, I’m going to talk about a few critical business questions which CEOs can and should answer for themselves, on an ongoing basis, through a shallow-dive look at Google Analytics.
Where is my most valuable traffic coming from?
When a website is your baby, traffic gets you excited. You know what I’m talking about. Somewhere nerdy within, you silently squeal with joy when your traffic numbers are riding high. Perhaps you’ve had a record day!
Well that’s great. But what if it means nothing for your business?
The thing is, web traffic is little more than bits and bytes clicking away in an electronic ant farm you’re hoping will pay your bills. They’re numbers. Why do we think they’re so important? Because they might convert into customers, and customers pay us, right? Precisely. And that’s why, if there’s just one analytics report business owners keep a pulse on, it must be the Traffic by Source report.
First, understand which sources are sending you visitors who actually give a sh*t about your site and what you’re up to. Pages/Visit and Avg. Time on Site measure whether a visitor even wound up on your site on purpose, and if they did, how long they’re willing to stick with you in the attention-sparse web landscape.
Personally, I tend to get particularly enthusiastic about new organic traffic. If you’re suddenly bringing in 150 new visitors every day as a result of your awesome blog post unveiling a new curried eggplant recipe, that’s super — unless their engagement metrics are a fraction of what you see from folks who found you through other channels. Right?
Pay attention to your conversion rates by traffic source. You might notice that 10% of your traffic accounts for 50% of conversions. All referrers, ad channels and social media outlets are not created equal.
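If you export your traffic-by-source numbers, this comparison is simple arithmetic. Here's a quick sketch in Python; the source names and figures are made up purely for illustration:

```python
# Sketch: conversion rate and traffic/conversion share by source.
# Each row: (source, visits, conversions) -- illustrative numbers only.
rows = [
    ("organic", 5000, 50),
    ("referral", 800, 40),
    ("paid", 1200, 24),
]

total_visits = sum(v for _, v, _ in rows)
total_conversions = sum(c for _, _, c in rows)

for source, visits, conversions in rows:
    rate = conversions / visits
    share_of_traffic = visits / total_visits
    share_of_conversions = conversions / total_conversions
    print(f"{source:8} {rate:6.1%} conversion rate, "
          f"{share_of_traffic:.0%} of traffic, {share_of_conversions:.0%} of conversions")
```

Notice in this toy data how the referral channel is only about a tenth of the traffic but over a third of the conversions: that's the kind of imbalance the Traffic by Source report exists to surface.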
Some sources of traffic will bring you fresh eyeballs and other sources will yield more returning visitors. Which does your business value more? If you’re an e-commerce site, does your average first-time shopping cart differ significantly from that of a returning customer’s?
Look carefully at the propensity of each traffic source to bring you new visitors. Once you have a returning customer, it’s likely you’re touching them through other marketing channels, but a brand new prospect — that’s gold.
How are people experiencing my site?
What proportion of your visitors see your site on their iPad? Their mobile device? Chrome or Internet Explorer (do people still use that browser)? Knowing the breakdown is part of understanding your customer base.
Maybe 30% of your users come to your site on a Mac, and your development team uses nothing but PCs. Sure, they might QA the site on a Mac, but this 30% slice of your prospect pool might not be getting the love they deserve. Maybe your homepage looks awful on the iPad, or your paid search landing pages don’t load the proper information above the fold for all users. If you don’t pay attention to how potential customers are experiencing your business on the web, you can’t control the message.
What content do people care about?
By understanding what pages on your site people spend time on, how they behave there, and what they do next, you can gain unparalleled insight into your business.
Just as a brick-and-mortar store owner finds it useful to understand which aisles draw the most customers and how the organization of her store impacts revenue ("did you find everything okay today?"), you as a business owner with a website should understand how your prospects spend their time there.
At a minimum, you should be able to recite:
1) Which pages are most visited? (your homepage, your blog, your shopping cart, your “about us” page, etc.)
2) Which pages attract the most repeat visitors? What about first timers?
3) Is new content appearing in this report, or has your website succumbed to content idleness?
Keep your site fresh by blogging and creating new content. You’ll get more organic search traffic and give your browsing population a treat to come back for. You want to see new pages indexing in this report, every few weeks at the very least.
4) Which pages have the highest bounce rate?
Is it because they suck, or because they’re a natural “exit point” for your site, like the order confirmation page? If they’re an exit point, are you providing a natural next step for your visitor? What should they do after purchasing — tell their friends to get a gift card? Share via social media? In general, analytics are a necessary and low-effort way to gather this data, but you’d also be wise to find a deeper understanding of how your users explore your site via usability testing.
All in all, here’s the bottom line. You’re busy. You’ve got a business to run, and minutes you spend in the weeds digging through statistics are minutes you could be spending elsewhere. But if you just have one hot minute to review your site analytics, I’m positing there are three places to keep your focus: traffic sources, browsers/devices, and site content.
About the author
Igor Belogolovsky is cofounder of Clever Zebo, a group of online marketing strategy experts dedicated to helping businesses grow revenue quickly and measurably by making smart moves on the web. For more good stuff, hang out with the Zebo on Twitter and peruse the Conversion Optimization blog.
We are proud to finally announce that Loop11 has beaten a swathe of worldwide rivals to secure a licence deal with Microsoft‘s Office Experience Group.
Under the terms of the licence agreement, Microsoft's Office Experience Group will utilize Loop11 to conduct its online usability testing. The win came after the team's extensive benchmarking exercise of usability tools in the world marketplace.
We were chosen for our ability to conduct rapid online usability testing across the range of Microsoft Office applications. In addition, the speed of software setup and implementation, ease of use, the ability to meet strict security requirements, and strong analysis capabilities all placed Loop11 in the winning position.
Our CEO, Toby Biddle, commented, “We’re delighted that a giant such as Microsoft has awarded Loop11 the business. Traditionally, usability testing is conducted in the lab and relies on the more expensive and invasive method of recruiting participants and behavior observation. Loop11 enables you to conduct fast online usability testing in over 40 languages, making it unbeatable for running worldwide projects from the comfort of your own office.”
Launched in 2009, the software was designed by Loop11’s team of developers to meet the needs of its sister company, usability consulting firm UsabilityOne. “We were always seeking a cost effective usability tool that generated reliable statistics,” continued Biddle. “Before we developed Loop11, we found lab-based studies to be too small in sample size and although there were other software products in the marketplace, we found them inflexible and prohibitively expensive to be able to run iterative usability studies throughout the development of a website. The answer was to build our own software based upon our 15 years of usability consulting experience.
“The beauty of Loop11 is its ability to be used by companies not just of Microsoft’s size, but also by small businesses, educational institutions, not-for-profits and government who want to run their own usability studies,” continued Biddle. “With the increase in the number of digital applications, it has become more important for businesses to be smart about testing usability.”
If you too want to test like Microsoft, grab yourself an annual licence starting at just $1,900. Enquire about one NOW!
Samaritans is a charity in the UK and Ireland specialising in confidential emotional support. It offers its invaluable service 24 hours a day for people who are experiencing feelings of emotional distress, despair or suicidal thoughts, using trained volunteers to respond and lend help to phone calls, emails and letters.
Relying entirely on donations and volunteer help, Samaritans' website is the crucial portal through which the charity operates. To improve this on-line portal, Samaritans sought the services of user-centred digital communications specialists SiftGroups, with a view to redesigning its website.
It was unknown whether the established Samaritans’ website was meeting the needs of current donors, volunteers and those needing emotional assistance.
SiftGroups ran a project to explore the usability of Samaritans’ established website with a view to informing how to shape its approach to the forthcoming website redesign.
SiftGroups conducted an online usability appraisal of the current Samaritans website with 149 participants, providing an overall snapshot of how well core strategic tasks were performed by the site’s main audience types. The key areas SiftGroups assessed included vitally important and fundamental site tasks for the charity:
• Getting Help
• Making Donations and
• Becoming a Volunteer.
The study was distributed to both “warm” and “cold” users drawn from Samaritans’ wider internal email database. Help tasks and questions were devised linking participants to the actual site to better determine user satisfaction, validate task completion and enable SiftGroups to form a strategic recommendation for the navigation of the new site design.
THE ANALYSIS AND RESULTS
Upon completion of the evaluation, the real-time reporting capability allowed SiftGroups to track and evaluate task completion rates and monitor the navigation paths of participants. The results showed that only half of the participants successfully completed the tasks, identifying in detail the areas that needed to be addressed in the site remodeling.
In particular, the results showed a significant issue with the current site navigation for those seeking ways of talking to friends they're worried about, with 66% of participants either failing or abandoning the "talking to someone" task altogether.
The results also revealed the need to improve the retrieval of the all-important real life case studies, which encourage those thinking of seeking help as well as volunteers and donors:
As a result of the study, SiftGroups was able to provide a strategic recommendation for an improved usability experience using hard statistics to back up new navigation models.
LOOKING TO DO SOME SIMILAR, EFFECTIVE USABILITY TESTING?
With its proven track record, Loop11 will help you get the most out of your website. Providing the ultimate online usability testing experience, Loop11 will reveal some startling home truths about your website's usability, generating quantifiable results and delivering real-time performance without the cost of lab-based testing sessions. It's fast, affordable, simple to use and easy to get started with straight away from the comfort of your own office. Sign up now or discover another Loop11 success story here.
The guys at Balsamiq asked me recently to write a brief article for their customers on how to conduct usability testing with Loop11 on prototypes and wireframes built with Balsamiq. My article below shows the steps involved in putting a simple Loop11 project together to test a prototype of the Kayak website.
We’re excited to announce the first of a number of new features for Loop11 in 2012 – Clickstream Analysis. The clickstream analysis will replace the ‘Most common navigation path’ by allowing you to analyze task navigation graphically and instantly understand how visitors navigate a task through your website.
The clickstream report provides a graphical representation of participants’ navigation through the website so you can see their journey, as well as the path they took before abandoning or failing a task.
You'll notice that we made the visualization highly interactive: you can highlight different pathways and see detailed information about specific pages. For example, if you want to dive deeper into your pages, hover over a node to see more information at a glance.
Below you can see an example of a task that performed well in usability testing (in this instance with a task completion rate of 92%). A quick look at the analysis shows that 90% of participants went directly to the success page from the homepage. The orange lines are a visual indication of the magnitude of participants who failed the task at different points in their journey through the website. Navigation through the website is clean and uncomplicated, which should be the case when the participant has a clear direction.
By contrast, the clickstream below is for a task that was performed comparatively poorly, where the task completion rate was only 32%. In this example, there is clearly confusion as to where participants should navigate from the home page to complete the task.
This is our first step in tackling clickstream analysis and we look forward to hearing your feedback as you begin to use the report in the coming weeks. We're excited to bring you the first of a number of new features for the year, so stay tuned for more!
As always, we welcome your input on how we can make the clickstream analysis more useful for you, so let us know in the comments below.
How many participants should be used for Online, Quantitative Usability Testing?
Qualitative usability testing has traditionally been based around small sample sizes of 5-20 participants, and many experts agree that for qualitative, lab-based testing such sample sizes are sufficient. However, the growth of online testing tools and quantitative usability research is changing the game. Online user testing has created a new wave of analysis, such as benchmarking, A/B testing, competitor comparison, validation and much more. These kinds of quantitative analyses require larger numbers of participants to validate the data.
For example: let's say your company is testing two different versions of wireframes so management can decide which one to implement and allocate resources to. It would not be good practice to use 10 participants and get a 60%-40% success rate; it would be very hard to validate a study and implement a strategy based on 10 participants. However, if 500-1,000 participants were used, the data would be far more accurate, and management would have valid grounds to act on the findings.
Some specialists, such as Usability Sciences, recommend up to several thousand participants for higher-level quantitative testing such as click-stream data or multiple cross-tabulations. They have broken down the margin of error for various numbers of participants. Read the full article here.
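As a rough guide to why sample size matters so much here, you can sketch the relationship between participant count and margin of error in a few lines of Python. This uses the standard normal approximation for a proportion at 95% confidence; it's a general statistical sketch, not a calculation taken from the article above:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for an observed success proportion p
    with n participants (normal approximation to the binomial)."""
    return z * math.sqrt(p * (1 - p) / n)

def sample_size(moe, p=0.5, z=1.96):
    """Participants needed to achieve a given margin of error."""
    return math.ceil((z ** 2) * p * (1 - p) / (moe ** 2))

for n in (10, 100, 500, 1000):
    print(f"n = {n:5}: +/- {margin_of_error(n):.1%}")

print("For +/- 5%:", sample_size(0.05), "participants")
```

With 10 participants the margin of error on a task completion rate is roughly plus or minus 31 percentage points, which is why a 60%-40% split between two wireframes tells you almost nothing; at 500-1,000 participants it shrinks to a few points, which is data management can act on.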