
The Culture of Usability

Now that most of us agree that usability testing is an integral investment in site development, it's time to recognize that the standard approach falls short. It is possible to do less work and get better results while spending less money. By bringing usability testing in-house and breaking tests into more manageable sessions, you can vastly improve your online offering without affecting your profit margin.

The Standard Approach

Here's the classic scenario: Your team is developing a new release of your Web site. The overall schedule is between three and six months, during which time you develop the concept, design the new site, build out the system, test it, and then launch.

If you're like most Web teams, you've planned for a round of usability testing sometime just before or after the build phase. So you hire a usability consultant, who charges, say, $15,000. This consultant rents a testing facility with an extra room behind one-way glass, so your team members can watch as a dozen users attempt to navigate the new pages. After the tests are done, the consultant goes away for a week and writes up his or her findings in a long report. The report is condensed into an executive summary and (if you've paid for the deluxe package) a highlights video.

Recurring Problems

What's wrong with the classic approach? Plenty.

It's poorly timed. This usability testing model provides a gigantic mass of particularly valuable insight at exactly the wrong time. In the push to stay on schedule, designers and developers have little chance of devising thoughtful solutions to the problems described in the report. To address even the most critical problems means adding more time for design, and therefore postponing the scheduled launch. This is rarely a viable option, so most of the findings are "put off for the next release."

The scale is unmanageable. By the time the next release comes around (say, three months later), there's a whole new list of other improvements to make, and the items put off from the first round of usability testing remain unresolved. In the end, the only problems that have much chance of being resolved are those enumerated in the highlights reel and executive summary.

It doesn't promote employee growth. Designers don't come away from the usability tests with a general understanding of what is and isn't working in the interface. A consultant does the actual testing, which means that the consultant gains the understanding, not the design team. And then the consultant leaves.

It isn't the consultant's fault, really—the report and other deliverables are an attempt to condense and share what he or she has learned. Something is always lost in the translation, though, and the design team is too pressed for time to spend hours reviewing a lengthy report in depth. Designers and developers can attempt to overcome this by sitting in on a few of the tests, but this is an incomplete solution at best, and watching only one or two users can sometimes prove misleading.

It provides lackluster results. So what's a typical outcome of classic usability testing? You've spent a whole lot of money and a substantial amount of time. You have a huge report itemizing problems with the product that you've been laboring to design, which makes you feel pretty lousy. Usually, the report uncovers some gigantic problem that you can't hope to solve, which makes you feel even worse. But part of that report lists a handful of important changes that can and must be made before you launch. Taking care of this short list is enough to make the test worthwhile, but you can see that many more fixes are necessary, even though they'll probably never happen.

4 Keys to Usability Culture

1. More frequent testing
2. Smaller scale
3. Internal ownership
4. Immediate fixes

It's no wonder people have resisted usability testing. Although it improves the interface, it's an expensive disruption that can leave you feeling less secure. Is usability testing important? Absolutely. Is there a better way to approach it? Without a doubt.

Developing a Usability Culture

The difference is in the attitude. In the classic scenario I described above, usability testing is a special event. It's a behemoth, an isolated activity that happens at a fixed point in the development cycle. It's expensive, so each time you want to perform a test, you have to lobby management to set aside time and money for it. Because it's a one-time event, the scope of the test is big: lots of users clicking through lots of tasks. The findings are big as well. In fact, too big for the organization to handle all at once, so the impact is much smaller than it ought to be.

The most successful organizations conduct usability tests frequently, integrate the results quickly into the product, and keep the total cost of the testing program small. How do they manage it? By developing what I call a culture of usability.

These companies have accepted usability testing as a basic life-sustaining system for their product. If developers are a site's heartbeat (pushing out new code on a regular basis), and designers are the lungs (infusing it with fresh, life-sustaining energy), then usability testing is kind of like the liver. It's a filter that siphons out the toxic sludge from your interfaces. You can't just add in a usability test every six months or so and hope to have a clean interface. When you overload your liver, the unfiltered crud spills over into your bloodstream and makes you sick. To be effective, usability testing needs to be done a little bit at a time, all the time.

In this culture-of-usability model, the tests are run on a regular, fixed schedule (once a month or more) with a small number of users. Any interface parts that the team has questions about at that time are put into the test. A trained staff within your company runs the tests on site, so the only cash expenses are for hiring a professional recruiter and paying incentives to your test participants.

It will cost you about $2,000 to $3,000 for equipment and training to get started, and then $1,200 per test thereafter. Thus, once you've paid the start-up costs, you can test far more often on the same budget (see "Usability Testing Budget," below, for more details). In this scenario, the tester and the designer are coworkers (ideally partners), so each test increases the institutional knowledge, and the findings are more likely to actually make it into the product.

In other words, to improve the overall quality of your usability program, reduce your expectations. That isn't to say that you should lower your standards. On the contrary, you raise them. You test fewer aspects of the product so that you can understand them better. You test with fewer participants so that you can test more often.

Usability Testing Budget
If you tell management that you can cut 50 percent of your budget and provide six times more service, they're likely to fund your idea. Actual costs will vary*, but the budget below shows how the two approaches to usability tests compare.

Number of sessions per year
  The Old Way: 2 (assuming 2 major product releases)
  The New Way: 12 (1 per month)

What you get for that
  The Old Way:
  • 2 rounds, 12 participants each
  • 0 trained testers
  • Results are delivered as long reports
  • Executive Summary recommendations are addressed
  The New Way:
  • 12 rounds, 6 participants each
  • 2 trained testers
  • Results are delivered as short, manageable action lists
  • Most recommendations are addressed throughout the year

How It Breaks Down

Training
  The Old Way: None
  The New Way:
  • No cost: usability Web sites
  • $100: books
  • $2,000: workshops (assuming 2 employees at $1,000 per workshop)

Equipment
  The Old Way: provided by consultant
  The New Way: $1,200: video camera, tripod, cables, TV (optional, for remote viewing)

Professional recruiter and participant stipends
  The Old Way:
  • $2,400: $100 for locating each of 12 participants (2 sessions)
  • $2,400: $100 stipend for each of 12 participants (2 sessions)
  The New Way:
  • $7,200: $100 for locating each of 6 participants (12 sessions)
  • $7,200: $100 stipend for each of 6 participants (12 sessions)

Facilities
  The Old Way: $4,000: $1,000 per day (4 days)
  The New Way: No cost: conference room at your office

Test preparation, administration, and report
  The Old Way: $30,000: $15,000 for each session
  The New Way: No cost: internal

* The figures shown are based on years of experience. Most consultants charge $10,000 to $20,000 for usability testing, though I've recently heard of firms that charge both less and more than that.
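The arithmetic behind the "half the budget, six times the service" claim is easy to check. Here is a small Python sketch that totals one year of each approach; it uses only the line items from the budget table above, so the $15,000 consultant fee, $100 recruiting and stipend rates, and $3,300 in start-up costs are this article's estimates, not fixed industry prices.

```python
# First-year cost of the consultant-driven approach vs. the in-house
# approach, using the line items from the budget table above.

def old_way_total(sessions=2, participants=12, consultant_fee=15_000,
                  facility_per_day=1_000, facility_days=4):
    recruiting = 100 * participants * sessions   # $100 to locate each participant
    stipends = 100 * participants * sessions     # $100 stipend per participant
    facilities = facility_per_day * facility_days
    consulting = consultant_fee * sessions       # prep, administration, report
    return recruiting + stipends + facilities + consulting

def new_way_total(sessions=12, participants=6, training=2_100, equipment=1_200):
    startup = training + equipment               # one-time: books, workshops, camera
    recruiting = 100 * participants * sessions
    stipends = 100 * participants * sessions
    return startup + recruiting + stipends       # room and testers are internal

print(old_way_total())   # 38800
print(new_way_total())   # 17700
```

At these assumed rates, the in-house program costs $1,200 per session after start-up and runs at less than half the yearly total of the consultant model, while delivering six times as many test rounds.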

