Is usability obsolete?
Usability and HCD grew to prominence with the expansion of the web. While the roots of the field are much older, the growth has been significant in the past fifteen years: we’ve gone from invisible wonks to key “technology untanglers.” Unfortunately, our field has barely evolved in that timeframe. While the computing universe around us has shifted dramatically, we’ve clung to the same methods, advice and processes. Current usability work is an artifact of an earlier computer ecosystem, out of step with contemporary realities. Usability can no longer keep up with computing: products are too complex, computing is too pervasive, and products are too easy to build. These three trends demonstrate the “new realities” that are making traditional usability difficult, if not irrelevant.
Products have become too complex
First, the idea of a “computer system” has evolved significantly, far beyond the standalone desktop systems of the past. Most first-world inhabitants interact with a vast, interconnected network of hosts, services, applications and platforms on a daily basis.
Take the example of a location-based service, like the “Urban Spoon” application for the iPhone. The application provides real-time restaurant recommendations based on proximity and ratings. There are at least six pieces of the Urban Spoon “enterprise”: three traditional UIs (i.e., the purchasing “store,” the phone application, and the online rating interface), and three unseen infrastructure components (i.e., the quality of the information, the resolution of the GPS sensor, and the speed of the system). Even if the system were perfect in every other way, a major breakdown in any component would cripple the rest of the system for users.
The current suite of usability methods is inadequate for this new, enterprise-reliant context. How do we accurately test the usability of an overall enterprise? Certainly, there is a narrow role for “sanity checking” each interface within the enterprise, but scripted usability testing forces the product into unnaturally small pieces, tested in series. Worse, it means you set yourself up to miss the larger design issues (e.g., does the actual context of use make this feature meaningless?). We see sophisticated clients who still focus their design on lab testing and scripted usability, when it’s not really appropriate for their networked environment.
With service-based, cooperative, enterprise applications, we’re limited to hacked-together usability methods (i.e., a series of narrow simulations) or rough design estimations (i.e., contextual studies) to try to understand the benefits and pitfalls. These enterprise challenges are only becoming more common: location-based services like Urban Spoon, social-networking applications like Facebook, or third-party platform sellers like Amazon. We need to replace our narrow usability methods with rich design tools that can address these types of enterprise-design challenges.
Computing has become too pervasive
As computing devices get smaller and more ubiquitous, they are used in a broadening array of contexts. For example, company email accounts were once reserved for in-office communication during the workday; now, with email-enabled phones, couples declare their bedroom a “Blackberry-free zone” to avoid the distraction.
While context-of-use has always been a part of usability, the growing variability is making our job much more difficult. In the past, we could study and simulate the anticipated context. Even if the context was outside the norm (e.g., an operating room rather than an office), the per-device variability was much lower. Most contexts of use today, however, live in the “long tail.” The increasing mobility and ubiquity of devices makes predicting the context of use, and thus its usability, all the more difficult.
As the context varies significantly from user-to-user, or day-to-day, the homogeneity of usability testing becomes a poor proxy for real use. More importantly, the richest data may come from unexplored, niche contexts within the long tail (e.g., ice rinks, car repair shops, hospital waiting rooms). We need methods that can anticipate and account for unexpected and continuously changing contexts of use. And, we need to see the richness and variability as an opportunity for universal design.
Products are too easy to build
Introducing a product or service used to involve high costs and an inevitable production lag. As we all know, that cycle time has been shortened dramatically: downloaded applications are upgraded with service patches; new features appear, often without notice, within web applications; web services are refined continuously as errors are fixed. The deployment phase is essentially zero; design and development can become intensely iterative, with little additional cost.
This quick iteration cycle is a threat to traditional usability in two ways:
First, with such short development cycles, deployment can easily replace even the cheapest or most realistic testing. The beta prototype can simply be reworked until it is a marketplace success, without any formal usability or design work. Adding design tasks, usability tests or contextual studies easily looks costly and unnecessary.
Second, the fast iteration cycles reduce the focus on upfront design work, shifting it toward ongoing correction and revision. Unpopular design and usability issues can be pushed off indefinitely to be part of a “big redesign” that never materializes. Usability annoyances can be ignored until they become entrenched parts of the product, and the pressure to “get it right” steadily decreases. Usability and design become add-on fixes or upgrades, rather than initial product drivers.
We need to make usability and design an integral part of the development process, at whatever pace it’s conducted. The fast pace of agile development and the constant deployment pressure must be embraced as an opportunity: for rich data, for iterative design cycles and for immediate answers.
What do we do now, knowing it’s terminal?
Given these three trends, the field of usability must change to survive. We can’t continue our practice on the current trajectory, pretending that the environment around us is static. While the core principles of usability are universal – active user involvement, iterative and multi-disciplinary design, appropriate pairing of users and technology – our techniques and methods need to catch up.
Looking more broadly, the usability community must find ways to embrace these trends, rather than hide from them. These trends push us away from artificial testing and toward richer and more realistic data. The growing pains are hard, but if we capitalize on these trends, we can drastically improve the user experience and increase our market influence, becoming better predictors of user behavior, better advocates for true user needs and better critics of design work.
To our credit, there are glimmers of hope, where usability has shifted to address the new computing environment.
Revising our work to fit with agile: There are significant efforts toward “Agile Usability,” to address the challenges of rapid deployment. Online giants like Google and Amazon deploy and test design alternatives; groups like 37signals and A List Apart offer recommendations and guidelines for user-centered design within fast engineering cycles.
Breaking our labs into pieces: There are significant efforts to break down usability labs into smaller, configurable components. With lower hardware and software costs, companies are shifting toward the “lab-in-a-bag” model, where teams are dispatched with a prototype, an augmented laptop, and a video or still camera.
Using ubiquity to our advantage: Researchers are beginning to re-use artifacts from the “always on” culture for design purposes. In our work, we’ve used Flickr to identify what visitors photographed at tradeshows (i.e., to determine what content was engaging and what was ignored). Flickr provided insight into many different tradeshows, users, and patterns that would have been impossible with other methods.
Making our methods contextual: Guerilla methods continue to evolve, improving the data while reducing the overhead. There are evolving methods for “quick turnaround testing” (with a focus on speeding up the analysis process), “listening labs” (which employ contextual, user-driven tasks), plus a ton of revisions and extensions to paper prototyping. These lightweight methods are designed to fit into smaller timeframes, deal with looser requirements, or make the testing mobile.