UCD Crisis
There are too many methods of designing digital media. We currently have “agile” (hip, groovy) at one end and “waterfall” (a term of abuse) at the other. Each of our projects at LBi inhabits a space somewhere in between these two extremes at any one time – although because we’re an agency it’s mostly just different takes on waterfall. There have recently been some laudable attempts to be hip and groovy, although I’ve not yet had the pleasure of that myself.
From time to time my department (now close to fifty people, I think) needs to vent a bit of excess energy (or hot air) in the form of periodic email discussions about industry trends, methods and related stuff. Some of this comes out on Stream, but mostly it’s by internal email. Today was a good example. Dan Saffer has written an article called Research Is a Method, Not a Methodology. It was duly discussed in fairly measured terms, as Saffer makes some interesting points.
But then, I cracked.
I wondered whether we are too dogmatic about “user centred design.” Saffer puts this rather mildly, but I decided to turn up the volume on the idea to see who’d complain. The term “UCD” is now used by many in the industry as an article of faith: a “methodology” with which to win pitches, allay fears and resolve problems. However, I don’t see an automatic connection between good design and the practice of putting users in front of designs and seeing what they do.
First though, let’s not be confused here – I’m all for user research. User research is a way, perhaps the only way, of getting a design direction in the absence of any other stimulus. It certainly is important, and without some form of research before design, things will go awry. If I had a slogan for this, it would be “Design shall start with observation. Listen.”
What I am far less confident about is the practice of user testing, persona-led development and the other, more brittle methods (like card sorting) that make up a wider definition of UCD further into a project. I now wonder whether testing designs on users does in fact lead to better designs. I might even be saying I can sometimes see the opposite, although I’m not sure I’d go that far yet for lack of evidence.
This hasn’t always been my view, though. In the last eight years I’ve seen huge changes in the use of digital media. The web is almost completely unrecognisable compared to the calm, grey pages I vaguely recall marvelling at in 1993 (that big “N” straddling the globe in the corner…). At the risk of some disloyalty, I don’t mean this (launched today for Sony) as an example, but I do mean the stuff you’ll find here.
As the nature of digital media has evolved, one would suppose that people have become correspondingly sophisticated in their use of it. Indeed, that is exactly what I’ve seen. Not only that, but this sophistication now seems to lag only months behind, where before it was perhaps years. Two weeks ago we tested twelve people, eight of whom were over 40, and four of those over 50. Only two had problems navigating a site that I would have considered a challenge for such users a couple of years ago. As one of us at LBi said recently: what’s cutting edge now is hygiene tomorrow.
This, coupled with the fact that we’ve been in the business of digital media design for all this time, means that we have a good idea of what works (interaction-wise, that is) and what doesn’t work so well. When you consider that fewer than one in three of the projects I’ve worked on in the last couple of years has had a user testing component, this begins to say something about UCD. At LBi at least, we’re not looking at a two-tier output. The projects that have been designed on professional opinion alone are no less successful than those that have enjoyed the pixie dust of UCD.
So, here’s something I’m toying with: UCD, as described by its boosters in my industry, has become a tool not of design but of professional defence. Experience design is hard. Unlike in graphic design, you have to constantly justify your ideas in terms that are – ideally – quantifiable. We need something to stop the arguments: we need user testing. This is what’s always quoted in discussions about UCD practice: it settles arguments. It’s why I always feel much better when user testing is planned, because I know it will validate my designs at least enough to keep the client on my side until launch. And I’m rarely disappointed now, as I observe users dutifully using the prototypes pretty much as I’d expected. A few minor tweaks later, and we’re done (OK, it’s not nearly that simple, but I have to preserve some mystique!)
So, is that a good reason to spend a client’s money on a “UCD process”? I wonder. I need to think about this a bit more.
My experience working in a small company:
We do not have extra money to spend on user testing.
So we “skip” this step ;-)
But on the other hand, I believe we still create quite usable, useful and accessible web designs, all built with web standards in mind: semantic HTML/XHTML and well-organized CSS. And if JS is not available, or CSS is turned off, most of the features of the website remain accessible.
So this is what we try to do in every project…
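To make the idea concrete, here is a rough sketch of that approach (hypothetical markup, not from any real project): the plain HTML works on its own as ordinary links and a visible list, and the script only enhances it when it happens to run.

<!-- Hypothetical example: without JS the list is simply visible and the links work as normal -->
<a href="#products" id="toggle-products">Products</a>
<ul id="products">
  <li><a href="/products/alpha.html">Alpha</a></li>
  <li><a href="/products/beta.html">Beta</a></li>
</ul>
<script>
  // Enhancement only: if JS is available, collapse the list and let the link toggle it.
  var list = document.getElementById('products');
  var link = document.getElementById('toggle-products');
  list.style.display = 'none';
  link.onclick = function () {
    list.style.display = (list.style.display === 'none') ? '' : 'none';
    return false; // suppress the in-page jump only when the script is running
  };
</script>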
My $0.02 :)
PS When I have a freelance project, I work the same way, without user testing. But I may show the website to a couple of friends and ask their opinion :)