Monday, September 12, 2005
does 'best practice' make us zealots?
A practical problem for most any religion dealing with a changing world is the presumption that a single truth exists, and that the truth is eternal. UCD practitioners may hesitate at the notion that they follow a faith, rather than represent the deepest enlightened reasoning that can be applied to a topic. But while reason does dominate the surface of discussion, deep philosophical assumptions shape the pattern of dialogue UCD practitioners engage in with others. These assumptions are rarely questioned; they are articles of faith, which (strangely!) aren't accepted wholesale by people outside the UCD flock.
One of the biggest sacred cows in the UCD world is that user centered research and iterative design must begin from the inception of an application. This idea reached the status of canon law by the early 1990s. The logic of the law seems unassailable: by involving users from the start, you get the application on the right path immediately, and you save time and money in the process. Indeed, early and constant user involvement does exactly those things in certain circumstances: when you build an application from scratch, and create it to fit the specific needs of a target group. Unfortunately, those circumstances are becoming increasingly rare.
When the law of "UCD should drive application development" came into being, the software development world was radically different than it is today. At that time, applications were developed one by one, often using procedural programming. Then modular software, object oriented programming, off-the-shelf enterprise applications, and a raft of other software development innovations burst on the scene in the 1990s. Software vendors discovered the key to reducing software costs was standardization and re-use. Vendors figured they could maximize return if they could re-use code they had already created for another application. If it wasn't exactly what the user needed, so be it: at least it was cheaper.
Once standardization and re-use dominated the software development world, UCD lost the argument that it is most cost-effective to start with user requirements before writing any code. Paradoxically, because of the rise of the Web, UCD practitioners were largely unaware that the UCD development cycle was becoming marginalized in application development. The Web demanded comparatively little input from application developers, giving usability a freer rein. Involving users from the start of a Web project worked easily. Until recently, Web sites have been shallow, involving a fairly dumb interface attached to some fairly basic databases at the back-end. Because the amount of code in Web sites was not overwhelming, modifications based on user research and iterative design could be accommodated without too much of a kerfuffle.
The transition to Web applications has given application developers much more power. UCD is marginal when it comes to influencing many of the key decisions that affect usability, such as what CRM system will be used to handle customer data. These decisions are made by IT systems people and business analysts, generally with little involvement from UCD practitioners. But it is these key back-end elements that are shaping the user experience, because they define what the user can and cannot do with a Web app.
When we step back further, we find even less UCD involvement in the development of the enterprise applications that power Web apps. These applications involve off-the-shelf components that drive almost any conceivable aspect of an organization, from personnel to scheduling to customer service. Despite the enormous impact of enterprise applications on users both inside and outside of organizations, UCD practitioners generally have not been involved with shaping what these applications do, or how they do it. By and large, enterprise applications are created in the imaginations of business operations specialists, with hardly any user input. As an afterthought, users are brought in to test the development of the interface. The consequences of the situation are astounding: UCD has no involvement in the development of software representing billions of dollars of spending by corporations.
The usability profession to a large extent doesn't even understand how marginal it has become. I hear some usability people talk about working on their company Intranet, as though it were just another Web site. Their involvement is limited to rearranging deck chairs: a bit of information architecture work, and figuring out where to put links, based on the specifications presented to them. But UCD did not drive the development of the application, in violation of the sacred laws of UCD. The application already existed. It was bought from SAP or Oracle or IBM, specified by an in-house BA and modified by systems people from an implementation partner.
I am not convinced many usability practitioners are even interested in the mysterious workings of these applications. They are happy to stick to the interface, where the issues are far easier to notice and deal with. Applications also mean dealing with an alien culture: people who complain about changes and don't seem excited when talking about user experience.
Some UCD practitioners do realize that users aren't going to gain proper value from an application unless the application reflects their priorities accurately. Many enterprise applications fail users and need extensive change. But UCD orthodoxy hardly equips us with a road map for fixing the problem. If you follow the UCD script, you end up saying: "Well, see, the problem is that you did this all wrong from the start. Before you did anything, you should have called me in to do user research, then we could have designed a special application to meet the unique needs of the user population." Anyone hearing that would assume you were arrogant and obstructionist.
It is time to reality-check the notion that application development must start with UCD. I empathize with the sentiment completely, but I question its practicality in today's business world. UCD has increasingly advocated bespoke solutions for users as best practice. Meanwhile, in the hard-nosed world where money talks, businesses have embraced off-the-shelf solutions for a variety of reasons related to purchase cost and maintainability. The philosophical differences between the approaches couldn't be wider. The question is: are there practical solutions that let both sides work constructively toward common goals without compromising their priorities?
I have been developing a framework I am calling "context-driven application re-engineering" to assess the costs to businesses of poorly functioning enterprise applications, while providing a pathway to making those applications work better for users. To get the needs of users represented in the reality of off-the-shelf enterprise applications today, I commit heresy by dropping the "do it right from the start" attitude of the UCD profession. We need to recognize how marginal we are becoming, and face this reality if we hope to change it.