Wednesday, August 31, 2005


services as commodities

What do people want from services? We live in an increasingly post-material age, where services are the dominant "stuff" we consume (if we look at our spending). There is a lot of money to be made for companies that offer services the way people want them. The question is, what matters most to people? Do we want services to be special, satisfying, cheap, reliable, flexible...?

Designers are beginning to look at "service design" as a design discipline, focusing on making services special (experiential), user-responsive, and coherent. The logic of service as a design discipline holds that if people spend a lot on services, they surely must want to obtain a unique experience for their spending.

Businesses, it would seem, have little enthusiasm for considering service a delicate object, to be dressed up into something special. In the view of business, service is an economic liability that needs to be tamed.

IBM is leading the charge. Paul Horn, SVP for research at IBM: "The next big thing is in the general area of something called services science. It's the componentisation of business."

Note the metaphor: Service should be a science (not a craft, a discipline or an art.)

Horn notes: "Services science would merge technology with an understanding of business processes and organization, a combination of recognizing a company's pain points and the tools that can be applied to correct them. To thrive in this environment, an IT-services expert will need to understand how that capability can be delivered in an efficient and profitable way, how the services should be designed, and how to measure their effectiveness."

IBM's website notes: "industrial and academic research facilities need to apply more scientific rigor to the practices of services, such as finding better ways to use mathematical optimization to increase productivity and efficiency on demand."

IBM is enlisting academic collaborators to unlock an understanding of how to achieve efficiency in services. One IBM collaborator, Arizona State University, proudly boasts that services aren't about "platitudes"; they're now about science.

Two visions of services, both addressing a concern of people. Is the notion of service as experience enhancement in opposition to the notion of service as cost reduction? Possibly, but not necessarily. Both designers and business people need to work with the ambiguity of services. People are contradictory: they spend great sums on services, and want value, but still want to be treated like a king. Some creative thinking is necessary to make that possible, but I imagine it is possible.

Tuesday, August 30, 2005



I have been searching for a system to handle my massive PDF collection, and may have found a solution that works. A couple of months ago I bought a Mac mini, intrigued by the ability of the "Spotlight" search engine to chew through PDFs. I have loaded over a thousand PDFs on the Mac mini, and am pleased to be able to locate files I haven't seen in a while.

Overall, Tiger offers some valuable capabilities. At the same time, I sometimes feel like I am trying to buy a necklace but instead have been given a needle, thread, and bowl of popcorn and told to make my own.

Monday, August 29, 2005


daft downloading

Seems I am forever having to visit websites for "updates" of software. Even though some software will download updates automatically (assuming my spyware and virus software allow even that), there still seems no escaping the laborious visits to a website to get updates necessary to keep things from crashing. Two things that really annoy me:

Item One: You download the "latest" version of the software, install it, and reboot your computer, only to be told, as you boot up the latest version of the software, that there are updates available. You think: I was just on the site, spent 10 minutes going through countless screens clicking "next" and "accept", and I still don't have an up-to-date version of the software. Guilty: Adobe Acrobat Reader.

Item Two: Your software crashes, and you are sent to the website for an "update" (really just a patch for the buggy software). You download the update, and install it. The vendor has the bad manners to ask you to read and agree to an "End User License Agreement" for the patch, as though it represented some big favor for which the user must offer thanks and humility. Guilty: Microsoft. Why some updates can be downloaded automatically, but others require an EULA, is something only lawyers can understand.

Thursday, August 25, 2005


"no touch" service: there is no escaping context

Software is increasingly making some of the most important decisions affecting your life: how you receive medical treatment, if you get a mortgage, or if you qualify for insurance. According to IT guru Thomas Davenport in the current issue of the MIT Sloan Management Review, "decision making automation" is no longer a pipe dream, it is "coming of age."

Automated decision making capabilities "are embedded into the normal flow of work and they are typically triggered without human intervention." Davenport writes that the goal of businesses is to have processes executed with "no touch" treatment -- no human intervention required. Currently, some "exceptions" to automated rules require human attention, but businesses are working to eliminate these altogether.

One can understand the motivation of businesses to reduce costs through automation, but there is a cost to users/customers that isn't pleasant. For example, while generally positive about the possibilities of automated decision making, Davenport concedes that hospital managers punish doctors and nurses for "overriding" automated decisions.

Automated decision making can run roughshod over users and customers. Corporate managers need to think carefully about the stakes involved. Davenport mentions managers need to define the context and limitations for automated decision making, but he doesn't specify how that is done, other than to caution firms not to create an application that will get one sued for malfeasance.

Too often, automated decision systems are implemented without enough thought to context and limitations. Decisions involving people, by definition, involve many variables, because people are endlessly complex. If you design a mortgage decision application, you can probably get a good approximation of who qualifies for a loan into a software program. But it is only an approximation. We have all heard funny stories about qualified people turned down for something because they didn't meet a rigid rule, even though common sense tells us the rules shouldn't be a constraint in their case.
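The point can be made concrete with a toy rule-based screen. The rules and thresholds below are entirely hypothetical (no real lender's criteria), but they show how hard-coded rules misjudge an applicant that common sense would approve:

```python
# A toy rule-based mortgage screen. The rules and numbers are invented
# for illustration; the point is that rigid rules cannot see context.

def qualifies(income, years_employed, debt_ratio):
    """Approve only if every hard-coded rule passes."""
    if income < 40_000:          # rigid rule: ignores assets and savings
        return False
    if years_employed < 2:       # rigid rule: penalizes recent job changes
        return False
    if debt_ratio > 0.35:
        return False
    return True

# A wealthy retiree who just sold a business: large assets, no salary,
# zero debt. Common sense says low risk; the rules say no.
print(qualifies(income=0, years_employed=0, debt_ratio=0.0))  # False
```

The approximation works for the common case, and fails silently on exactly the cases a human clerk would wave through.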

Automated software can't cope with unanticipated or uncommon exceptions. Common sense can cope with such exceptions, often easily. But it is difficult to capture common sense in "knowledge management" software. It involves too much tacit knowledge, often from outside the immediate work domain.

Today, knowledge engineers are draining the brains of people, trying to codify how they think. The results, even if credible, are often brittle. Circumstances are always changing, but software decision engines can't necessarily adapt to these changes. Davenport notes:

As the ranks of employees in lower-level jobs gets thinner, companies might find it increasingly difficult to find people with the right kinds of skill and experience to maintain the next wave of automated decision systems.
Imagine how unresponsive industrialized service may become. Suppose someone with an emergency is stranded in an airport, due to a plane delay. Software reassigns people to alternate flights, using tested rules of fare paid for the ticket, and loyalty evidenced by frequent flyer miles. The software makes no allowance for the person in an emergency, and the person may not even be able to talk to someone about his situation.

When automation of decisions affecting people's lives goes too far, it will create some nonsensical situations. The aggrieved people whose situations were not considered by the requirements engineers will be very, very mad. Compared with common sense, the software will look stupid, and the companies uncaring.

Companies that don't want to be humiliated, or sued, need to do some deep contextual research if they are implementing automated decision systems that have the potential to be more than a mere annoyance to customers.

Tuesday, August 23, 2005


warren buffett on usability

"Start out with failure then engineer its removal."

Warren Buffett, speech at Emory Business School, 1991

Monday, August 22, 2005


convergence and requisite variety

What do you do when a video call interrupts your viewing of a streaming news clip that you are watching on your mobile phone? I faced this situation recently while trying out a 3G handset. I learned that the connection for the news clip stream got broken.

As network operators and handset makers cram more functionality into a device, users face a new kind of challenge. While device complexification is almost an iron-clad law of electronics, recent developments in mobile devices up the stakes. Unlike with standalone devices, mobile users don't have the option to ignore the stuff they don't understand how to use. Network connectivity forces users to deal with interruptions from others who want to communicate in an increasing range of ways.

Multifunctional mobile devices, and embedded pervasive computing, point to a world where users will have interaction foisted upon them, ready or not. The "law of requisite variety" suggested by cybernetic theorist Ross Ashby in the early 1950s is highly relevant. The law states that users need at least as many kinds of controls available to them as the variety of situations they need to control. For some reason requisite variety has received scant attention in the usability and HCI literature. One reason may be that traditional HCI has viewed the user as initiating interaction with a machine, rather than having situations thrust on him or her.
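At its core, Ashby's law is a counting argument: a controller with fewer distinct responses than there are distinct situations must respond identically to situations it ought to distinguish. A minimal sketch (my own illustration, using invented mobile-phone situations, not Ashby's formalism):

```python
# Requisite variety as a pigeonhole argument (illustrative sketch):
# with fewer controls than situations, some distinct situations are
# forced to share a response, so they cannot all be handled distinctly.

situations = ["voice call", "video call", "SMS", "email", "streaming clip"]
responses = ["accept", "reject", "defer"]  # only three controls available

# Best case: assign each situation some response.
mapping = {s: responses[i % len(responses)] for i, s in enumerate(situations)}

# At least len(situations) - len(responses) situations must collide.
collisions = len(situations) - len(set(mapping.values()))
print(collisions)  # 2: two situations share a response with another
```

However cleverly the mapping is chosen, five situations cannot be distinguished by three controls; only adding variety to the user's side closes the gap.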

One of the earliest thinkers on the topic of convergence was the late NEC chairman Koji Kobayashi. Twenty years ago he wrote a book called Computers and Communications where he argued users would someday use phones to tell computers to act on their behalf, doing smart work like translating conversations in real time. Kobayashi's answer to requisite variety was to hide it from users and have computers deal with the complexity presented by convergence.

Kobayashi penned his vision during the expansive 1980s, when Japan was funding 5th generation computing research aimed at creating rational AI machines. We are still far from realizing that vision.

For now, the chore of managing the complexity associated with convergence falls on the user. Requisite variety tells us there is no escaping the need to give the user ways to manage the complexity. Previously, rich functionality was something power users discovered, and mainstream users ignored. Now, power users, when contacting mainstream users, will force the mainstream to confront functionality they may have had no intrinsic interest in discovering. The need for good usability has never been greater.

Saturday, August 20, 2005


democracy is hard work

Ever since the infamous "butterfly ballot" in the US 2000 presidential election, usability professionals have taken a keen interest in elections.

New Zealand will hold parliamentary elections next month, and I have been interested to see a number of people mention that many voters are "confused" about how the NZ voting system works in practice. The confusion rests not with the ballot per se, but with how the ballots are counted and what consequences result.

About a decade ago, New Zealand adopted a voting system known as MMP, which gives people two votes, one for a member of parliament, another for a political party. After the last election, a government-commissioned survey (2003) found that only half of voters understood that the party vote determines the composition of parliament.

A recent survey shows understanding is improving, but still falls short of the ideal.

From the perspective of usability, MMP can be considered according to criteria found in ISO 9241, such as effectiveness, efficiency, and satisfaction.

I'm not sure any voting system would satisfy all these criteria, which were never meant to evaluate voting systems, but it is interesting to ask the questions. Democracy is hard work.


mobile applications: just-in-time, not simply anytime

As of last week New Zealand has two 3G networks running, so in theory we may see the development of nifty applications that make life richer and more productive while on the go. But I am skeptical this will happen. The applications and content I have seen so far for 3G look like they are designed to kill time, rather than make time more productive.

Mobile applications promise to deliver "anytime, anywhere." But people don't just want anything available at anytime at any random place they happen to be. What they want is specific content relevant to specific needs in specific circumstances.

One big 3G offering is video clips of movie trailers. I can see scenarios where that could be useful to have on a phone. Say you were with a group of people in a restaurant, deciding what movie to see. One person speaks enthusiastically about a film she saw a preview for, but others don't know about it. Download the trailer, and everyone can see it and decide if they want to go. So far, so good. But the decision is only half made. The group needs to find a cinema showing the film, get the schedule, and most importantly get tickets, which might be sold out. The entire process is wasted if, after making a decision and finding a nearby cinema showing the film at a convenient time, the screening turns out to be sold out.

I'm a last-minute planner, and get frustrated when trying to do seemingly simple things like eat a meal or watch a film on short notice. In Wellington at least, I can get an odd stare when I ask if there are tables available at an ordinary restaurant. Did I make a booking? Not me, I think about food when I'm hungry. I want to call up a list of restaurants near where I am, and see which of them have tables available. Why can't 3G help me avoid being a hungry nomad wandering from restaurant to restaurant looking for one with a vacant table? Such an application would tap into the power of "presence": showing one's availability to interact with others. A restaurant could indicate it has tables available, if it dared to disclose that some nights it is not fully booked. Generally businesses like to hide information about the availability of supply, and create the illusion that supply is scarce. But eBay, Travelocity and other online markets show that consumers want to know availability.

So far, even "location-based services" around the world are few in number; New Zealand has none at all. Location-based services are fine for finding a 24 hour ATM/cash point, but generally don't go far enough. What is needed is time-based, location-based services. People want to know what they can do nearby, at this very time.
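The core of such a service is a simple query: filter venues by distance and by a venue-published availability signal at the present moment. A minimal sketch, with all names and data invented for illustration:

```python
# Sketch of a time- and location-based availability query. The venue
# names, distances, and "presence" signal are hypothetical.

from dataclasses import dataclass

@dataclass
class Venue:
    name: str
    km_away: float      # distance from the user's current location
    tables_free: int    # venue-published "presence" signal, right now

def available_nearby(venues, max_km=1.0):
    """Return names of venues within range that report a free table."""
    return [v.name for v in venues
            if v.km_away <= max_km and v.tables_free > 0]

venues = [
    Venue("Harbour Bistro", 0.4, 2),
    Venue("Lambton Grill", 0.8, 0),   # nearby, but fully booked tonight
    Venue("Cuba St Cafe", 2.5, 5),    # free tables, but too far away
]
print(available_nearby(venues))  # ['Harbour Bistro']
```

The hard part is not the query but the incentive: venues must be willing to publish the `tables_free` signal in the first place.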

Time-killing mobile phone applications like games and music are fine for teens fleeing their parents' house. Time-enhancing applications are what the rest of us will need to embrace 3G.

Thursday, August 11, 2005


is work boredom an existential inevitability?

Today's Washington Post contains an article entitled Boredom Numbs the Work World. A cynic might comment, so what else is new? Isn't this just filler for a slow news day, as those living in the northern hemisphere are preoccupied with thoughts of summer holidays? But I would encourage the cynics to think a bit more about something so endemic it escapes serious discussion. The Post article notes:

boredom is a condition that can be more stressful and damaging than overwork, according to those who have studied the issue...a lack of autonomy and a job that has very specific instructions -- hits workers from the highest to lowest echelons of the working world
I don't have good statistics to cite, but I would argue that work-related boredom is increasing, as sociological and business trends move in opposite directions. Shoshana Zuboff and James Maxmin note in The Support Economy that the global rise of "psychological self-determination" is clashing with the relentless industrial logic of resource reduction, which seeks to control the scope of employee work with greater precision. As people become increasingly educated and socially independent they want more mental stimulation and latitude in their working lives. At the same time companies of all kinds want to make jobs more efficient and predictable, reducing work to a formula. Even though some companies have attempted to introduce "quality of work life" programs, these have failed for a multitude of reasons, mostly because they implicitly or explicitly undermine the authority of management.

In America at least, the corporate response to boredom has been to treat it as an attitude problem of employees. Zuboff and Maxmin cite a skills survey showing 80% of US employers "emphasized the importance of workers' attitudes and work ethic, while only 5% emphasized the cognitive abilities and growing skill demands." The message seems to be: leave the thinking to us, just smile and do as told. True, employees make decisions and are accountable, but they often have no real authority to make creative choices outside of established procedures.

While attitude can play a role in coping with job boredom, it is not a reliable strategy for addressing the sense of powerlessness many feel. Even the most devoted medieval monks, who strove to be pious employees, suffered from a condition known as acedia, or spiritual burnout. Attitude approaches often lead to burnout and more stress, as cognitive dissonance grows between the boring reality one experiences and how one is supposed to imagine that reality.

Rather than try to redesign people to fit a job, what is needed is to redesign jobs to fit the psychological needs of people. Human centered design needs to be applied to all kinds of work.

The Post article mentioned how airport x-ray scanning personnel are rotated every 30 minutes so they can maintain concentration when watching for dangerous articles. Such an approach is a small example of how human factors approaches can be applied to reduce boredom. For safety-critical work, removing boredom is not just something nice to do for the sake of workers, it is essential. But these approaches can be applied to all kinds of work, not just safety-critical ones.

An obscure discipline called macroergonomics is looking at how job boredom is a serious productivity issue. (Macroergonomics is the study of the design of work systems to fit the physical and socio-cognitive needs of individuals. Organizational psychology, in contrast, typically looks at more fuzzy issues such as organizational climate.)

Mitsuo Nagamachi at Hiroshima International University, for example, has done creative work on how jobs can be redesigned to improve employee satisfaction and productivity. The effects of monotony are not just subjective: a brain's EEG frequency differs when doing monotonous tasks compared with complex ones. In one example, Nagamachi describes how the employee union at a Japanese department store volunteered, and fought hard, to be allowed to increase the complexity and discretion of employee work, with a net effect that productivity increased. The transformation required the store to train everyone in merchandise ordering, when the function had previously been done by management. According to the logic of resource reduction, a centralized ordering function is more efficient. But devolving responsibility had spill-over effects not predicted by traditional efficiency analysis.

Much of the work looking at job redesign and productivity is done in places like Japan and Scandinavia, where employment relationships are longer-lasting than in much of the rest of the world. When employees work for the same employer for many years, it makes more sense for employers to be concerned with employee welfare, and to take risks to invest in new approaches with long term payoffs. But overall, the trend worldwide is for job relationships to be shorter. This shortening of employment is a consequence of the "psychological self-determination" mentioned by Zuboff and Maxmin (employees want the freedom to change jobs), and of the business logic of resource reduction, whereby companies want to minimize costs by keeping staffing flexible.

The needs of individual workers and the companies that hire them are moving in opposite directions, with dim consequences for both. Workers change jobs often, partly to flee boring work. Companies strive to make jobs as simple as possible, so they can reduce the expensive knowledge and training necessary to do the work. The more employee turnover a company experiences, the more it tries to simplify the work, so it can make new hires instantly productive. But at the same time, the simpler the job, the more boring it is, and the more likely people will leave. From the employee perspective, the more inclined employees are to "vote" with their feet and seek new thrills in new jobs, the less employers want to invest in making their work more stimulating. Even at the highest levels of organizations, job turnover can be high (just look at the six-month tenures of many CEOs.) Employee turnover forces organizations to worry about job continuity, which encourages rigid work systems.

How to resolve the conundrum of job boredom is not obvious. But it must be addressed, and I am optimistic it can be. When one asks business people what keeps them up at night, it often is the worry they lack the creative nous to survive the crushing pressure of competition. Let's hope the need of organizations to reinvent themselves will force them to confront how to make work interesting. Doing so will stimulate the mental energy of employees to develop creative solutions to competitive challenges.

Wednesday, August 10, 2005


trend to watch: usage studies

A very interesting recent development has been the marriage of device-use logging data with qualitative methods to develop a more complete and rich view of situated data. Earlier this year CHI held a panel, "Usage Analysis: Combining Logging and Qualitative Methods." Think of it as updating the unobtrusive measures approach pioneered by Eugene Webb some 40 years ago.

The approach brings grounding to observational methods. Observational methods are resource intensive, which means that "exploratory" observational research is often not efficient. Rather than looking at everything, suppose one could focus observation on specific issues that seem peculiar and that one would like to know more about. Log data might offer guidance for focusing research.

Analyzing log data has become standard practice for web sites, but the possibilities are far broader. Server logs exist not just for web sites, but other server-hosted applications as well, which can be an interesting window into the behavior of remote users, especially mobile ones. Client-side logging would seem to generate too much data for sustained exploratory analysis. But one can choose to automatically log only certain events, if one is interested in looking at a particular behavior or application. Something as simple as a cell phone call registry contains potentially valuable information. In addition to server and client based information, the telecommunication network captures interesting usage information, such as who was called, duration of call, or data file sizes. Such billing information can be easily mined for patterns. The information from all these sources can reveal screen interaction data, user activity data (What applications does the person use? At what locations was the person?), and social interaction data (With whom does the user communicate?).

One powerful benefit of log data is that it is time-stamped. One gains insight into how people spend their time. One can also correlate different events, to examine how people juggle tasks, deal with interruptions, or work with colleagues. I can even imagine tying log data to other time-stamped information. Suppose you had a networked self-service kiosk in a public space. You notice from server logs unusually heavy usage at a certain time. You might be able to review closed circuit video of that same time to see how customers were dealing with the situation.
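The kiosk scenario needs nothing fancier than counting events per time bucket. A minimal sketch, with an invented log format and invented timestamps:

```python
# Mining time-stamped logs for usage peaks (illustrative sketch; the
# log format and entries are invented). Finding the busiest hour tells
# you which stretch of CCTV footage is worth reviewing.

from collections import Counter
from datetime import datetime

log_lines = [
    "2005-08-10 09:12:03 kiosk-01 session_start",
    "2005-08-10 09:45:51 kiosk-01 session_start",
    "2005-08-10 12:01:10 kiosk-01 session_start",
    "2005-08-10 12:05:44 kiosk-01 session_start",
    "2005-08-10 12:30:02 kiosk-01 session_start",
]

def peak_hour(lines):
    """Count events per hour of day and return (hour, count) for the peak."""
    hours = Counter()
    for line in lines:
        ts = datetime.strptime(line[:19], "%Y-%m-%d %H:%M:%S")
        hours[ts.hour] += 1
    return hours.most_common(1)[0]

print(peak_hour(log_lines))  # (12, 3): the noon hour is the one to examine
```

The same grouping works for any time-stamped source: call registries, billing records, or server logs, binned by hour, day, or location instead.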

As devices continue to gain functionality (GPS, video, etc.) the potential for log data will expand even more. Such data can provide a "big picture" view of behavior, and also provide a springboard for follow-up contextual research.

Monday, August 08, 2005


we've noticed that customers who...

I am starting to get too much spam from Amazon. Typically, the email says "we've noticed that customers who have bought X have also bought Y." Lovely for them. What the hell does that have to do with me? Very little, it turns out.

Last week I got a herd-mentality recommendation for a new jazz album. Amazon thoughtfully noticed I bought a 1962-issue jazz album some three years ago, and assumed I am interested in some contemporary jazz performer I have never heard of who has just issued an album. Problem is, I generally hate contemporary jazz. Out of curiosity I previewed the album but could find no stylistic relationship with my earlier purchase. Frankly, I doubt that a data correspondence even exists. The page for the newly released album listed 18 other "similar" albums, none remotely of the 1960s jazz vintage.

Today, another email from my faithful correspondents at Amazon. They have "noticed" that people who bought a certain book are pre-ordering a new book by the same author. This time the new book's web page does list the previous book as "similar". How similar? They are the same book: the soon-to-be-released edition simply has a different title and publisher, and will be in paperback instead of hardback. Are scores of people so stupid as not to guess they are the same book, or is the data matching so primitive that it assumes people will buy anything by the same author?
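It is worth seeing how little machinery "customers who bought X also bought Y" actually requires. A naive co-occurrence recommender, with invented purchase data, shows the problem: it counts shared baskets and knows nothing about style, era, or content.

```python
# A naive co-occurrence recommender (illustrative sketch; the purchase
# data is invented). It ranks items by how often they appear in the
# same basket as the target item -- nothing more.

from collections import Counter

purchases = [
    {"1962 jazz LP", "modern jazz CD"},
    {"1962 jazz LP", "modern jazz CD", "pop CD"},
    {"modern jazz CD", "pop CD"},
]

def also_bought(item, baskets):
    """Rank items that co-occur in baskets containing `item`."""
    co = Counter()
    for basket in baskets:
        if item in basket:
            co.update(basket - {item})
    return [i for i, _ in co.most_common()]

# Two shared baskets suffice to pair a 1962 recording with a
# contemporary release, however unrelated the music sounds.
print(also_bought("1962 jazz LP", purchases))
```

With no notion of why items co-occur, the system happily recommends a repackaged edition of a book to people who already own it.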

Thursday, August 04, 2005


"what adapts? technology or people?"

Don Norman asks a very important question: do people adapt to technology, or does technology adapt to people? He points out that people can and do adapt to klutzy (unnatural) technology. He cites three examples of unnatural technology people happily use: the clock, writing systems, and musical instruments.

While Norman chose his examples because they are ordinary, they are also very old technologies. All of them developed before anyone thought to use a human-centered approach to design. And all of them involved many centuries of effort by humankind to master them. While successful, these unnatural technologies were hardly instantly successful.

Let's take musical instruments. Norman notes that repetitive stress is a common problem associated with playing many instruments, such as violins. Despite this, people play violins; therefore, people have adapted to the violin. Well, not quite. They still get repetitive stress. If they had fully adapted, repetitive stress would be a non-issue. If humans could adapt fully, people might need special time-consuming exercises to conquer repetitive stress, but at least that inefficient work would prevent injury. As it stands, there is no fool-proof way to avoid repetitive stress injury.

The violin is also hard to play (I know from childhood experience.) If people could adapt easily to the demands of the violin, then there would be no need for Suzuki schools and similar punishments inflicted on children.

The design of the violin remains unchanged because people choose to suffer. Suffering is part of the prestige of playing a violin (prestige I am happy to forego.) If the violin were easy, would anyone bother?

The guitar is supposed to be easy (relative to the violin at least.) People interested in taking up the guitar are not generally masochistic, at least in the beginning. Not surprisingly, guitar development has benefited from human centered design. Kim Vicente tells a wonderful story about how the Fender Stratocaster guitar was constantly redesigned based on feedback from musicians. Leo Fender made it easier to handle, more comfortable, and easier to play. The electric guitar was developed using human centered design principles. Vicente argues that without those improvements, rock music (Beatles, Stones, Who...) might never have taken off like it did.

So I agree with Norman that people can adapt to technology, but it hardly follows that they want to adapt. Hundreds of millions use computer keyboards, but millions of people suffer repetitive stress from keyboarding. PCs are 25 years old, but we haven't adapted to the keyboard successfully.

Tuesday, August 02, 2005


branding and government: a pointless combination

I'm not one of those "anti-government" people who whines about there being too much government, or complains that all government is incompetent. Actually, I think the concept of government gets a bad rap. I accept government as necessary for a well-functioning society. It may not always work as well as we would like, it can often be improved, but one needs to accept government for what it is, and not pretend it is a corporation.

While I accept the value of government, I also tend to limit my interaction with it where possible. I don't, for example, surf government websites for the pleasure of it. I consult a government website only when I have some unavoidable need, like getting my car registered, or paying taxes. But somehow these simple certainties of life have become more uncertain, thanks to the pernicious influence of corporate branding.

I recently had to do my UK taxes, and found I needed a certain form. I googled "Inland Revenue" and UK, but only got a slew of websites belonging to tax preparers and accountants. To where had Inland Revenue disappeared? They had sent me mail only a few weeks ago. Further hunting revealed something called HM Revenue and Customs. I discovered that HM Revenue and Customs is the new, re-branded Inland Revenue. To my ears, the new organization sounds like its main mission is to collect fines from people bringing too much booze into Heathrow.

A similar frustrating experience happened when I needed to get my car registration updated. The responsible government department in New Zealand had decided to rebrand, a process that would take several months. In the interim, I get mail with both the old and new names on it, depending on what stationery is being used. Having recently arrived in New Zealand, I am confused about which is the old name and which the new, and don't know what name will be listed in the phone book (when did the name change happen, and when was the phone book printed?)

However trendy discussion of "lovemarks" may be in the business world, I don't think they are appropriate for government. A wise person, maybe it was Steve Krug, said nearly a decade ago that users don't want to know your organization chart. In the early days of the web, many companies structured their websites around their internal functional departments, instead of structuring information from a customer-centric perspective.

The last thing I want to do when paying my taxes or registering my car is to spend time figuring out what part of government is responsible for it. Perhaps well-meaning government officials believe that sending me announcements of an impending name change will help me learn who to contact. But it would be far simpler if governments did not change their names, ever, even when they do a bureaucratic merger or shuffle. And if you collect taxes, call yourself the tax department, not some euphemism like "revenue." Revenue from what? Sales of lottery tickets?

I am encouraged by the approach in the UK of Directgov, which offers a customer-centric view of government services. People don't care what the name of a department is that processes a form, they simply want to complete the form and have it acted on.
