Monday, November 07, 2005
user research is about data
I am worried about how methods dictate design outcomes without the involvement of user research. The signs of method-itis are often hard to detect, because many methods purport to infer what users want without doing the donkey work of actually consulting users themselves.
User centered design is about looking at what users need and want, which comes from extensive research. No extensive research, no user centered design. But some people imagine that because they want to help users, and because they are ever so empathic toward users, they are therefore user centric. They simply "know" what users want, through their own experiences, by dissecting hypothetical issues, or by walking in the shoes of users, imagining what the user would do. Somehow they forget the central issue: you can't know what users want except by actually looking at their behavior and preferences from all perspectives. People who think they know what users want without doing research have either worked on a very narrow issue too long, or are not very competent. The road to hell is paved with good intentions. The road to user centered design is paved with facts.
Data may seem lifeless, a sideshow to the main story. Market researchers are often criticized for generating confusing data that doesn't point the way forward. They are frequently guilty of developing shallow data: surveys that don't explain why users behave as they do, and focus groups that yield a random collection of impressionistic feedback rather than revealing what users actually do and truly need.
Useful user research, whether quantitative or qualitative, involves structure in both collection and analysis. Unfortunately, such structure is often lacking in design approaches that take the end as the beginning (i.e., designing to fit users to a preconceived activity, scenario, or use). Frequently these approaches are based on a fictional person: an imagined typical user, or an imagined extreme user (an outlier case). The users weren't identified according to how representative they were, and they weren't studied over enough time, or across enough variant circumstances, to determine which themes are genuinely common and which are unique.
I am a big fan of the possibilities of qualitative research, but I find that math phobes make the worst qualitative researchers, because they don't understand the notions of sampling and significance. One can be qualitative by taking a detailed, structured sample of a small group of people to probe inter-relationships, or by lightly observing a wide group of people to find common themes. But whatever the approach, it needs to be robust, ideally drawing on multiple perspectives. I highly recommend the books of DVL Smith and JH Fletcher on the relationship between qualitative and quantitative research.
The major question any method needs to answer is: how do you know your conclusions are right? Unacceptable answers: people say it sounds right when you tell them, or other people who follow the same method reach the same conclusion. Acceptable answers: you used multiple research techniques to search for disconfirming evidence, and you tested the design implications of your research conclusions through user testing.