Empathic research methods and design strategy
Getting beyond the quantitative/qualitative debate
Adam Silver, a Strategist at Frog Design, recently wrote an insightful article, "Calculated Design", in the company's online magazine, design mind. I want to discuss the article because it touches on several key issues relating to innovation and designing products and services for the experience of users/customers. Adam notes that as globalization and digitalization emerged in the 1990s, product and service interfaces faced more culturally diverse and geographically distributed audiences as well as a fragmented market. The combination of these forces led designers to search for new methods to augment artistic intuition. Considerations of form and function now also required attention to feel, features, and interactivity attuned to the needs, wants, and beliefs of specific users/customers.
As Adam observes, ethnography was one of the first new methods incorporated by design research to meet these challenges in the market. However, he thinks ethnography is, on its own, unable to provide the kind of information needed to validate product and service ideas across wide audiences. He notes:
Ethnography breaks down at the moment we ask not just for depth of knowledge, but breadth. Anyone who's struggled to conduct a massive ethnographic study across multiple time zones can tell you this firsthand. While ethnography facilitates the generation of ideas in relation to specific users and use scenarios, it leaves us clueless as to which among these will satisfy a wider audience. Ultimately, we need complementary methods that scale more effectively and validate our work in a way clients can understand. What we need is quantitative research...
But how? Just as ethnography borrowed heavily from academia while applying a looser, more liberal lens, quantitative research can be similarly engaged. When individual observations can be contextualized within a data-driven knowledge of the market at hand, designers can have the best of both worlds. And there are many analytical tools that work well in this context. Segmentation analysis can be used to challenge thinking around current and prospective users, sorting consumers into salient, sometimes unexpected groups that hold together based on survey data - groups that defy traditional demographic segments can be linked by more relevant factors, such as behavioral patterns or attitudes towards technology.
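To make the segmentation idea concrete, here is a minimal sketch of clustering survey respondents into groups based on their response patterns rather than demographic labels. The data, the number of segments, and the choice of k-means are my own illustrative assumptions, not anything from Adam's article or projects.

```python
# A minimal sketch of survey-based segmentation, assuming hypothetical
# Likert-scale survey data; k-means is one of several clustering choices
# an analyst might make here.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(seed=42)

# Hypothetical responses: 500 respondents x 6 survey items,
# e.g., attitudes toward technology scored 1-5.
responses = rng.integers(1, 6, size=(500, 6)).astype(float)

# Standardize so no single survey item dominates the distance metric.
scaled = StandardScaler().fit_transform(responses)

# Sort respondents into four groups based purely on response patterns,
# not on traditional demographic segments.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42).fit(scaled)

for segment in range(4):
    members = responses[kmeans.labels_ == segment]
    print(f"Segment {segment}: n={len(members)}, "
          f"mean item scores={members.mean(axis=0).round(2)}")
```

The point of a sketch like this is that the groups emerge from the data; interpreting what holds each group together is still the researcher's job.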
Adam makes several very good points in his analysis of what quantitative methods can bring to design research. Though he recognizes the importance of sustaining a focus on users, I suggest that his discussion does not give enough explicit recognition to the role of empathy in maintaining a productive relationship between qualitative and quantitative methods in research for experience design. Making methods serve an empathic purpose in the design of products and services is a key underlying principle, regardless of whether the techniques are quantitative or qualitative.
Consider the project example Adam offers involving a redesign of a corporate Intranet for a Fortune 500 company.
Without the ability to individually question the organization's hundreds of thousands of employees, spread across some thirty countries, we did the next best thing: we interviewed 10,000 of them online. We asked them what was wrong with their current Intranet experience. What did they love? What did they hate? How could things be better? We did an online "card sort" in which we asked users to prioritize the content that mattered to them, then posed a series of free-response questions, in which they could say whatever they wanted.
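As a loose illustration of what an online card sort yields, here is a minimal sketch that aggregates priority rankings across respondents. The content items and rankings are invented for the example, not taken from the project Adam describes.

```python
# A minimal sketch of aggregating card-sort priorities across respondents,
# using hypothetical Intranet content items and rankings.
from collections import defaultdict

# Each respondent ranks content items from most (1) to least important.
rankings = [
    {"News": 1, "HR forms": 2, "Directory": 3, "Cafeteria menu": 4},
    {"Directory": 1, "News": 2, "HR forms": 3, "Cafeteria menu": 4},
    {"HR forms": 1, "Directory": 2, "News": 3, "Cafeteria menu": 4},
]

totals = defaultdict(list)
for ranking in rankings:
    for item, rank in ranking.items():
        totals[item].append(rank)

# A lower mean rank indicates higher shared priority across respondents.
for item, ranks in sorted(totals.items(),
                          key=lambda kv: sum(kv[1]) / len(kv[1])):
    print(f"{item}: mean rank {sum(ranks) / len(ranks):.2f}")
```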
Without knowing the strategic purpose behind the project, it is difficult to gauge whether a traditional ethnographic approach might have worked as well as the methods chosen. Adam's point seems to assume that ethnography only works well if every employee is interviewed. I don't think most ethnographers would agree that this expectation is either realistic or necessary for the participant-observer method to provide effective results. Adam's critique also seems misplaced unless the 10,000 online interviews resulted from random sampling, which he does not say. Regardless of the answer, I suggest that either approach can incorporate principled empathic consideration for the meaningful experiences of the Intranet's users.
Adam obviously recognizes the point made in the last paragraph, since the techniques used at the front end of the project, before quantitative analysis was applied to the data, were informed by an empathic concern for those using the Intranet, though Adam doesn't explicitly take note of it. The questions the team asked tapped into the meaningful experience of those users, and the online "card sort" provided additional qualitative information. Whether participant observation with carefully chosen ethnographic subjects might have produced the same insights is a fair question. After all, Adam urges design researchers not to get hung up on the academic roots and concerns of quantitative methods, such as sampling, reliability, and validity, and I agree with him on that point.
Once the Intranet project had collected all the responses to the online survey's open-ended questions, Adam notes:
We then tapped a vendor to break this sentence-level data into quantitative codes, creating a massive tally for common response themes like "It's slow" or "I can't find what I'm looking for." Once we were able to look at this feedback quantitatively, common themes emerged...Some insights were limited to specific regions or business units, while others resonated with nearly all respondents, revealing the unique considerations of various user groups within the organization. Together, we synthesized 4,000 pages of tabs into fifteen slides, weaving in insights from secondary research, stakeholder interviews, industry best practices, and our own perspective to make a strategic recommendation to the client. When possible, we showed quantitative responses side-by-side with quotes from respondents to illustrate nuance and context while summarizing key themes.
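Adam's team used a vendor for the coding step, so the following is only a minimal sketch of the general technique: mapping free-response text to theme codes with a hand-built keyword codebook, then tallying the codes overall and by region. The codebook, responses, and regions here are all hypothetical.

```python
# A minimal sketch of turning sentence-level feedback into quantitative
# theme codes, assuming a hypothetical keyword codebook and responses.
from collections import Counter

codebook = {
    "slow": "It's slow",
    "find": "I can't find what I'm looking for",
    "search": "I can't find what I'm looking for",
    "login": "Access problems",
}

responses = [
    ("EMEA", "Pages load so slow I give up."),
    ("EMEA", "Search never returns what I need."),
    ("APAC", "I can't find the HR forms."),
    ("Americas", "Login fails every Monday."),
]

overall = Counter()
by_region = Counter()
for region, text in responses:
    lowered = text.lower()
    # Use a set so one response counts each theme at most once.
    themes = {theme for keyword, theme in codebook.items()
              if keyword in lowered}
    for theme in themes:
        overall[theme] += 1
        by_region[(region, theme)] += 1

print("Overall theme tally:", overall.most_common())
print("By region:", by_region.most_common())
```

Seen this way, the quantitative codes are simply a compact summary of qualitative material, which is why pairing the tallies with verbatim quotes, as Adam's team did, preserves nuance and context.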
It seems to me that the importance of the approach Adam advocates for using quantitative data in design research does not come down to traditional academic concerns about whether the results reliably predict this or that outcome from proposed design changes. Gerald Zaltman made a similar point in How Customers Think, noting that "the various pieces of information that we gather through statistics, personal observations, and other data sources...are all stimuli that influence our thoughts, feelings, and behaviors. By viewing data in this way, most managers suddenly see the value of collecting multiple kinds of data" (p. 275).
Data don't speak for themselves so much as stimulate meaningful conversations among design researchers, users/customers, and client management. Those conversations can result in interpretations that make a difference to the design of the products and services the business offers. In my reading, this is the main point of Adam's article, and it is a key one. I suggest that design practice works best when the research team doesn't choose between artistic and scientific techniques, whether qualitative or quantitative. The most effective design practice crafts a meaningful experience for users/customers from the tension between the two (art/science), and offers insights managers can relate to regarding the likely benefits for the business, e.g., ROI, time to market, product differentiation, cost containment, and market share.
Larry writes the blog Skilful Minds.