I’ve been a bit of a pizza snob ever since my first job making pizza at the age of 15. I’ve always had a pretty low opinion of Domino’s Pizza®, so I was very impressed when I saw Domino’s recent efforts to reinvent both their pizza and their brand identity by tackling their problems head-on. When I first saw their ads and online video, I was amazed that they had chosen to use actual user feedback that denigrated their product. It’s extremely rare to see companies admit their own faults so bluntly and publicly, but the best way to deal with the elephant in the room is to acknowledge it, confront it, and ultimately overcome it. We have yet to see whether Domino’s can successfully overcome their negative brand perception, but the message they’ve sent has been enough to motivate me—someone who had previously sworn never to eat Domino’s Pizza again—to give them another try.
Their rebranding effort has brought up quite a few memories for me. I’ve seen the pain on the faces of designers and engineers when users eviscerate the products they’ve spent months or years developing. I also know convincing stakeholders they need to rethink a product can be a painful and costly endeavor. However, one of my most salient memories involves interacting with research participants who had their minds made up about the quality of a product before they ever laid eyes on it. Such preconceived notions can be another elephant in the room—a barrier to achieving accurate and actionable feedback on a concept or design.
The Angry Elephant
A short time ago, my business partner and I were giving a talk about communication and research at an ecommerce company, and someone asked a very interesting question. What was our most difficult interaction with a research participant? The question instantly brought to mind this memory: Some time ago, we were asked to conduct a focus-group session with a specialized group of participants. In this particular case, the focus-group participants had previously tested builds of the product, and their feedback had been overwhelmingly negative. Our client, to their credit, had already decided to redesign the product from the ground up and realized they needed direction. The company had chosen to bring in a fresh research team, so they hired our company to engage users, find out exactly what was wrong with the current design, and provide guidance on how they could redesign the product.
I’ll never forget walking into the focus-group room to greet these participants and being met with glares bordering on open hostility. We explained that we were performing research the company intended to use to improve the design of the product, but that wasn’t anything they hadn’t heard before, and they were understandably skeptical. Our initial attempts to extract meaningful information fell flat as the participants just spat out short comments, then quickly clammed up. It was obvious we would have to win them over and show them we were on their side. Vague phrases such as “we recognize that there are problems with the design” were not enough to assuage their doubts. To show them we were truly ready to make serious changes, we had to do what Domino’s has done in their recent advertisements: be brutally honest about the failings of the existing design. So, after going through an extemporaneous list of the product’s design shortcomings, we were finally able to show the participants that we understood their pain, cared about their feedback, and were ready to make changes.
This kind of experience emphasizes two essential parts of communicating successfully with research participants: establishing objectivity and building alliances.
Establishing Objectivity
When engaging with user research participants, it’s essential that you put some distance between yourself and the product concept or design. If you have any biases or personal connections with the product, participants may feel uncomfortable giving you negative feedback, because they unconsciously sense your disappointment. For this reason, we commonly advise that designers not lead user research for products they’ve designed. If a dedicated user researcher is not an option, we typically advise companies to use a different designer who has no personal attachment to the project. When we are interacting with participants, very early in our conversations with them, we make a point of letting them know that we did not design the product, so anything they might say will not hurt our feelings. This gives participants the freedom to offer negative feedback without the social stigma of being mean or hurtful.
Building Alliances
When taking on the role of user researcher—whether you are actually a user researcher or are a designer, engineer, product manager, or in another profession—it is important to take to heart that, for that period of time, you are on the side of the users. A user researcher is a user advocate. It is his or her job to prioritize the needs of users, even if that creates additional difficulties for the product team. In doing this, a researcher builds alliances with participants, so participants feel that the researcher—and thus, the company—values their feedback and will use it to influence the design of the product. While establishing objectivity can help participants feel free to speak openly, building alliances can motivate them to participate actively in a user research session. It’s another strategy you can use to maximize the amount of actionable user feedback you’ll get from a research session.
The Happy Elephant
Of course, preconceived notions about the quality of a product can also swing in the other direction. A company that enjoys a strongly positive brand perception may have difficulty getting objective feedback about a product, because participants have already made up their minds to love the product before they even lay eyes on it. We’ve tested products with participants who were brand loyalists and who overlooked numerous faults in a product’s design. In one particular instance, a participant had difficulty performing a majority of the tasks during a usability test, but reported that he loved the design and that we shouldn’t change a thing. Only after we pointed out and asked about the specific areas of difficulty he had encountered did he finally admit that there were aspects of the user interface that should change. There are two methods that come to mind for dealing with this type of issue: anonymous testing and prioritizing objective data.
Anonymous Testing
The easiest way to overcome such brand bias—whether participants have positive or negative perceptions of a company’s brand—is simply to avoid it entirely. If it is possible to conceal the identity of the company developing the product, it can be useful to do so. Often, by not associating a brand with a product or concept, we can acquire a more accurate understanding of users’ reactions to the product itself. As consultants, we can accomplish this by performing user research under our own company name. We can bring participants to our offices and use prototypes and builds that show no company logos. It can be much more difficult to establish anonymity if we are testing at our clients’ offices, but, of course, it isn’t always necessary to do so. We usually suggest some type of anonymous testing when we’re working with companies that are trying to overcome a strongly negative or positive brand perception. However, anonymous testing isn’t necessary for companies that are less well known or have a more neutral brand perception.
Prioritizing Objective Data
Objective data refers to information user researchers collect through direct observation of participants, while subjective data refers to information researchers collect through participants’ verbal reports. In cases where you do encounter real brand advocates, it is important to incorporate objective data into your research. For example, during a usability test, we note specific instances in which a participant has difficulty with a device or user interface. Then, when interviewing the participant, we typically make statements like this: “It’s great that you loved the Web site, but you seemed to have difficulty doing XYZ.” A participant usually responds by acknowledging the difficulty, then commenting on their expectations or offering suggestions for improving the user interface. There are also ways to objectively document emotional responses such as frustration, disgust, and happiness. Learn to recognize such emotional cues, document them, and inquire about them when doing any type of product research, including generative research such as ethnography. Typically, objective data is more accurate than subjective data, so it is important to prioritize data accordingly when findings are inconsistent.
Conclusion
Participant biases, both positive and negative, can get in the way of obtaining useful feedback when performing concept or usability testing. There are a number of ways to work around these biases, as follows:
· Establish objectivity. Show participants that you are not personally attached to a concept or design, allowing them to provide feedback without hurting your feelings.
· Build alliances with participants. Show participants that you are on their side and invested in developing a product that will be both useful and enjoyable for them to use.
· Use anonymous testing. Filter usability testing through a subsidiary or a contracted agency to avoid any brand bias entirely.
· Prioritize objective data. What users actually do is more important than what they say they do.
If you take these lessons to heart and apply them appropriately, they can be enormously helpful in making some of your more difficult interactions with users much more productive.
This article was first published as a column on UXmatters.com.