Refining Design for Business:
Using analytics, marketing, and technology to inform customer-centric design
Michael Krypel
This Adobe Press book is published by Peachpit.
For information on Adobe Press books and other products, contact:
Peachpit
www.peachpit.com
For the latest on Adobe Press books, go to www.adobepress.com
To report errors, please send a note to [email protected]
Peachpit is a division of Pearson Education.
Copyright © 2014 by Michael Krypel
Project Editor: Valerie Witte
Production Editor: Becky Winter
Developmental and Copyeditor: Anne Marie Walker
Proofreader: Liz Welch
Composition: Danielle Foster
Indexer: James Minkin
Cover and Interior Design: Mimi Heft
Cover and Interior Illustrations: Paul Mavrides
Notice of Rights
All rights reserved. No part of this book may be reproduced or transmitted in any form by any means,
electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of
the publisher. For information on getting permission for reprints and excerpts, contact [email protected].
Notice of Liability
The information in this book is distributed on an “As Is” basis, without warranty. While every precaution
has been taken in the preparation of the book, neither the author nor Peachpit shall have any liability
to any person or entity with respect to any loss or damage caused or alleged to be caused directly
or indirectly by the instructions contained in this book or by the computer software and hardware
products described in it.
Any views or opinions presented in the interviews in this book are solely those of the author and
interviewee and do not necessarily represent those of the companies included in this book.
Trademarks
Many of the designations used by manufacturers and sellers to distinguish their products are claimed
as trademarks. Where those designations appear in this book, and Peachpit was aware of a trademark
claim, the designations appear as requested by the owner of the trademark. All other product names
and services identified throughout this book are used in editorial fashion only and for the benefit of
such companies with no intention of infringement of the trademark. No such use, or the use of any
trade name, is intended to convey endorsement or other affiliation with this book.
ISBN-13: 978-0-321-94088-9
ISBN-10: 0-321-94088-1
987654321
Printed and bound in the United States of America
To Darren Johnson, Jennifer Sun, Kripa Nerlikar,
Ramona Meyer-Piagentini, and Stephen Ratpojanakul:
Thank you for your guidance and enthusiasm over the last eight years,
and for your tremendous contributions to building the optimization field.
Acknowledgments
I will be forever grateful to all the wonderful people from Adobe, Omniture, and
Offermatica for allowing me to be part of their exceptional teams, including Aaron
Graham, Adam Fayne, Adam Justis, Adam Wood, Alan Gurock, Amy Lam, Andre
Prevot, Ann Chen, Aseem Chandra, Barbara Dawson, Bianca Slade, Bill Ozinga, Bill
Peabody, Brad Kay, Brent Dykes, Brian Hawkins, Brian Ivanovick, Brittany Chandler,
Cameron Barnes, Christi Terjesen, Christine Yarrow, Christy Armstrong, Colin Lewis,
Colleen Nagle, Daniel Hopkins, Daniel Wright, Darren Johnson, Darrin Poole, David
Baker, David Hoye, David Humphrey, Debra Adams, Derek Bryce, Don Abshire, Doni
Lillis, Doug Mumford, Drew Burns, Drew Phillips, Eddie Ramirez, Ehren Hozumi,
Garrett Ilg, Gene Holcombe, Georgia Frailey, Heather Razukas, Hiro Awanohara, Jacob
Favre, James Roche, Jameson O’Guinn, Jamie Stone, Jason Haddock, Jason Hickey,
Jason Holmes, Jeff Fuhriman, Jennifer Sun, Jim Sink, John Kucera, John Mosbaugh,
Jonathan Mendez, Jonathan Weissbard, Justin Patrick, Ka Swan Teo, Kaela Cusack,
Katie Cozby, Kellie Snyder, Kendra Jenkins, Kevin Lindsay, Kevin Scally, Kevin Smith,
Kripa Nerlikar, Kyle Ellis, Kyle Johnson, Lacey Bell, Lambert Walsh, Lance Jones, Lily
Chiu, Liz Quinn, Mandeep Sidhu, Mark Boothe, Matthew Lowden, Matthew Roche,
Matthew Smedley, Matthew Thurber, Michael Curry, Michael Evensen, Mikel Chertudi,
Neha Gupta, Norman Dabney, Paige Burton, Peter Callahan, Rachel Elkington, Rameen
Taheri, Ramona Piagentini, Rand Blair, Reuben Poon, Richard Oto, Rob Cantave, Ron
Breger, Rotem Ben-Israel, Russell Lewis, Sachie Reichbach, Sarah Ferrick, Serge St. Felix,
Shoaib Alam, Stephen Frieder, Stephen Ratpojanakul, Steve O’Neil, Thejas Varier, Tiffany
Olejnik, Tom Ratcliff, Tony DiLoreto, Tracy Harvey, Vincent Cortese, Vladimir Sanchez
Olivares, Wallace Rutherford, Whitney Littlewood, and Zoltan Liu.
I am extremely thankful to everyone who agreed to be interviewed or who helped
connect me with people to interview, including Justin Ramers from Active Network;
Andrew Switzer from Ally Bank; Tom Lau from American Express®; Kevin Gallagher
and Thomas Gage from AutoTrader; Adam Crutchfield from Axcess Financial; Brandon
Proctor and Justin Bergson from Build.com; Chris Kahle from Caesars; Joanne Pugh and
Stephanie Paulson from Central Restaurant Products; Kyle Power from CHG Healthcare;
Christine Cox, John Williamson, Jonathan Stein, and Ruth Zinder from Comcast; Ed Wu,
Emily Campbell, Étienne Cox, Isabelle Mouli-Castillo, Joel Wright, Lester Saucier, Nazli
Yuzak, and Will Close from Dell; Sandy Martin from Dollar Thrifty; Zimran Ahmed
from Electronic Arts; Nate Bolt and Slater Tow from Facebook; Thomas Jankowski from
FlightNetwork; Karina van Schaardenburg and Simon Favreau-Lessard from Foursquare;
Jerome Doran, Jon Wiley, and Krisztina Radosavljevic-Szilagyi from Google; Linda Tai
from Hightail; Phil Corbett from IBM; Ajit Sivadasan, Ashish Braganza, Lewis Broadnax,
and Siping Roussin from Lenovo; Amy Parnell and Lea Ann Hutter from LinkedIn; Pete
Maher from the Luma Institute; Kenyon Rogers from Marriott; Peter Davio and Steven
Webster from Microsoft; Kyle Rush from Obama for America; Blake Brossman, Natalie
Bonacasa, and Ujjwal Dhoot from PetCareRx; Amit Gupta from Photojojo; John Pace
from RealNetworks; Chris Krohn, Phil Volini, and Sarah Nelson from Restaurant.com;
Matt Curtis and Roger Scholl from Saks Fifth Avenue; Ryan Pizzuto from T-Mobile;
Eileen Krill and Mary Bannon from The Washington Post; and Matthew Pereira and Rob
Blakeley from WebMD.
Thank you to Kelly Patterson for her fabulous editing, to Rosemary Knes for her careful
proofing, and to Jeff Patterson for introducing me to them. Thank you also to everyone
at the Adobe Press and Pearson Education for their hard work, including Anne Marie
Walker, Becky Winter, Damon Hampson, Danielle Foster, Jim LeValley, Liz Welch, Mimi
Heft, Ted Waitt, Valerie Witte, Victor Gavenda, and Vidya Subramanian Ravi.
Thank you to my mom, Merrill Janover, whose curiosity, creativity, and love of learning
are an inspiration, and to my brothers, David Krypel and Brian Krypel, for their
encouragement and comfort. Thank you to Alan Schorn, Bob Klein, Chayym Zeldis,
David Roth, Elizabeth Metz, Jarek Koniusz, Jeffrey Gilden, Jerry Denzer, John Cave, John
Zannos, Joseph Rutkowski, Nina Zeldis, Philip Sorgen, Robert Abrams, and all my other
wonderful teachers for their guidance, confidence, and inspiration. Thank you also to
Daniel Schweitzer, Bonnie Walters, Greg Lowder, and Scott Epstein for their feedback
on drafts of the book and for their amazing support.
Contents at a Glance
Part 1 Creating Engaging Customer Experiences 1
Index 279
Contents
Introduction xiv
  The Age of Optimization xiv
  How This Book Is Structured xv
PetCareRx 251
  Visual Examples 252
Saks Fifth Avenue 258
  Visual Examples 258
T-Mobile 262
  Visual Examples 263
The Washington Post 268
  Visual Examples 271
Index 279
Introduction
The Age of Optimization
This book aims to help businesspeople apply a comprehensive and powerful new way of
building business and customer value online, using a proven methodology that works
for any size of business, in any industry. Highly interdisciplinary in nature, the Iterative
Optimization Methodology will be especially useful to:
• Business executives. Aiming to meet online customer needs through a rigorous research and evaluation process that also fosters creativity, the optimization method is appealing to business leaders of all ranks, including C-Level executives.
• Marketers, designers, and information architects. Great marketers and designers were among the first professional groups to embrace the customer-centric design techniques described in this book.
• Data analysts. Data analysts are essential to the success of almost every step of the Iterative Optimization Methodology. They provide users throughout the business with access to crucial data and help them to act on analytics-based insights.
• Engineers and solutions architects. Engineers and solutions architects are often the unsung heroes of online business. They spend much of their time setting up systems to deploy designs without knowing whether what they’re working on is helping the company and its customers. The optimization method allows them to find out whether or not their efforts have been fruitful.
• Students, including MBA candidates. Students can learn more about an exciting new field—one in which there is high demand for people who excel at creating great wireframes, running optimization programs, and other skills described in this book.
How This Book Is Structured
This book is divided into three parts:
Part 1: Creating Engaging Customer Experiences. This part discusses the level of
importance design now plays in the business world, challenges the standard design
process implemented by most companies, and introduces the Iterative Optimization
Methodology by showing how design testing can lead to more creative and
impactful designs.
Part 2: The Iterative Optimization Methodology. Using real-life examples, this part
describes how to drive business and customer value in step-by-step detail. It shows
how companies can integrate qualitative and quantitative customer research, prioritize
website sections and design ideas for testing, experiment with new designs under real
market conditions, and scale optimization techniques across their organization.
Part 3: Visual Business Cases. In this part, business leaders from 20 companies,
including Google, Facebook, Comcast, Marriott, and American Express®, share examples
of their favorite design tests and discuss practical approaches for using data to inform
customer-centric design.
Chapter 4
Qualitative Research
A financial services company’s customer goals may include opening or closing a deposit
account, sending and depositing checks from a mobile device, getting a replacement
credit card while traveling abroad, and so on.
If a businessperson is struggling with this task, it may help to ask the question: If cus-
tomers could come to the business looking for help with only one thing, what would it be?
It’s much better to create a top-notch experience that can satisfy one important goal
shared by many customers than to create several mediocre experiences in an attempt
to address many different objectives.
• Is it clear how to find related or popular content? Does it interfere with the primary content experience?
On conversion:
• Are there barriers to easily checking out?
• Are prices, taxes, shipping, returns, and security protections apparent? Are any of them barriers to checking out?
• Are there barriers to easily viewing additional content?
On the competition:
• Which competitors do customers go to, and how do the preceding questions apply to these competitors?
• Do competitors offer a product or service that would complement the business’s offerings, such as accessories for a product the business sells? If so, should the business offer it as well?
Note that some of the preceding questions are based on quantitative research; however, it
is not essential to have performed quantitative studies before tackling qualitative research.
Most businesses already have a basic understanding of their data, and the iterative
method means new insights, from both kinds of studies, will be added to the mix through-
out the process, allowing team members to refine and add to their list of questions.
The next step in qualitative research is to try to answer these questions through first-
hand experience, observing customers, and other methods, as described in the follow-
ing sections.
• Customers scan pages quickly. They skip over details until they’ve found what they’re looking for. For example, they may click through a business’s homepage and several category and product pages rapidly, navigating with the help of images or keywords until they find a product they’re interested in; only then are they likely to read details such as the price, feature list, reviews, and shipping policy. They are very good at ignoring anything that looks like advertising, a tendency that is sometimes called banner blindness.
• Customers don’t act like professionals. They don’t think about whether the navigation is consistent, which page they’re on within the site, how individual page elements are working, what the business is trying to get them to do next, and so forth.
• Do customers also try to buy before selecting a size and get confused when no message reminds them to pick a size or offers to help them do so?
Researchers seek to validate these questions by observing customers interacting with
the site and by digging into the site analytics. Team members then create hypotheses,
or proposed answers to these questions, and attempt to vet the hypotheses through test
ideas, which are added to the Optimization Roadmap.
Customers are recruited in a variety of ways; although some volunteer, they’re more
often compensated through some form of payment, such as a check or gift card. Even
though customers know they’re being observed, the goal of this type of study is to create
as natural a setting as possible, so it may take place not only in research facilities, but
also in cafés, customer homes, and other venues. Sessions are conducted one-on-one,
with additional researchers observing behind one-way glass or through a video feed.
Depending on the business, sessions can last anywhere from 10 to 60 minutes. High-
level trends usually start to emerge after watching four or five customers, but to be on
the safe side, researchers often observe eight to ten customers during each study. The
frequency of observational studies varies according to the company, but most busi-
nesses should conduct them at least quarterly. Many sizable and successful companies
conduct this type of research whenever they have pressing unanswered questions about
their business, which could arise several times per month.
The following list outlines the basic framework for this type of research, as well as some
best practices for observing customers online:
Recruitment
• The first few times a business conducts this type of research, it may be helpful to go through a recruitment agency and use an expert moderator to conduct the sessions. Later, the business can recruit by placing an ad online on a site like Craigslist or work with existing customers contacted through an internal email list.
• It is useful to recruit customers based on an experience they recently had or an upcoming experience that the business can help with. For example, a hospitality business might recruit customers who have recently booked a hotel room online or who need to book a room for an upcoming trip.
• If possible, the customers should not know the name of the business, because this information might influence their behavior. For example, recruiters can tell them the study is about online user behavior related to the general category of tasks the business is interested in observing (purchasing clothing, reading the news, searching for general information, and so on).
Setting scenarios
• When customers arrive, the researcher welcomes them and asks questions that prompt them to talk about a wide range of recent online experiences, such as “What have you recently shopped for online?” or “Have you planned any trips recently?”
• The researcher provides customers with a computer or mobile device that allows them to browse the web while recording their online session and facial expressions. (Of course, customers are informed in advance that their session will be recorded.)
• The researcher invites the customers to go through a scenario they had mentioned earlier that pertains to the business (e.g., booking a hotel room for an upcoming stay). Customers are asked to speak aloud while using the computer to let the researcher know what they’re thinking.
Observations
• The researcher then sits back, pays close attention, and tries not to say anything. The key is to leave customers alone while carefully watching how they try to reach their goal—and what obstacles they face along the way—and paying close attention to shifts in body language, facial expressions, and so on.
• The researcher refrains from asking customers what they think about specific websites or designs. For example, the researcher does not ask whether certain sites are easy or helpful. Instead, the researcher focuses on what customers show through their actions; the more customers are prompted, the more likely they’ll be to say or do something based on what they think the researcher wants to hear. If customers get confused and ask what to do, the researcher simply says something like, “Please do whatever you would normally do.”
On prompting the customer
• Some customers have a tendency to remain silent throughout the session. If this is the case, the researcher will occasionally prompt them with a kind reminder to vocalize their thoughts.
• If, toward the end of the session, customers haven’t used the online business, the researcher may prompt them by saying something like, “I’m interested in seeing you do the same thing with a few specific websites.” The researcher will give the customers a list of sites that includes the business being studied as well as competitors not yet visited.
• The researcher should wait until the end of the session to ask any specific questions about the customer’s behavior, and even then, they should avoid asking for any opinions. For example, a researcher can ask customers to explain what they were thinking at a specific moment rather than asking if they thought the experience was good or not. When listening to their answers, the researcher should be on the lookout for a mismatch between what customers say and their observed actions. People tend to jump around online from website to website very quickly and usually have difficulty remembering all the actions they took online, let alone why they took them.
• At the end of the session, the researcher should thank the customers for their time and pay them the agreed-upon fee.
After the session, the researcher reviews the video to capture any findings that provide
insights into the list of qualitative research questions. These insights, in turn, may lead
to the formulation of new questions for further exploration in the iterative process.
Ethnographic Studies
Ethnographic studies are a special form of observational research that take place where
the customer would normally interact with the company, such as at their home or
office, or at the location of the business itself, so that they are more likely to behave
the way they normally would. Just like observational customer research, these sessions
are most effective when they are unguided and researchers focus on active watching
and listening.
Kenyon Rogers, Director of Digital Experiments for Marriott International, said that
his business regularly conducts ethnographic studies at select hotel properties. For
example, in 2013 the company piloted a program that allows guests to “use their smart-
phones to check into the property and open their room door without needing to inter-
act with a Marriott team member.” Rogers added that very soon, “guests will be able to
control their entire experience, including ordering room service, extending their stay,
ordering transportation, and booking meeting rooms through their smartphones.”
Look for the Post-its:
Pete Maher on Contextual Inquiry
Pete Maher is the Co-Founder and Chief Operating Officer of the LUMA Institute, an educational company that equips people to accelerate innovation. He is also the co-author of Innovating for People: Handbook of Human-Centered Design Methods, which has been adopted by the U.S. Office of Personnel Management’s Innovation Lab as core training material for federal government employees. In this interview, he shares his thoughts on the importance of primary observational research.

Your book presents the “contextual inquiry” method, which is an ethnographic approach. Can you describe it here?

I’ve got a deep affinity for contextual inquiry research because it’s a great way to get to insights that can ultimately uncover opportunities for innovation. Contextual inquiry takes place where the participants would normally conduct their tasks. The interviewer asks the participants to go about their tasks in a normal way, observes their actions in an unobtrusive manner, interjects questions at opportune moments, and records the sessions for later analysis.

Would you share an example?

When we don’t deeply understand the context of the user, we sometimes assume that we’re delivering against their needs. For example, it turns out digital products are incredibly difficult to use. From my days of doing research in the financial services industry, when we would go into people’s homes to understand how they engage with financial products, we would commonly find things like Post-its stuck to monitors. It’s no mystery that some of those Post-its contain usernames and passwords, or software instructions, because a lot of times companies with great intentions—trying to protect a user’s security—enact requirements that ultimately make it too difficult for people to achieve their goals.

So, when conducting contextual inquiry research, we’ll advise teams to look for the Post-its or duct tape: What are the clever workarounds users come up with to make it possible for them to use the product? It’s only when we ask people to show us how they actually use something that we’re able to uncover really ripe opportunities for innovation. And, most important, close observation uncovers those gaps between what people do and what people say they do.

Would you tell us more about the gaps between what people do and what they say?

Margaret Mead, the famed anthropologist, really said it best: “What people say, what people do, and what they say they do are entirely different things.” Anybody who has spent a lot of time doing research knows this to be the case. It’s not because people are necessarily trying to lie or mislead. It turns out that we, as people, are simply not wired to be able to really articulate why we did that thing we just did.

For example, if you ask somebody to describe to you in detail how they completed a purchase on a website, you could imagine them describing the steps that they took and all the different things that they did along the way. But if you were to actually watch that scenario play out, you would see something very different. So contextual inquiry research, especially in the digital context, allows us to observe how somebody’s moving through a digital experience to really know what they just did and to try to understand why.
Surveys
Businesses place surveys on their actual site or app, or email them to customers. Large
surveys can have statistically significant sample sizes, but the researcher must be on
the lookout for data not representative of the larger customer base due to self-selection
bias. For example, not every customer wants to fill out a survey, and those who do may
have the strongest positive or negative opinions.
As with all forms of qualitative research, the more open-ended the survey, the better.
Surveys that ask customers about specific design decisions place the customer in the
awkward position of being asked to provide advice outside of their area of expertise.
Rather, understanding whether customers found their overall experience to be positive
or negative and providing an open-form field for customers to write about any aspect of
the experience they choose can often lead to actionable data.
Customer Panels
Customer panels are a subset of surveys: They typically consist of thousands of partici-
pants who have elected to give survey feedback on a regular basis. Panels may be run
by a company’s research team or by consulting firms on behalf of many businesses.
Like surveys, customer panels can provide statistically significant sample sizes, but
it’s important to understand the segment of participants being queried. For example,
although panels consisting entirely of self-selected users of one business might not
be representative of the entire population, they can give the business insight into the
behavior and opinions of their more loyal customers.
Eileen Krill, research manager at The Washington Post, oversees customer research for all
of the business’s print and digital brands. A panel of about 7,000 customers is included
in the many types of qualitative and quantitative customer research she oversees. Krill
will ask the panel “a wide range of closed and open-ended questions, depending on the
objectives of the survey,” including “satisfaction rating questions.” She points out that
“open-ended feedback is generally far more meaningful and actionable than the score itself” because it can help to “reveal the reasons behind the scores.”
One question she often asks the panel is “whether a new product or feature will
improve the customer’s impression of The Washington Post brand.” Although in most
cases participants say such additions would have no impact, Krill still asks the question
in case it provides an important insight. For example, she said, “We once tested the idea
of starting an online dating service for Post readers, and that got a lot of people saying
they would have a lower opinion of the company.” Krill noted, “I think that research was one of the key things that may have killed the idea.”
Diary Studies
Diary studies consist of a business asking customers to take notes and regularly send
them back to the company, usually over an extended period of time. These studies may
ask participants to take notes only on a specific topic area, like their regular interac-
tions with a new site or app, or they may be more general and simply ask customers
how they spent their day.
Google Search Lead Designer Jon Wiley shared an example of an ongoing diary study being
run by the company. Through a mobile app, the study regularly asks participants to reply to
the question: “What is the last bit of information you needed to know?” The information can
be related to any aspect of their life, not only the material they were looking for online. Wiley
and his team then “look at the needs that people have in their lives” and try to answer the
question, “Is there a way that we, as Google, can find a solution for them?”
Card-sorting
A technique used to gain insight into how to organize content, such as ordering and
grouping similar navigational links, card-sorting directly involves customers in the
design process. Customers are provided with a stack of cards containing information
and asked to perform a task, such as organizing them into logical categories. Although
it is typically performed with index cards or sticky notes, card-sorting can also be per-
formed online, and there is no limit to the number of participants.
Caution should be observed, because this technique is more guided than the aforemen-
tioned qualitative methods and may place customers in the position of being asked to
act like a professional designer. However, if performed with minimal prompting, card-
sorting can provide designers with a rare opportunity to gain insight into how custom-
ers think about information architecture and content hierarchies.
Nate Bolt, design research manager at Facebook, said the company used card-sorting as
one of many qualitative research methods to inform an ongoing redesign of users’ Face-
book News Feeds. After recruiting users, the Facebook Design Research team printed
out each user’s feed, up to the minute, on paper. Then, he explained, “they would cut out
their feed stories, place them on a table, and group them into what they considered to
be like-minded categories. That helped us reprioritize the ways that stories are grouped
and organized within people’s News Feeds in a real, human way. Obviously, this went
hand in hand with all other data, including the system (analytics) data.”
Feedback Forms
Feedback forms on websites, or email addresses for user feedback, can be a good source
of information. Users who provide open-ended feedback in this way often have timely
insights into customer frustrations, as well as positive comments on great experiences
with the website or the business’s team members.
John Williamson, the Senior Vice President and General Manager of Comcast.com,
explained that “One of the best sources for information I have is the ‘Website Feedback’
link” that appears on the bottom of each page on the site. Williamson noted, “I was in
banking in the mid-1990s, and I remember how excited we would get if we received a
letter from a customer. We really would.” Every day Comcast receives “hundreds of let-
ters” from customers through the feedback link, and because they are a “huge benefit”
that helps Williamson and his team understand their customers and their business, he
added, “I read every one of them.”
Figure 4.2 The default page with three elements highlighted for testing.
The qualitative study findings related to these three elements and the steps researchers
took in response are summarized here.
Finding. Not all customers interacted with the first step of the process to learn more
about upgrading to an HD or HD DVR receiver.
• Hypothesis. Customers do not notice the option to upgrade a receiver.
• Validate with analytics. Does the link to upgrade the receiver get very few clicks?
• Test idea. Find out whether more customers would upgrade if more information about the options to switch to an HD or HD DVR receiver appeared on the page without requiring additional clicks.
Finding. All customers understood how to select and deselect premium channels, so this
function seems user friendly. However, the website highlighted specific shows carried on
these channels rather than the actual channels, and several customers commented that
they hadn’t heard of these shows. Example: “I never heard of Boss. I never saw Contagion.”
• Hypothesis. Highlighting channel names rather than specific shows will clarify the process of adding premium channels.
• Validate with analytics. Do premium channel selections vary based on the featured shows?
• Test idea. To appeal to a wider audience, feature prominent channel names (HBO, Showtime, etc.) rather than specific shows.
Finding. Several customers commented that they didn’t know what was included in
the add-ons. Example: “What exactly is included in the Sports Package?”
• Hypothesis. Customers don’t realize they can click on the toggle switches for more information. Because the switches are arranged in rows, followed by messaging for each add-on, customers may be mistaking them for bullet points.
• Validate with analytics. Do customers rarely click on these toggle switches?
• Test idea. Replace individual offers with a one-line message reading “More customization options” to help customers understand that the image to the left is a toggle switch that, when clicked, will reveal the details of the offers.
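Checks like the “validate with analytics” items above usually come down to comparing how often customers click an element against how often they see it. The short sketch below is purely illustrative: the element names, counts, and 2 percent threshold are hypothetical stand-ins, not Comcast’s actual analytics data, and are shown only to make the arithmetic concrete.

    # Hypothetical page-analytics counts: visitors who saw the page
    # and clicks recorded on each element under investigation.
    page_views = 120_000
    clicks = {
        "upgrade_receiver_link": 1_450,   # Finding 1: the upgrade link
        "premium_channel_tile": 18_300,   # Finding 2: premium channel selections
        "addon_toggle_switch": 950,       # Finding 3: add-on detail toggles
    }

    for element, click_count in clicks.items():
        ctr = click_count / page_views
        # A very low click-through rate supports hypotheses such as
        # "customers do not notice the option" or "customers mistake
        # the toggles for bullet points."
        flag = "investigate" if ctr < 0.02 else "ok"
        print(f"{element}: CTR = {ctr:.2%} ({flag})")

A check like this does not prove the hypothesis; it only tells the team whether the behavior seen in the qualitative sessions shows up at scale and is worth testing.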
As noted earlier, researchers targeted receiver selection, premium channel logos, and add-
on content for design testing. An alternate version of each element was built and placed
in a multivariable test (MVT). A multivariable, or multivariate, test is a special type of A/B
test in which several page elements are tested across multiple recipes. Rather than simply
testing two versions of the page—one with the original elements and one with the new
ones—multiple versions of the page are tested, each including a different combination of
the elements. In addition to determining which version of the page has the highest perfor-
mance, this type of test measures the success of each element.
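One way to picture how recipes are formed is as the set of combinations of each element’s versions. The sketch below is a minimal illustration of a full-factorial multivariable test; it simply enumerates the eight recipes produced by the default and alternate versions of three elements, and it assumes nothing about the testing tool or traffic-splitting logic actually used in this case.

    from itertools import product

    # Each tested element has a default and an alternate version.
    elements = {
        "receiver_section": ["default", "alternate"],
        "premium_channel_logos": ["default", "alternate"],
        "other_addons": ["default", "alternate"],
    }

    # A full-factorial multivariable test serves every combination (recipe):
    # 2 x 2 x 2 = 8 recipes. Visitors are split across the recipes, and each
    # element's effect can be read by comparing recipes that differ only in
    # that element.
    recipes = [dict(zip(elements, combo)) for combo in product(*elements.values())]
    for i, recipe in enumerate(recipes, start=1):
        print(f"Recipe {i}: {recipe}")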
The default and alternate versions of each element are shown in Figure 4.3. This figure
also includes which version of each element won and which out of all three elements
most influenced the performance of the page:
• Receiver section. The default version has a blue link to upgrade. The alternate version adds a prominent message to upgrade to an HD or HD DVR receiver.
• Premium channel logos. The default version features a specific show for each premium channel. The alternate version features channel logos instead of shows.
• Other add-ons. The default version places content behind four toggle switches. The alternate version places content behind a single toggle switch, which, when clicked, expands to show all up-sell content at once.
The business goal was to increase the purchase conversion rate as well as the overall
revenue per visitor (RPV). As Figure 4.3 shows, the winning version of the receiver
selection element was not the new design but the original; however, the new versions
emerged as winners for both the premium channel element and the add-ons further
down on the page.
When Comcast integrated all three winning designs into the page and tested this
revamped page against the original one, it found the changes drove a 4.6 percent lift
in the purchase conversion rate and a 5.6 percent lift in RPV. The default and winning
versions of the page are pictured in Figure 4.4.
Because the impact of each element had been isolated, Comcast was also able to mea-
sure how much each one influenced the performance of the final version of the page:
The premium channel element was the most influential, accounting for over 60 percent
of the increase in RPV, indicating that, of the three elements tested, it was the original
version of this feature that had presented the largest obstacle to customers completing
their goals.
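For readers who want to see the arithmetic behind figures like these, the sketch below shows how a lift and a simple contribution split can be computed. The visitor counts, order counts, revenue figures, and per-element effects are invented for illustration only (chosen so the output roughly matches the lifts reported above); only the formulas—lift as (variant − control) / control, and each element’s share of the summed element effects—reflect the general approach, not Comcast’s actual data or analysis.

    # Hypothetical control vs. winning-page results (not real data).
    control = {"visitors": 50_000, "orders": 2_000, "revenue": 260_000.00}
    winner  = {"visitors": 50_000, "orders": 2_092, "revenue": 274_560.00}

    def lift(variant_value, control_value):
        """Relative lift of a metric versus the control."""
        return (variant_value - control_value) / control_value

    conv_control = control["orders"] / control["visitors"]
    conv_winner = winner["orders"] / winner["visitors"]
    rpv_control = control["revenue"] / control["visitors"]
    rpv_winner = winner["revenue"] / winner["visitors"]

    print(f"Conversion lift: {lift(conv_winner, conv_control):.1%}")  # ~4.6%
    print(f"RPV lift:        {lift(rpv_winner, rpv_control):.1%}")    # ~5.6%

    # Because the MVT isolated each element, its share of the overall RPV
    # gain can be estimated from per-element RPV effects (again invented).
    element_rpv_gain = {
        "receiver_section": 0.02,
        "premium_channel_logos": 0.18,
        "other_addons": 0.09,
    }
    total_gain = sum(element_rpv_gain.values())
    for element, gain in element_rpv_gain.items():
        print(f"{element}: {gain / total_gain:.0%} of the RPV increase")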
The main recommendation was to push the winning version of the page to all visitors.
Additional proposals included running follow-up design tests of each variable to further
simplify the user experience, starting with the premium channel element, because it
had emerged as the most important one overall.
Figure 4.3 The default and alternate versions of each of the three elements tested. This figure includes the winning version of each element, as well as the most influential of the three winning elements in terms of overall impact on the page. (Element #1: the receiver section. Element #2: the default features a specific show for each channel; the alternate features the channel name without shows listed. Element #3: the default shows three collapsed choices; the alternate hides all content behind one link that, when clicked, expands all content.)
Figure 4.4 The default and winning versions of the page.
Ally Bank: “We help our customers to achieve their savings goals.” (Andrew Switzer)
Comcast: “Comcast helps our customers to enrich their lives through entertainment and technology.” (John Williamson)
Foursquare: “We help our customers to make the most out of where they are.” (Simon Favreau-Lessard)
IBM: “IBM helps enable our customers to do their jobs better.” (Phil Corbett)
LinkedIn: “We help our customers to connect with new opportunities.” (Amy Parnell)
PetCareRx: “PetCareRx helps to bring health and happiness to pet owners and their pets.” (Blake Brossman)
Saks Fifth Avenue: “Our strategy is to inspire customer confidence and style with every Saks shopping experience.” (Roger Scholl and Matt Curtis)
WebMD: “We help our customers to find health information.” (Rob Blakeley)
Up Next
This chapter provided an overview of qualitative research techniques, which offer
insights into customers’ goals and the challenges they face when trying to accomplish
those goals. These insights lead to test ideas that will be added to the Optimization
Roadmap, as well as inform the overall direction for a company.
The next chapter outlines quantitative research methods that companies use to verify
qualitative findings, as well as to identify which site areas need improvement the most
and what test ideas to prioritize as part of assembling the Optimization Roadmap.
References
1. Theodore Levitt, “Marketing Myopia,” Harvard Business Review (1960).
INDEX
A

A/B testing. See also testing
  Adobe’s use of, 143–145
  Ally Bank’s use of, 147–149
  best practices for, 114
  Caesars’ booking module, 154–156
  Comcast’s, 160–161, 162
  defined, 29
  Dell’s use of, 168–169, 170–171
  email design for Dollar Thrifty, 173–176
  Foursquare’s, 191, 193, 194–195
  Hightail’s homepage test using, 211
  LinkedIn’s, 223–225
  Marriott International, 227–230
  multivariable tests in, 81
  Obama for America, 231, 232, 233–234, 236–248, 249–250
  PetCareRx messaging and banner, 252–257
  product storage limits, 211–212
  redesigned customer loyalty pages, 157
  segmentation and, 235
  specifying in methodology, 109
  T-Mobile phone payment options, 265–267
  Washington Post’s, 271–277
address fields, 216
Adobe, 143–145
  business and customers’ goals for, 143
  use of A/B testing, 143–145
advertising banners, 69
Ahmed, Zimran, 177–184
algorithm for News Feed displays, 187–188
Ally Bank, 146–149
  balancing creativity with research, 146
  business and customers’ goals for, 146
  homepage testing by, 147–148
  promoting social links at log-off portal, 148–149
American Express, 150–152
  business and customers’ goals for, 150
  designing multiple product displays, 151–152
  offering customer incentives, 151
  optimization program for, 150
  testing impact on business strategies, 150–151
analysis paralysis, 88
AOV (average order value), 92, 166, 267
assembling wireframes, 119–120
auto-focusing feature, 246
auto-optimizing tests, 109, 116–117
autocomplete feature, 194–195, 209
average order value (AOV), 92, 166, 267
Axelrod, David, 234

B

banners
  A/B testing of size, messaging, and images for, 253–255
  ignored by customers, 69
best practices
  customer service, 121–122
  Foursquare’s, 193
  Hightail’s, 211, 212
  Saks Fifth Avenue’s, 258
Bezos, Jeff, 11, 268, 271
Blakeley, Rob, 66, 86
blue-sky approach, 120
Bolt, Nate, 34, 35, 78, 185–190
borders of wireframes, 119
bounce rate, 92
fundraising (continued)
  designing donation pages, 232–233
  form field background color tests, 246, 247
  reducing credit card errors, 237–238, 240–241
  reinforcing email content on landing page, 244, 245
  removing dollar signs, 246
  removing donate buttons from site, 244
  strategies for, 231–232
  testing video for, 242, 243
funnel report, 98–99

G

games, 177–184
geo-targeting, 95
global navigation testing, 161, 162
global pre-login page, 215–216
Goff, Teddy, 236
Going Out Guide redesign test, 276–277
Google, 196–209
  approach to innovation, 196–198
  collecting experiential data, 196, 197–198
  developing query information sets, 204–207
  Instant, 209
  Knowledge Graph, 202, 204, 205, 207
  qualitative research used by, 198–199, 206
  research on color, 35, 200–201
  Search, 34, 201–202
  search box modifications, 207–209
  sharing results via Google Documents, 236
  success metrics tracked by, 201–203
  teams within, 203–204
  Voice Search, 200, 202
groupthink, 75
guests
  logging in with IBM ID, 216–219
  offering products for subscribers and, 177–184
guided editing flow, 224–225

H

headlines, 121, 215–216
hero offers and images, 174–176, 230
Hertz Global Holdings, Inc., 172
heuristic reviews. See user experience
Hightail, 210–212
  business and customers’ goals for, 210
  homepage testing by, 211
  success metrics for, 210–211
  testing product storage limits, 211–212
hiring team members, 137–140
homepage
  Ally Bank’s testing of, 147–148
  Dell’s testing of, 170–171
  designing new versions of, 6–7
  Hightail’s testing of, 211
  Marriott’s testing for, 227–229
  PetCareRx message testing for, 252–253
  prioritized test list for, 51
hypotheses, 108, 109

I

IBM, 213–219
  blog about optimization results, 214
  business and customers’ goals for, 213
  MVT testing by, 215–219
  optimization program of, 213–214
  success metrics for, 214
Illustrator product page, 143–145
images. See also case studies
  A/B testing of links to photo tour, 230
  A/B testing of products and, 151–152
  PetCareRx A/B testing of banner, 253–255
  testing alternate, 21
implementing test design, 123
incident optimization, 130, 131
Innovating for People (Maher), 76
innovation
  creating new user experiences, 199–200
  encouraging with design testing, 30
  optimization lead, 135–136, 137, 139
  prioritizing test areas, 102–103
  regular meetings for, 127
  role in Facebook innovation, 186
  roles and responsibilities for, 133–135
  senior hires for, 138, 139
  setting testing priorities, 104–105
  starting and scaling new programs, 160, 173, 223
  technical lead, 136, 137
test methodologies, 108–117. See also A/B testing; Iterative Optimization Methodology; MVTs
  components of, 108–109
  duration of testing, 109
  evaluating effect of traffic levels on, 110–113
  evaluating Type I/Type II errors, 109–110
  Facebook’s research, 186–187
  insights and impact method, 89–90
  number of recipes in, 109–113
  purpose and effectiveness of testing, 113
  used in qualitative research, 75–79
test objectives, 108
test types
  A/B testing, 29, 81, 109, 114
  about, 114
  auto-optimizing tests, 109, 116–117
  multi-variable testing, 114–115
  targeting and personalization tests, 116
testing. See also test methodologies
  autocomplete feature, 194–195
  benefits of, 31
  brainstorming ideas for, 103–104, 105
  Comcast’s use of, 79–84, 159
  courage required for, 262
  creativity in iterative, 32
  defined, 29
  designing user, 234
  duration of, 109, 234
  fears about, 25, 31–33, 35
  form field background color, 246, 247
  Google’s use of, 35
  hero offers and images, 174–176, 230
  hypothesis development for, 11–12
  IBM link for CTA, 218
  keeping track of Caesars’, 154
  multiple design versions, 6
  new products and services online, 9–10
  numbering tests in, 106
  platform speed, 236–237
  priorities for, 47–48, 102–103
  purpose of, 113
  QA process during, 124–125
  quantitative questions shaping, 88–89
  roadmap for PetCareRx, 47–51
  Saks Fifth Avenue, 258, 259–261
  selected web browsers, 125
  site’s query refinement, 195
  Social Security messaging, 161–163
  types of, 109
  using data in, 29–30
  Washington Post layouts, 11–20
Thomas, Scott, 235
toolbox options, 11–19
TOS (time on site), 93
touch devices, 120–121, 200
traffic
  estimating minimum visitors per design, 110–112
  methodologies for low volume site, 113
trust
  building customer’s, 121
  testing messaging for, 252–253
two-sided Z-test, 112
Type I/Type II errors, 109–110

U

usability testing shortcomings, 75
user experience
  business responses to, 72
  collecting data about, 196, 197–198