Technically Wrong


by Sara Wachter-Boettcher


  Our digital products can do this too. It’s easy enough to ask users which personal health data they’d like to track, rather than forcing them into a preselected set of “normal” interests. It’s easy enough to make form fields accept longer character counts, rather than cutting off people’s names (more on that in the next chapter). But too often, tech doesn’t find these kinds of cheap solutions—the digital equivalents of adjustable seats—because the people behind our digital products are so sure they know what normal people are like that they’re simply not looking for them.

  Eric Meyer and I wrote about this in Design for Real Life, calling on designers to let go of their narrow ideas about “normal people,” and instead focus on those people whose identities and situations are often ignored: people transitioning their gender presentation, or dealing with unexpected unemployment, or managing a chronic illness, or trying to leave a violent ex. We didn’t call these people’s identities and scenarios “edge cases,” though. We called them stress cases.

  It’s a subtle shift, but we believe it’s an important one. When designers call someone an edge case, they imply that they’re not important enough to care about—that they’re outside the bounds of concern. In contrast, a stress case shows designers how strong their work is—and where it breaks down.

  That’s what one design team at National Public Radio is doing. During the process of redesigning the NPR News mobile app, senior designer Libby Bawcombe wanted to know how to make design decisions that were more inclusive to a diverse audience, and more compassionate to that audience’s needs. So she led a session to identify stress cases for news consumers, and used the information she gathered to guide the team’s design decisions. The result was dozens of stress cases around many different scenarios, such as:

  • A person feeling anxious because a family member is in the location where breaking news is occurring

  • An English language learner who is struggling to understand a critical news alert

  • A worker who can only access news from their phone while on a break from work

  • A person who feels upset because a story triggered their memory of a traumatic event13

  None of these scenarios are what we think of as “average.” Yet each of these is entirely normal: they’re scenarios and feelings that are perfectly understandable, and that any of us could find ourselves experiencing.

  That’s not to say NPR plans to customize its design for every single situation. Instead, says Bawcombe, it’s an exercise in seeing the problem space differently:

  Identifying stress cases helps us see the spectrum of varied and imperfect ways humans encounter our products, especially taking into consideration moments of stress, anxiety and urgency. Stress cases help us design for real user journeys that fall outside of our ideal circumstances and assumptions.14

  Putting this new lens on the product helped the design team see all kinds of decisions differently. For example, the old NPR News app displayed all stories the same way: just a headline and a tiny thumbnail image. This design is great for skimming—something many users rely on—but it’s not always great for knowing what you’re skimming. Many stories are nuanced, requiring a bit more context to understand what they’re actually about. Even more important, Bawcombe says, is that the old design didn’t differentiate between major and minor news: each story got the same visual treatment. “There is no feeling of hierarchy or urgency when news is breaking,” she told me.15 Finally, the old design divided stories into “news” and “more,” where the “more” stories were those that NPR thought were interesting and unique, such as analyses, reviews, or educational pieces. But clustered under that generic label, these pieces were easy to gloss over.

  The team agreed these were important design problems to solve, and they decided to explore a few different ways of doing so. In one iteration, the app displayed a stream of recent stories using a “tile” or “card” design—a technique that was popularized by sites like Pinterest, where every individual item is displayed within its own container, and that was already in use on the NPR website. Each tile was designed to be the width of a user’s smartphone, while the length varied according to how much content needed to fit. That content included a headline, a “teaser” (a common industry term for a short, one-sentence introduction), and usually a small image. News stories were interspersed with lighter features, and the images for those were often larger, highlighting their human-interest side. All told, about one-and-a-half story tiles could display on a smartphone screen at any given time.

  That’s where the problems started. The design team realized that when users wanted breaking news, those feature stories got in the way—and the overall design required way too much scrolling to understand. But they didn’t want to end up back where they started: with a big list of stories that was easy to skim but made it difficult to see whether anything critical was happening.

  By thinking about stress cases, the team arrived at a compromise—one that works when an anxious user needs to know about urgent news right now, and also helps all those less urgent stories find their audience by providing enough nuance and context to bring in readers.

  In this version, the app loads with the top story of the moment displayed at the top in a tile that includes a headline, teaser, and larger image—providing a clear visual indicator of what’s critical right now. But for the rest of the news—whether an update on a bill passing Congress or a warning that a hurricane could hit the Caribbean—the team decided that headlines are typically clear and explanatory enough without a teaser.

  After the latest news, the design mixes in more of the feature stories. These tiles do include the larger images and teaser copy, effectively slowing down the scrolling experience for those who have the time to go past whatever’s breaking right now but might need more context to know whether an individual item is interesting enough to tap.

  All kinds of conversations have become more nuanced since the design team started talking not just about audiences, but about stress cases. For example, editorial staff already label some stories on the NPR website with phrases like “breaking news,” “developing story,” or “this just in”—but the old version of the NPR News app didn’t have space for these sorts of labels. The design team knew the new version needed to bring breaking or developing news to the surface visually. At the same time, they didn’t want the labels to cause alarm every time a developing story was posted—but only when it was truly warranted. So the team decided to balance the intense wording of these labels with a calmer color: blue. When a story is urgent, though, an editor can override that setting, and make the label red instead. By defaulting to blue, the team is keeping a wider range of users in mind—users who need an alternative to sites where every headline shouts at them, all the time.

  These are small details, to be sure—but it’s just these sorts of details that are missed when design teams don’t know, or care, to think beyond their idea of the “average” user: the news consumer sitting in a comfy chair at home or work, sipping coffee and spending as long as they want with the day’s stories. And as this type of inclusive thinking influences more and more design choices, the little decisions add up—and result in products that are built to fit into real people’s lives. It all starts with the design team taking time to think about all the people it can’t see.

  RETHINKING PERSONAS

  And that brings me back to where we started: personas, one of the original tools developed to bring empathy into the design process. It’s a tool I’ve used many times in my career—but one that, a few years back, I started using very differently.

  It was 2013. I was sitting at a gleaming conference-room table, complete with a tray of pastries on top. Sticky notes covered the walls. Across from me sat my client, the chief marketing officer of a large professional organization. My team had been working hard on a project to overhaul their digital presence: what’s on their website, in their emails, and so on. We’d just finished a round of research, including interviewing dozens of members about their backgrounds, habits, needs, and relationship with the organization. We’d come back that day to present one of the results of that research: personas.

  We were walking the CMO through each profile, and how it came to be—explaining that, say, “Phil” represented the minimally involved member, someone whose employer signed them up for the organization but didn’t feel connected to its mission, whereas “Amanda” was an achiever, the type who would attend every webinar she could find, if she thought it would help push her career ahead.

  We went on like this for some time, the executive nodding along as he leafed through our document. Until we reached the last persona, “Linda.” A stock photo of a fortyish black woman beamed at us from above her title: “CEO.”

  Our client put down his paper. “I just don’t think this is realistic,” he said. “The CEO would be an older white man.”

  My colleague and I agreed that might often be the case, but explained that we wanted to focus more on Linda’s needs and motivations than on how she looked.

  “Sorry, it’s just not believable,” he insisted. “We need to change it.”

  I squirmed in my Aeron chair. My colleague looked out the window. We’d lost that one, and we knew it.

  Back at the office, “Linda” became “Michael”—a suit-clad, salt-and-pepper-haired guy. But we kept Linda’s photo in the mix, swapping it to another profile so that our personas wouldn’t end up lily-white.

  A couple weeks later, we were back in that same conference room, where our client had asked us to share the revised personas with another member of his executive team. We were halfway through our spiel when executive number two cut us off.

  “So, you have a divorced black woman in a low-level job,” he said. “I have a problem with that.”

  Reader, I died.

  Looking back, both of these clients were right: most of the CEOs who were members of their organization were white men, and representing their members this way wasn’t a good plan for their future.

  But what they missed—because, I recognize now, our personas encouraged them to miss it—was that demographics weren’t the point. Differing motivations and challenges were the real drivers behind what these people wanted and how they interacted with the organization.

  We thought adding photos, genders, ages, and hometowns would give our personas a more realistic feel. And they did—just not the way we intended. Rather than helping folks connect with these people, the personas encouraged the team to assume that demographic information drove motivations—that, say, young women tended to be highly engaged, so they should produce content targeted at young women.

  Thankfully, our clients’ disagreement over the right way to present race turned into a rethinking of our whole approach. Pretty soon, we’d removed all the stock photos and replaced them with icons of people working—giving presentations, sitting nose-deep in research materials, that sort of thing.

  I haven’t attached a photo to a persona since.

  I’m not alone in this shift. User researcher Indi Young, author of Practical Empathy and Mental Models, also advocates for designers to get rid of the demographic data used to make personas “feel real.” She writes:

  To actually bring a description to life, to actually develop empathy, you need the deeper, underlying reasoning behind the preferences and statements-of-fact. You need the reasoning, reactions, and guiding principles.16

  To get that underlying reasoning, though, tech companies need to talk to real people, not just gather big data about them. But in many tech companies, usage data is all that matters: who signed up, and what did they do once they had? And that data is, by and large, defined by demographics: women ages twenty-nine to thirty-four with household incomes over $100,000. Men thirty-five to forty-nine who live in urban areas. It’s no wonder so many companies make the same mental shortcuts that my client did, conflating demographic averages with motivations and needs. Often that’s all they have—and all they’re taught to value. But as Harvard researcher Todd Rose found, averages don’t mean nearly as much as we’re led to believe. The only thing that’s normal is diversity.

  RECLAIMING “NORMAL”

  If you’ve ever watched a show created by Shonda Rhimes—like Scandal, Grey’s Anatomy, or How to Get Away with Murder—then you might have noticed something about her casting: all three shows are fronted by women of color, and each is supported by a cast that is more diverse than you’ll find almost anywhere else in Hollywood.

  It’s all intentional. But, if you ask Rhimes, it’s not really “diversity” at play:

  I have a different word: normalizing. I’m normalizing TV. I am making TV look like the world looks. Women, people of color, LGBTQ people equal WAY more than 50% of the population. Which means it ain’t out of the ordinary. I am making the world of television look NORMAL.17

  Normalizing TV doesn’t start with casting, though. It starts in the writers’ room. In ShondaLand—both the name of Rhimes’s production company and what fans call the universe she creates—characters typically start out without a last name or a defined race. They’re just people: characters with scenarios, motivations, needs, and quirks. Casting teams then ensure that a diverse range of actors audition for each role, and they cast whoever feels right.

  This nontraditional casting approach won’t work for everything, of course: shows that engage with racial issues more directly, or where plotlines intersect with specific cultures or historical events, probably need to cast according to race. But it works in ShondaLand—a place where “normal” doctors, lawyers, and politicians lead lives of work, sex, and scandal.

  And it would work in tech too. Most of the personas and other documents that companies use to define who a product is meant for don’t need to rely on demographic data nearly as much as they do. Instead, they need to understand that “normal people” include a lot more nuance—and a much wider range of backgrounds—than their narrow perceptions would suggest.

  This lesson can’t wait. Because, as we’ll see in the coming chapters, the tech industry’s baseline assumptions about who’s worth designing for, and who isn’t, affect all kinds of things—from complex algorithms to the simplest form fields.

  Chapter 4

  Select One

  It was the summer of 2014, and I was new to the city of Philadelphia. I needed a doctor. Actually, what I needed was a birth control refill. Obtaining one meant an annual exam at the OB-GYN. So I made an appointment at a clinic that a friend recommended, and they emailed me a link to a new-patient PDF form. I started entering my answers: I don’t smoke. No pregnancies. My grandmother had a stroke.

  And then, suddenly, everything stopped.

  Have you ever been sexually abused or assaulted?

  Yes __ No __

  That’s it: no information. No indication of why they were asking or how they would use my response. Just a binary choice on a form that would end up in some medical record somewhere.

  I stared at those checkboxes until my vision blurred, thinking about how much I didn’t want to explain the sexual abuse I had survived when I was a little girl—not to a bunch of strangers, not without a reason, not on some godforsaken form.

  I looked at the boxes on my screen again. Yes or no? It’s so simple, until it isn’t. Until the choice is between opening a door to a conversation with people you don’t know and don’t trust about a topic you don’t want to explain while wearing nothing but a paper gown—or lying, like you lied back then, stuffing your shame deeper and deeper.

  I couldn’t bring myself to deny it, not this time. I checked yes.

  I went in for my appointment. “So, you were sexually assaulted,” my new doctor said. It wasn’t a question, but she waited for a response anyway. “Yes,” I replied, two beats too late. And then I waited, feeling my silence build like a wall between us. “I’m sorry that happened to you,” she said finally, awkwardly. She moved on quickly.

  I didn’t. I sat there, feet in the stirrups, that checkbox in my head. Yes or no?

  This was the first time I’d thought about the power a single form field can have. But it wouldn’t be the last. Because as soon as I started looking for them, I noticed that online forms were being used for all kinds of things—and causing problems for all kinds of people.

  I saw race and ethnicity menus that couldn’t accommodate people of multiple races. I saw simple sign-up forms that demanded to know users’ gender, and then offered only male and female options. I saw college application forms that assumed an applicant’s parents lived together at a single address.

  Individually, you might want to write these problems off as the “edge cases” discussed in Chapter 3. And that’s what often happens on design teams: forms and selection menus are treated like they’re no big deal, just a series of text boxes and selector buttons. Most people won’t get upset, right?

  But the more I looked, the more I saw that designers’ narrow thinking actually leaves out a huge percentage of users—particularly those who are already marginalized. And in fact, forms aren’t minor at all. They’re actually some of the most powerful, and sensitive, things humans interact with online. Forms inherently put us in a vulnerable position, because each request for information forces us to define ourselves: I am this, I am not that. And they force us to reveal ourselves: This happened to me.

  Most design teams haven’t been trained to think about forms this way, though. Instead, the tech industry has spent precious little time considering how its products make people feel when they ask for information—or whether they should be gathering so much data in the first place. And all of us pay the price.

 
