
Pixels and Place


by Kate O'Neill


  All of this also brings up questions of what the power dynamic should be between service provider and customer.

  An awful lot of the kind of information I describe here becomes a kind of brokered power, one that could fundamentally change the relationship dynamic between company and customer.

  Google and Facebook have extensive information about us, much more so than most people realize. Google owns the bulk of the advertising landscape distributed across the web and can track data across websites that use its AdSense product. Facebook has access to a great deal of that data, too, because its advertisers and potential advertisers integrate a tracking “pixel,” which passes data about user movement back to Facebook.
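  To make the mechanism concrete: a tracking pixel is just a tiny image whose real payload is the request it triggers. Below is a minimal sketch in Python of what a pixel endpoint might look like; the port, parameter names, and logging are illustrative assumptions, not a description of Facebook’s actual implementation.

```python
# Minimal sketch of a tracking-pixel endpoint (illustrative only).
# A page embeds something like:
#   <img src="https://tracker.example/px?page=/checkout&uid=abc123">
# The browser's request for that image is the tracking event.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Smallest valid transparent GIF, served as the "pixel"
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        params = parse_qs(urlparse(self.path).query)
        # A real tracker would write this to a data store, keyed to an
        # identifier carried in the URL or a cookie, and aggregate it
        # across every site that embeds the pixel.
        print("visit:", params, "referrer:", self.headers.get("Referer"))
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == "__main__":
    HTTPServer(("", 8000), PixelHandler).serve_forever()
```

  Because browsers send the referring page along with the image request, a single pixel embedded across many sites lets the pixel’s owner reconstruct a user’s movement among them.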

  We’ve already become culturally acclimated to the idea that we hand over a lot of data. As brand owners, designers, and data advocates, we often take for granted that customers, users, patients, students, and all of our constituents will hand over their data in exchange for our services. That is often true. But we’re duty bound to be intentional: responsible, mindful of the ethical burden that data places on us, and respectful of the power and trust that is exchanged there.

  ***

  We make the data points. We are the data points. We don’t yet have a framework for how to treat and appreciate the richness of the humanity that’s represented in those data points.

  A framework for that must include a data model that acknowledges both relevance and discretion as respect. Relevance alone is an insufficient guide. Imagine that you blurted out everything you knew about everyone you knew in any setting. Pretty quickly, you would have no friends.

  In a parallel way, marketers shouldn’t expect more intimate insights than they need; and they can’t expect to be able to play every card they’ve got all the time. Too much targeting goes beyond the filter bubble and starts to fatigue the customer.

  Moreover, in designing projects and data models, it would behoove companies to consider which data they truly need to collect.

  I suppose it’s too much to ask for companies to avoid collecting data that’s right there in front of them, even if they don’t need it; but it becomes an ethical burden when the company has access to unnecessary data that could easily be used to manipulate. Such is the case with Uber and its customers’ battery levels.

  Of course the data itself has no motive. It’s only in the framework of the data collector’s motive that it matters. Knowing whether you’re an ex-smoker isn’t terribly ethically burdensome in the context of encouraging a friend on Facebook who’s quitting smoking by saying that you quit smoking ten years ago. But if that Facebook comment is visible to an entity that is harvesting that kind of information and passing it along to your health insurance provider, then what is that? In such cases, how can an insurance company be incentivized not to use that data, other than regulations that might restrict it? Market forces will want access to whatever data they can get.

  This is why environmental tracking (such as beacons) presents such a quandary and such a need for this level of reflection. On one hand, there are potentially hundreds of legitimate use cases that can simplify customer experience and streamline company costs.

  On the other hand, what a can of worms that opens up.

  Adding Value Instead of Over-Optimizing

  With all this data, there’s going to be a very understandable inclination to try to optimize for everything. Organizations, however, aren’t very good at optimizing along multiple dimensions, and humans who have to carry out the work of these distracted and distractible organizations get frustrated when they don’t have clear direction.

  There’s a simple solution: Focus the organization around adding value to the human experience as it incrementally works its way toward improved experiences. Whatever products a company develops, whatever campaigns it launches, whatever communications it has with its customers, the underlying goal should be to focus on the value in the relationship. Make the value clearer to prospective customers earlier on; remove value-diminishing parts of transacting with the company; and emphasize long-term value in existing customer relationships.

  In practice, this often involves the same tactics that would be part of a more optimization-obsessed company-centric process—such as optimizing checkout flows to remove friction and increase conversion rate. But the tacit knowledge that comes from optimizing for value is subtly different from what comes from optimizing simply to optimize. A team can easily go off-mission in the latter scenario. Testing road maps are harder to prioritize. Conflicting data sets are harder to reconcile. But with an emphasis on building value in customer relationships, the team has confidence to navigate these scenarios.

  ***

  Years ago, I was hired as managing director for a shop that was looking to evolve from design and SEO to more of a full-service digital marketing agency. Weeks before I’d been hired, they’d completed a website redesign for a client that offered marriage counseling retreats. In the first conference call I sat in on with this client, they raised a concern: their website lead generation rate—which, in their case, meant completed inquiry forms from people seeking help for a troubled marriage—had dropped precipitously since the redesign. No one could explain why: After all, the redesign followed what were considered industry best practices. It simplified the overall design, streamlined visitor access to the intake form, and cut the length of the intake form by more than half.

  On paper, the agency had done everything right. But given the results, we had to step back and think differently about it. I started to wonder if efficiency was really what visitors to this website would value most, or if, in the interest of reassuring visitors, we could afford to slow the process down. So we tested it. We reinstated the longer version of the form, still wrapped in a cleaner design; and as you may have guessed, the conversion rate went back up. The longer, more narrative form—which included questions like, “When did you first start to notice problems in your marriage?”—may have given the inquiring visitor a sense of being cared for, a sense that they were entering therapy already. Moreover, it provided the retreat service more context when they followed up with the prospective attendee, giving the service a better chance to empathetically connect with prospective clients.
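  For readers who want to run this kind of comparison themselves, here is a minimal sketch, with invented numbers, of how you might check whether a difference in conversion rate between two form variants is bigger than chance alone would explain. None of these figures come from the agency’s actual data.

```python
# Two-proportion z-test: is variant B's conversion rate genuinely higher
# than variant A's, or plausibly just noise? All numbers are invented.
from math import sqrt, erf

def conversion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for the difference
    between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal approximation to the two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: the short form converted 40 of 2,000 visitors; the
# reinstated long form converted 70 of 2,000.
z, p = conversion_z_test(40, 2000, 70, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests a real difference
```

  The point is not the statistics; it is that “best practice” is a hypothesis like any other, and a quick test can tell you whether a redesign actually served your visitors.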

  At a glance, this story may not seem to have anything to do with physical place or connected experiences, but it underscores a fundamental truth in dealing with humans. Our need to be shown respect can override our appreciation for things like efficiency, simplicity, or even low cost. If an interaction demonstrates respect, or even a proxy for respect—such as relevance, discretion, or modeled empathy through an immersive inquiry form—we are likely to recognize and appreciate it. It’s easy to imagine what analogous connected experiences might look like. Perhaps messages or offers are targeted based on proximity alone, but they overstep the comfort a person has with the brand or service, and the interaction fails. Connected experiences have tremendous opportunities to bypass respect and overplay their hands, so to speak.

  Privacy and Data in Place

  In the larger conversation about the convergence of physical and digital experience, and given what we know about our digital selves being our aspirational selves, it’s critically important that we think about the role of privacy in experience design.

  It’s also worth considering that some new technologies come into our lives as opt-in, meaning we’re outside the system until we choose to opt into it. Others come into our lives as opt-out, meaning we’re in unless we say otherwise. Location tracking and targeting have largely entered public life in an opt-out model, in the sense that people buy sophisticated smartphones all the time, and in their early iterations, location tracking was either on by default or ambiguously worded. People were using them as intended, which means they’d turn on location services, enable apps to use location, and then not necessarily connect the dots that the push notifications they got in the mall were because of the data they were voluntarily sharing.

  Google was criticized for its location history in Android devices, mostly because the wording on the settings screen promised that the data was “anonymous” even though it was tied to the user’s Google account.

  Technologies like facial recognition present emerging privacy considerations and concerns, especially within the context of augmented reality and its potential widespread use.

  In 2014, there was a Google Glass app that could identify people’s emotions and gender based on their facial expressions and display this information in the user’s augmented view.26 There was even a surveillance video sharing program called Facewatch that integrated with security systems to identify people’s likely emotional states.27

  As ominous as some of this sounds, the underlying technology can be incredibly useful in the right context. In later sections of this book, we’ll explore using similar algorithms to tailor experiences and offers in ways people might consider worthwhile. The key, as a customer, as a citizen, and, yes, as a human, is that you perceive a sense of value (whether in the form of safety, convenience, or something else) in what you’re signing up for in exchange for what you’re disclosing.

  The Convenience—Privacy—Access Relationship

  People generally don’t want to be bothered by messages or experiences that don’t align with their existing motivations. And when asked, most will say that they don’t want to give up their privacy or personal data.

  But if you offer someone access to something restricted in exchange for their personal data, chances are, if it’s interesting enough to them, they’ll make the trade. There’s a catch, though: They have to be reasonably assured that their security won’t be compromised in making the trade.

  There’s a whole economy built around the principle of transparency in exchange for trust, of access in exchange for protection. Just because consumers are willing to give up information in the moment doesn’t mean they always will, doesn’t mean you’re asking for the right level of detail, doesn’t mean there won’t be legislation in the future that restricts what you can collect and use.

  Consumers need better education and better protection, and this is not that book. But businesses and organizations need better strategy to collect what is meaningful, use it respectfully, and guard it carefully. They need a plan for what they collect, based on a sense of what can provide insight and maximum alignment, in order to create the most meaningful experiences.

  That means thinking about the data model in new and different ways. It’s subtle, but it might mean thinking about the underlying relationship between entities in your database.

  There are strong relationships and weak relationships; some things need to be closely connected in the context of the business objectives overall, while other pieces of data are peripheral and incidental and can be modeled with separation to keep them from bogging down the data model.

  For example, you may be collecting patient information in one context that pertains to your ability to contact the patient and follow up on appointments and scheduling. In another context, you may be collecting their medical history, reported symptoms, and test results. These need to be related, but their relationship should be carefully considered so that access to the different contexts can be controlled and safeguarded. I once heard Noah Harlan, a mobile technology entrepreneur and digital strategist, helpfully distinguish privacy and security by saying, “Privacy is who has unauthorized access to your data; security is who has authorized access to your data.”
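  As a rough illustration of that separation, here is a minimal sketch assuming a relational store. The table names, columns, and two-database split are illustrative assumptions, not a prescribed design; the point is that the two contexts relate only through a shared identifier, so access to each can be controlled on its own terms.

```python
# Sketch: keeping contact/scheduling data and clinical data in separate
# stores, linked only by an opaque patient identifier. All names are
# illustrative, not taken from any real system.
import sqlite3

contact_db = sqlite3.connect("contact.db")    # scheduling staff access
clinical_db = sqlite3.connect("clinical.db")  # clinician access only

contact_db.execute("""
    CREATE TABLE IF NOT EXISTS patient_contact (
        patient_id TEXT PRIMARY KEY,  -- shared, opaque identifier
        name TEXT,
        phone TEXT,
        next_appointment TEXT
    )""")

clinical_db.execute("""
    CREATE TABLE IF NOT EXISTS patient_record (
        patient_id TEXT PRIMARY KEY,  -- same identifier links contexts
        history TEXT,
        symptoms TEXT,
        test_results TEXT
    )""")

# Joining the two contexts now requires deliberately querying both
# stores with the shared key -- an auditable act, rather than every
# column sitting side by side in one table, where a scheduling query
# can incidentally expose medical history.
```

  The strong relationship (identity) is preserved, while the weak, incidental relationships stay out of each context’s way.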

  The Pew Research Center’s report on customer privacy and data sharing demonstrates customers’ complicated attitudes toward the data trade-off. The big question is what constitutes “tangible benefits” for customers:

  While many Americans are willing to share personal information in exchange for tangible benefits, they are often cautious about disclosing their information and frequently unhappy about what happens to that information once companies have collected it. For example, when presented with a scenario in which they might save money on their energy bill by installing a “smart thermostat” that would monitor their movements around the home, most adults consider this an unacceptable tradeoff (by a 55% to 27% margin). As one survey respondent explained: “There will be no ‘SMART’ anythings in this household. I have enough personal data being stolen by the government and sold [by companies] to spammers now.28”

  This is why people are willing to use Google’s free email, despite a background level of awareness that their data might be used by Google. But when it comes to the data exchanged in physical spaces, people tend to get a little more creeped out.

  The Pew study further demonstrates that context matters to people in terms of their comfort level with the exchange of data for access and security.

  Certain realms are not inherently private and different rules about surveillance and sharing apply. Certain physical spaces or types of information are seen as inherently less private than others. One survey respondent noted how these norms influence his views on the acceptability of workplace surveillance cameras: “It is the company’s business to protect their assets in any way they see fit.”29

  Social media is used primarily for people to communicate with friends. Yet social media ranks at the bottom of trust rankings:

  Last week, while preparing a lecture, I searched quickly for some charts showing survey results for social media usage and privacy. The huge mismatch between what people want and what they are getting today was stunning. Social media I learned are used mainly for communicating with relatives and close friends with the next most significant use relating to political discourse. Yet social media was at the bottom of the trust rankings, with only 2 percent confidence. About 70 percent of respondents said they are very concerned about privacy and protection of their data.30

  The problem with the model is that online advertising in particular relies heavily on transparent customer data. Once again, as the saying goes, if you’re not paying for the product, you are the product. Consumers increasingly do know this, and where there are privacy concerns, some are taking precautions. Some might set up fake email addresses and social media profiles to take advantage of certain offers without the risk of compromising their “real” data stores.

  This complicates the advertising-supported business model, which relies on targeting offers to people who show an affinity for a certain idea or product. Security expert Tim Hwang argues,

  The issue with privacy in particular is that the predominant business model for many of the companies that hold some of the most sensitive data online is based on advertising as the primary revenue model, which incentivizes companies to collect more data so they can use data more aggressively to promote adverts, to make more money and stay alive. Sometimes these companies will use the data in ways that their users may be uncomfortable with and one of the key challenges they face is how to balance advertisers and users, and the privacy implications of how they draw the line between the two.31

  One of the less-cited but most valuable reasons for companies to strive for a meaningful approach to data collection and use is because it fosters an honest, authentic relationship between company and customer. The company that aligns its messaging and experiences with its motivations and the customers’ motivations is setting the stage for a connection that promotes trust, and the customer who trusts the safety of the interaction is not guarded or misleading in their intentions. The customer data that is collected under those circumstances is bound to be cleaner and more informative.

  Or, to return to Tim Hwang:

  Authenticity is the common goal, both for users and for platforms. Authenticity is when you have struck the right balance of data collection and security measures. Users want to be authentic online, they want to feel like they are not constrained because they are being surveilled and they can’t act in a way that they want. Platforms also want that because it provides them with valuable data. The data of a consumer that is actively hiding things from you is not going to be very useful data, particularly from an advertising standpoint.32

  The thing is, we may say we’re creeped out by data and technology, but if it’s used to create compelling experiences, to anticipate our needs, to deliver value, and to do so with a sense of safety and privacy, very few of us will object.

  So the mission is to develop an approach to handling consumer data, patient data, student data—all of it—in a way that offers the most value to the person while bringing the most value to your business. That’s going to take some strategic planning. It’s going to take an understanding of the underlying framework that drives those interactions.

  We’ll get there by looking more closely in the next chapter at how to think about place.

  CHAPTER SEVEN

  Metaphors: Digital Experience Through the Lens of Place

  Thinking about digital experiences as virtual places is not new; many of the metaphors we use to describe online functions and interactions have been borrowed from the physical world. But in applying these terms, like traffic or page, we carry over metaphorical associations in subtle ways. Examining these metaphors and their associations can give us new insights into what we’ve come to expect from digital spaces and open up opportunities to create more integrated, connected experiences. We’ll look more closely at them throughout this next section.

 
