The web promises to make our world bigger. But as it works now, it also narrows our exposure to ideas. We can end up in a bubble in which we hear only the ideas we already know. Or already like. The philosopher Allan Bloom has suggested the cost: “Freedom of the mind requires not only, or not even especially, the absence of legal constraints but the presence of alternative thoughts. The most successful tyranny is not the one that uses force to assure uniformity, but the one that removes awareness of other possibilities.”
Once you have a glimmer—and you only need a glimmer—of how this works, you have reason to believe that what the web shows you is a reflection of what you have shown it. So, if anti-abortion advertisements appear on your social media newsfeed, you may well ask what you did to put them there. What did you search or write or read? Little by little, as new things show up on the screen, you watch passively while the web actively constructs its version of you.
Karl Marx described how a simple wooden table, once turned into a commodity, danced to its own ghostlike tune. Marx’s table, transcendent, “not only stands with its feet on the ground . . . it stands on its head, and evolves out of its wooden brain, grotesque ideas far more wonderful than ‘table-turning’ ever was.” These days, it is our digital double that dances with a life of its own.
Advertising companies use it to build more targeted marketing campaigns. Insurance companies use it to apportion health benefits. From time to time, we are startled to get a view of who the algorithms that work over our data think we are. Technology writer Sara Watson describes such a moment. One day, Watson receives an invitation, a targeted advertisement, to participate in a study of anorexia in a Boston-area hospital. Watson says, “Ads seem trivial. But when they start to question whether I’m eating enough, a line has been crossed.”
Watson finds the request to participate in the anorexia study personal and assaultive, because she is stuck with the idea that she made the invitation appear. But how? Is the study targeting women with small grocery bills? Women who buy diet supplements? We are talking through machines to algorithms whose rules we don’t understand.
For Watson, what is most disorienting is that she doesn’t understand how the algorithm reached its conclusion about her. And how can she challenge a black box? For the algorithms that build up your digital double are written across many different platforms. There is no place where you can “fix” your double. There is no place to make it conform more exactly to how you want to be represented. Watson ends up confused: “It’s hard to tell whether the algorithm doesn’t know us at all, or if it actually knows us better than we know ourselves.” Does the black box know something she doesn’t?
In conversations with others over a lifetime, you get to see yourself as others see you. You get to “meet yourself” in new ways. You get to object on the spot if somebody doesn’t “get you.” Now we are offered a new experience: We are asked to see ourselves as the collection of things we are told we should want, as the collection of things we are told should interest us. Is this a tidier version of identity?
Building narratives about oneself takes time, and you never know if they are done or if they are correct. It is easier to see yourself in the mirror of the machine. You have mail.
Thinking in Public
Thoreau went to Walden to try to think his own thoughts, to remove himself from living “too thickly”—how he referred to the constant chatter around him in society. These days, we live more “thickly” than Thoreau could ever have imagined, bombarded by the opinions, preferences, and “likes” of others. With the new sensibility of “I share, therefore I am,” many are drawn to the premise that thinking together makes for better thinking.
Facebook’s Zuckerberg thinks that thinking is a realm where together is always better. If you share what you are thinking and reading and watching, you will be richer for it. He says that he would always “rather go to a movie with [his] friends” because then they can share their experience and opinions. And if his friends can’t be there physically, he can still have a richer experience of the movie through online sharing. Neil Richards, a lawyer, cross-examines this idea. Always sharing with friends has a cost.
It means we’ll always choose the movie they’d choose and won’t choose the movie we want to see if they’d make fun of it. . . . If we’re always with our friends, we’re never alone, and we never get to explore ideas for ourselves. Of course, the stakes go beyond movies and extend to reading, to web-surfing, and even thinking.
And even thinking. Especially thinking. One student, who was used to blogging as a regular part of her master's program, changed styles when she changed universities and began her doctoral studies. In her new academic program blogging was discouraged. Looking back, she says that the pressure to continually publish led her to think of herself as a brand. She wanted everything she wrote to conform to the identity she had already established. And blogging encouraged her to write about what she could write about best. It discouraged risk taking. Now, writing privately, she feels more curious. Research shows that people who use social media are less willing to share their opinions if they think their followers and friends might disagree with them. People need private space to develop their ideas.
Generations of Americans took as self-evident the idea that private space was essential to democratic life. My grandmother had a civics lesson ready when she talked about the privacy of my library books. In order to be open to the widest range of ideas, I had to feel protected when making my reading choices. “Crowdsourcing” your reading preferences, says Richards, drives you “to conformity and the mainstream by social pressures.”
Objects-Not-to-Think-With
Cognitive science has identified several qualities that make it easy not to think about something you probably don't want to think about anyway. You don't know when it is going to "happen." You don't know exactly what it means for it to "happen." And there is no immediate cause and effect between actions you might take and consequences related to the problem.
So, for example, if you don’t want to think about climate change, you are able to exploit the psychological distance between a family vacation in an SUV and danger to the planet. A similar sense of distance makes it easy to defer thinking about the hazards of “reading in public,” the risks of living with a digital double, and threats to privacy on the digital landscape.
Here is Lana, a recent college graduate, thinking aloud about how she doesn’t think about online privacy:
Cookies? I think that companies make it hard to understand what they are really doing. Even calling them cookies seems pretty brilliant. It makes it sound cute, like it’s nothing. Just helpful to you. Sweet. And it is helpful to get better ads or better services for the things you want. But how do they work and what are they going to do with all that they know about you? I don’t know and I don’t like where this is going. But I’m not going to think about this until something really bad happens concretely.
Lana is uneasy that data are being collected about her, but she’s decided that right now she’s not going to worry about it. She says that when she was younger she was “creeped out” by Facebook having so much information about her, but now she deals with her distrust of Facebook by keeping her posts light, mostly about parties and social logistics. She doesn’t want what she puts on Facebook “coming back to haunt me.”
More than this, Lana says, she "is glad not to have anything controversial on my mind, because I can't think of any online place where it would be safe to have controversial conversations." And she would only want to have such conversations online, because that is where she is in touch with all her friends. Lana describes a circle that encourages silence: If she had controversial opinions she would express them online, so it's good that she has none, because what she would say would not be private in this medium. In fact, Lana's circle has one more reinforcing turn: She says it's good that she has nothing controversial to say because she would be saying it online and everything you say online is kept forever. And that is something she doesn't like at all.
I talk to Lana shortly after her graduation from college in June 2014. In the news are manifestations of disruptive climate change, escalating wars and terrorism, the limitations of the international response to the Ebola epidemic, and significant violence due to racial tensions. There is no lack of things to communicate about “controversially.” Yet this very brilliant young woman, beginning a job in finance, is relieved not to have strong opinions on any of this because her medium for expressing them would be online and there is no way to talk “safely” there.
But Lana does not say that she finds any of this a problem. It would be inconvenient to label it that way. If you say something is a problem, that suggests you should be thinking about changing it and Lana is not sure that this is the direction she wants to take her feelings of discontent, at least not now. Right now, as for many others, her line is that “we all are willing to trade off privacy for convenience.”
She treats this trade-off as arithmetic—as if, once it’s calculated, it doesn’t need to be revisited.
Vague on the Details
When I talk to young people, I learn that they are expert at keeping "local" privacy—privacy from each other when they want to keep things within their clique, privacy from parents or teachers who might be monitoring their online accounts; here they use code words and a blizzard of acronyms. But as for how to think about private mindspace on the net, most haven't thought much about it and don't seem to want to. They, like the larger society, are, for the most part, willing to defer thinking about this. We are all helped in this by staying vague on the details.
And the few details we know seem illogical or like half-truths. It is illegal to tap a phone, but it is not illegal to store a search. We are told that our searches are “anonymized,” but then, experts tell us that this is not true. Large corporations take our data, which seems to be legal, and the government also wants our data—things such as what we search, whom we text, what we text, whom we call, what we buy.
And it’s hard to even learn the rules. I am on the board of the Electronic Frontier Foundation, devoted to privacy rights in digital culture. But it was only in spring 2014 that an email circulated to board members that described how easy it is to provoke the government to put you on a list of those whose email and searches are “fully tracked.” For example, you will get on that list if, from outside the United States, you try to use TOR, a method of browsing anonymously online. The same article explained that from within the United States, you will also activate “full tracking” if you try to use alternatives to standard operating systems—for example, if you go to the Linux home page. It would appear that the Linux forum has been declared an “extremist” site.
One of my graduate research assistants has been on that forum because she needed to use annotation software that ran only on Linux. When she reads the communiqué about Linux and full tracking, she is taken aback, but what she says is, “Theoretically I’m angry but I’m not having an emotional response.” According to the source we both read, undisputed by the NSA, the content of her email and searches is surveilled. But still, she says, “Who knows what that means. Is it a person? Is it an algorithm? Is it tracking me by my name or my IP address?”
Confused by the details, she doesn’t demand further details. Vague understandings support her sense that looking into this more closely can wait. So does the idea that she will be blocked or perhaps singled out for further surveillance if she tries to get more clarity.
One college senior tells me, with some satisfaction, that he has found a way around some of his concerns about online privacy. His strategy: He uses the “incognito” setting on his web browser. I decide that I’ll do the same. I change the settings on my computer and go to bed thinking I have surely taken a step in the right direction. But what step have I taken? I learn that with an “incognito” setting I can protect my computer from recording my search history (so that family members, for example, can’t check it), but I haven’t slowed down Google or anyone else who might want access to it. And there is the irony that articles on how to protect your privacy online often recommend using TOR, but the NSA considers TOR users suspect and deserving of extra surveillance.
I come to understand that part of what sustains apathy is that people think they are being tracked by algorithms whose power will be checked by humans with good sense if the system finds anything that might actually get them into trouble. But we are in trouble together. Interest in Linux as probable cause for surveillance? We’re starting not to take ourselves seriously.
My research assistant says she’s not worried about her data trail because she sees the government as benign. They’re interested in terrorists, not in her. But I persist. Now that my assistant knows she is subject to tracking because of her activity on the Linux forum, will it have a chilling effect on what she says online? Her answer is no, that she will say what she thinks and fight any attempt to use her thoughts against her “if it should ever come to that.” But historically, the moments when “it came to that” have usually been moments when it has been hard or too late to take action.
I recall how Lana summed up her thoughts about online privacy: She said she would worry about it “if something bad happens.” But we can turn this around and say that something bad has happened. We are challenged to draw the line, sometimes delicate, between “personalization” that seems banal (you buy shoes, so you see ads for shoes) and curation that poses larger questions.
In the 2010 midterm elections, Facebook looked at random precincts and got people to go to the polls by telling them that their friends had voted. This political intervention was framed as a study, with the following research question: Can social media affect voter turnout? It can. Internet and law expert Jonathan Zittrain has called the manipulation of votes by social media “digital gerrymandering.” It is an unregulated threat. Facebook also did a study, a mood experiment, in which some people were shown posts from happy friends and some people were shown posts from unhappy friends to see if this changed their moods. It did. Social media has the power to shape our political actions and emotional lives. We’re accustomed to media manipulation—advertising has always tried to do this. But having unprecedented kinds of information about us—from what medications we take to what time we go to bed—allows for unprecedented interventions and intrusions. What is at stake is a sense of a self in control of itself. And a citizenry that can think for itself.
Snowden Changes the Game
I have been talking to high school and college students about online privacy for decades. For years, when young people saw the “results” of online data collection, chiefly through the advertisements that appeared on their screens, it was hard for them to see the problem. The fact that a desirable sneaker or the perfect dress popped up didn’t seem like a big deal. But in the years since Edward Snowden’s revelations about how the government tracks our data, young people are more able to talk about the problems of data mining, in some measure because it has become associated (at least in their minds) with something easier to think about: spying. What Snowden was talking about seemed enough like old-fashioned spying that it gave people a way into a conversation about the more elusive thing: the incursions of everyday tracking.
So, after the Snowden revelations, high school students would begin a conversation about Snowden and then pivot to “Facebook knowing too much.” What did Facebook know? What did Facebook keep? And had they really given it permission to do all this?
Or they would begin a conversation by talking about how they were trying to stay away from Facebook, now a symbol of too much online data collection, and then pivot to Snowden. A different set of issues entirely, but Snowden gave them a handle on their general sense of worry. The worry, in essence: How much does the Internet “know” and what is the Internet
going to do about it? After Snowden, the helpful ads on their screens had more of a backstory. Someone, many someones, knows a lot more about them than their sneaker preferences.
And yet it is easy for this conversation to slip away from us. Because just as we start to have it, we become infatuated with a new app that asks us to reveal more of ourselves: We could report our moods to see if there are concerns to address. We could track our resting heart rate or the amount of exercise we get each week. So we offer up data to improve ourselves and postpone the conversation about what happens to the data we share. If someday the fact that we were not careful about our diet in our forties is held against us—when it comes to giving us an insurance rate in our fifties—we will have offered up this data freely.
Instead of pursuing the political conversation, we sign up for another app.
Technology companies will say—and they do—that if you don’t want to share your data, don’t use their services. If you don’t want Google to know what you are searching, don’t search on Google. When asked to comment on all that Google knows, its executive chairman said, in essence, that “the way to deal is to just be good.”
I have felt for a long time, as a mother and as a citizen, that in a democracy, we all need to begin with the assumption that everyone has something to “hide,” a zone of private action and reflection, a zone that needs to be protected despite our techno-enthusiasms. You need space for real dissent. A mental space and a technical space (those mailboxes!). It’s a private space where people are free to “not be good.” To me, this conversation about technology, privacy, and democracy is not Luddite or too late.
Reclaiming Conversation Page 33