The Reality Bubble
But it’s not just what we post in public that’s monitored. It’s what we look at privately as well: our searches, our likes, our posts, and our ad clicks are just a part of the digital trail we leave behind. And while these cookie crumbs may be forgotten by us after an hour online, what we have said or thought is not lost at all; it is all saved as demographic and psychographic data on the servers of the sites and apps we’ve visited and used.
IBM has estimated that each day, the average person leaves behind a five-hundred-megabyte digital footprint. That was in 2012, ancient history compared to the data collected now. With people and objects increasingly connected to the internet through Fitbits, smartwatches, and connected homes, for example, IBM’s estimate is likely a fraction of the digital footprint we leave behind today. That’s because the vast majority of the world’s information—some say up to 90 percent—was created in the last two years. According to one study, the digital universe will contain forty-four trillion gigabytes of data by 2020, a total output of at least 5,200 gigabytes for each person on Earth. As Stanford University professor Michal Kosinski puts it, if you imagine just one day of humanity’s data printed out on paper, double-sided in twelve-point type, the stack of paper would extend to the sun and back four times. That is a mind-boggling amount of data.
And what value does this data have? According to The Economist, “The world’s most valuable resource is no longer oil, but data.” In the first quarter of 2017, Amazon, Apple, Facebook, Google, and Microsoft netted $25 billion in profit. Amazon alone accounted for half of all online dollars spent in America. Our data is valuable because that’s how we are targeted as consumers. From the moment you turn on your computer or smartphone and begin to browse the internet, you are being invisibly followed. The new economic model is what Harvard Business School professor Shoshana Zuboff calls “surveillance capitalism.” As she writes, “The game is selling access to the real-time flow of your daily life—your reality—in order to directly influence and modify your behavior for profit. This is the gateway to a new universe of monetization opportunities: restaurants who want to be your destination. Service vendors who want to fix your brake pads. Shops who will lure you like the fabled Sirens.”
This is all done through personal data markets, the new primary business model for over a thousand companies. While we often hear of the business model of the FAANGs (the acronym given to the group that includes Facebook, Apple, Amazon, Netflix, and Google), Sarah Spiekermann, co-chair of the Institute of Electrical and Electronics Engineers’ standard on ethics in IT design, reminds us that it’s not “just Facebook and Google, Apple or Amazon that harvest and use our data….Data management platforms (DMP) such as those operated by Acxiom and Oracle BlueKai possess thousands of personal attributes and socio-psychological profiles about hundreds of millions of users.” From the moment we go online, a vast and sprawling apparatus swings into action, harvesting and sending our profiles to distant servers so that a tailored ad can be sent to us in milliseconds.
Unless you are in the advertising profession, you most likely have not heard of real-time bidding, or RTB, but it is how approximately 98 percent of all available ad space is sold on the internet. The automated bidding platform is similar to the Nasdaq, but instead of buying and selling stocks, it buys and sells you and me, or more specifically our data. For marketers, this data is digital gold.
This is how it works. Data management platforms house and aggregate what’s known as first-, second-, and third-party data. In its simplest form, first-party data is data that comes from a website’s own source—that is, its own visitors’ browsing and purchasing data and demographics. Second-party data is consumer data that is siphoned from a website’s partner through partnership agreements. And third-party data comes from any number of aggregate outside sources that bundle and sell our data together.
Next the data is crunched and segmented into “audiences” to provide detailed markets: say a male sports fan, aged eighteen to twenty-five, who lives in the Toronto area and has made over $500 in online purchases within the last thirty days. That data is then exported and made available for real-time bidding. When the ad exchange is searching for a twenty-year-old male in the Toronto area whose regular searches include “hockey,” this ID will pop up, allowing a sports e-tailer (a retailer selling goods on the internet) to bid on his impression so that their ad for hockey sticks shows up on websites he visits.
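The audience segmentation described above amounts to filtering aggregated profiles against a set of rules. Here is a minimal sketch of that idea; the field names, thresholds, and profiles are all invented for illustration and do not reflect any real data management platform’s schema.

```python
# Toy user profiles, as a data management platform might aggregate them
# from first-, second-, and third-party sources. All fields are hypothetical.
users = [
    {"id": "u-1042", "gender": "male", "age": 20, "region": "Toronto",
     "interests": {"hockey", "sneakers"}, "spend_30d": 640.00},
    {"id": "u-2177", "gender": "female", "age": 34, "region": "Vancouver",
     "interests": {"cooking"}, "spend_30d": 120.00},
]

def in_segment(user):
    """Match the example audience: a male sports fan, aged 18 to 25,
    in the Toronto area, with over $500 in online purchases in 30 days."""
    return (user["gender"] == "male"
            and 18 <= user["age"] <= 25
            and user["region"] == "Toronto"
            and "hockey" in user["interests"]
            and user["spend_30d"] > 500)

# The matching IDs are what would be exported for real-time bidding.
audience = [u["id"] for u in users if in_segment(u)]
print(audience)  # → ['u-1042']
```

In practice a platform would evaluate thousands of attributes rather than five, but the principle is the same: a profile either falls inside the audience definition or it doesn’t.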
As for the specifics, what kicks into action is the demand side platform. As Pete Kluge, the group manager of product marketing for the Adobe Advertising Cloud, describes it, the “platform bids on each individual impression through RTB based on what is known about the user…so once that bid request comes from an ad exchange the demand side platform evaluates all the data that is known about that user and then determines the right price to bid for that individual user impression.” This all takes place at lightning-fast speed and is almost imperceptible; from the moment you begin browsing to the time it takes for the ad to be served is approximately ten milliseconds.
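The valuation step Kluge describes—looking up what is known about a user and pricing the impression accordingly—can be imagined as a simple scoring function. This is a sketch under stated assumptions: the weights, prices, and profile fields are invented, and real demand side platforms use far richer models.

```python
# A toy demand side platform: score an incoming bid request against
# what is "known" about the user and decide a bid price in real time.
BASE_BID = 0.50  # starting price per impression, hypothetical

def evaluate_bid(bid_request, known_profile, max_bid=4.00):
    """Return a bid price for one impression, or 0.0 to pass."""
    if known_profile is None:
        return 0.0  # nothing known about this user: don't bid
    price = BASE_BID
    # Each matching signal raises the price we're willing to pay.
    if bid_request["category"] in known_profile["interests"]:
        price += 1.50
    if known_profile["recent_spend"] > 500:
        price += 1.00
    if known_profile["region"] == bid_request["site_region"]:
        price += 0.50
    return min(price, max_bid)

request = {"category": "hockey", "site_region": "Toronto"}
profile = {"interests": {"hockey"}, "recent_spend": 640, "region": "Toronto"}
print(evaluate_bid(request, profile))  # → 3.5
```

The entire auction—request, lookup, scoring, and response—has to fit inside the roughly ten-millisecond window mentioned above, which is why the logic is precomputed and kept this mechanical.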
Our data is not only hugely valuable for the surveillance economy, it is a veritable bonanza for political purposes as well. In March 2018, Christopher Wylie, a former director of research at Cambridge Analytica, came forward to reveal that the company had harvested the Facebook profiles of eighty-seven million people, whose data was used to sway both the United Kingdom’s Brexit referendum and the 2016 US election.
All of this began as a simple offer on Mechanical Turk, an Amazon platform that allows users to make small amounts of money from participating in mundane tasks like web surveys, labelling objects in images, or watching internet videos. In this instance, the workers were asked to install a personality quiz app called “This Is Your Digital Life” for $1 to $2 per download. The requirement was simple: all you had to do was take the personality test while logged into your Facebook account. Over 270,000 people downloaded the quiz, approximately 32,000 of whom were US voters.
Users were told that the app was “a research tool used by psychologists,” but it didn’t just gather information on each test-taker who downloaded it. It also snaked through their friends’ accounts and their entire digital community employing an algorithm that targeted hundreds of data points per person. The data was then used to create micro-targeted psychological profiles. As an article in The Guardian noted, it was the digital footprints that Cambridge Analytica was after: “The algorithm…trawls through the most apparently trivial, throwaway postings—the ‘likes’ users dole out as they browse the site—to gather sensitive personal information about sexual orientation, race, gender, even intelligence and childhood trauma.”
Users were broken down into types—fearful, impulsive, or open, for instance—and political messages were created to target those specific attributes and begin to alter voter behaviour en masse. The goal of Cambridge Analytica was to persuade the undecided by using their psychology against them. Ultimately, they hijacked user data so that they could “not only read minds but change them.” That’s because, just as social media “likes” are used by advertisers to sell products to us, our profiles can also be used to learn how we can best be targeted. As Wylie suggests, while people may present different versions of themselves to their families, friends, or bosses, computers are neutral; they capture the digital trail of all of our personas. And because they have this complete picture, Wylie says, “computers are better at understanding who you are as a person than even your co-workers or your friends.”
In the digital age, our lives have become open books. Users outside the United States have for some time had the ability to find out what Facebook knows about them. The result varies from person to person, depending on how long they’ve been on Facebook and how active they are. Austrian law student Max Schrems tested this out by requesting all of his personal Facebook data. He received over 1,200 pages, a literal book of data, “including old chats, pokes, and material that had been deleted years before.” The term “book” is a benign way of talking about our data; it makes it sound nice, kind of like a digital autobiography. It may be more helpful to call it what it really is: what each of us has on file is a record.
At least with social media companies, to some degree you opt in. Even if people do not know how their data is being used or what for, they have a sense that there is a trade-off, that they are giving something up to be able to use the service for free. Children, on the other hand, increasingly do not have that option. Since 2014, concerned parents in Colorado have been fighting what’s known as the Golden Record, essentially a detailed data profile of their child that starts in preschool and follows them all the way through college. That record is a data pipeline that charts information and behaviour over the student’s entire academic career: from test scores and attendance to family financial information, demographics, learning disabilities, mental health issues, and remedial courses and counselling or other interventions.*9 So what happens to all that data?
Education department officials say that its purpose is to “help guide parents, teachers, schools, districts and state leaders, as we work together to improve student achievement so all children graduate ready for college and career.” This would sound innocent enough were it not for the fine print, which says that the data is also made available to “contracted vendors with signed privacy obligations for specified applications.”
In a 2013 video, Dan Domagala, the chief information officer of the Colorado Department of Education, stated that the longitudinal information can be shared in a “hub and spoke approach” and connected with other state agencies, including Human Services and the Department of Corrections. While the Department of Education insists that the data is aggregated and no identifying information on students is given, this is of no consolation to the parents. As one parent noted, all of this extensive information collected on her own child is inaccessible to her, and “no parent has ever been able to access their child’s Golden Record.”
The Eyes on Our Bodies
THE POLICE ARRIVED at Sylvan Abbey Funeral Home unannounced. The previous month, in March 2018, officers had shot thirty-year-old Linus Phillip Jr. at a gas station. They had pulled him over for having tinted windows. The officers said they also found drugs in the car and that when Phillip tried to flee during the search, they shot him four times. Now the officers had come to the funeral home because they wanted access to his phone. They demanded that Phillip’s corpse be pulled out of cold storage and used his finger to try to unlock his iPhone.
The attempt did not work, likely because the iPhone’s fingerprint sensor reverts to a passcode after forty-eight hours. But the family was outraged by the brutal invasion of privacy. For Phillip’s fiancée, not only had her partner been wrongfully killed, he was now being “disrespected and violated” at the funeral home. The Florida police, however, had a more clinical approach. According to the law, they said, the dead did not have privacy rights. This is true. Using dead people’s fingerprints in police work is fairly routine.*10
To protect the public, security experts have been working on creating an “un-hackable” biometric system. For many, this is the holy grail, but to date every effort to create biometric security has been defeated. That’s because when the human body becomes the password, someone will find a way to crack it. In Malaysia, one gang was more literal than most when it came to its hacking method. To steal a Mercedes-Benz that required a fingerprint scan to start the engine, the thieves hacked off the owner’s index finger with a machete so they could drive the car away. There are, of course, subtler (and thankfully less violent) methods. At Michigan State University, experts have taken 2-D fingerprints, printed them with conductive ink so that they mimicked the electrical properties of skin, and used them to unlock phones. Every year, however, the biometric readers get more complex. The latest fingerprint readers use infrared scanners to detect vein patterns under the skin’s surface and require active blood circulation to certify a person’s identity.
In the Reality Bubble, the average person may not know much about how the system works, but the goal of the system is to know everything it possibly can about the average person. Biometrics is the final frontier of surveillance: the human body. It allows machines to read and identify human beings. Fingerprints are not just required of criminals or convicts anymore, because in a sense we are all suspects now. Fingerprints are used to unlock phones, provide access to airport lounges, and allow stock traders to use their mobile apps. Other parts of our bodies are being mapped and entered into databases as well. The NEXUS system uses iris scanners for trusted travellers to clear customs quickly. MasterCard offers facial recognition so that people can “snap a selfie” and shop with their face. Banks use voice recognition to verify account holders. And now even our silence is under surveillance: you can be identified and tracked by the way you breathe. Scientists have found ways to “fingerprint” inhalations and exhalations, because our vocal passageways and lung capacities are individual to each of us. It is possible to use an algorithm to match a person to their “intervocalic breath sounds” over the phone, meaning we can be monitored and identified even if we don’t speak.
The big question is, why are there eyes above us, around us, in our minds, and on our bodies? Why as a society are we being so deeply and intrusively surveilled?
IBM was the first company to use its tabulator machines to sort people into groups. In 1933, the company formed a twelve-year partnership with the Nazis, and its simple punch cards, called Hollerith cards, were used to organize and segregate concentration camp prisoners. As Edwin Black, author of IBM and the Holocaust, writes,
The codes show IBM’s numerical designation for various camps. Auschwitz was 001, Buchenwald was 002; Dachau was 003, and so on. Various prisoner types were reduced to IBM numbers, with 3 signifying homosexual, 9 for anti-social, and 12 for Gypsy. The IBM number 8 designated a Jew. Inmate death was also reduced to an IBM digit: 3 represented death by natural causes, 4 by execution, 5 by suicide, and code 6 designated “special treatment” in gas chambers. IBM engineers had to create Hollerith codes to differentiate between a Jew who had been worked to death and one who had been gassed.
Even this early system was subject to hacking. And the first ethical hacker was René Carmille, the comptroller general of the French Army, who headed up the French census before the Germans invaded. The Germans instructed Carmille to input census data into IBM machines and have it analyzed to produce a full list of Jews living in France. Carmille and his team had a different idea. They hacked the punch card machines so that data could not be entered for the column that specified religion. His sabotage worked until 1944, when the Nazis discovered the plot. Carmille was tortured and sent to the Dachau concentration camp, and he died shortly thereafter.
As digital law expert Heather Burns notes, this one small hack to the system had a lasting legacy: “In the Netherlands, 73% of Dutch Jews were found, deported, and executed. In France, that figure was 25%. It was that much lower because they couldn’t find them. They couldn’t find them because René Carmille and his team got political and hacked the data.”
Data is how we segment and survey society. David Lyon, author of Identifying Citizens, makes the case clearly. Why are we turning people into data? Because “identification is the starting point of surveillance.” It allows systems to sort people into groups that can be analyzed, classified, and, according to whatever data has been collected, rewarded or discriminated against.
The philosopher and economist David Hume famously remarked in 1741 that “nothing appears more surprising…than the easiness with which the many are governed by the few.” And it is surprising. We take orders because our lives have already been ordered in ways that we rarely think about. Every morning, millions of people seemingly automatically get in their cars to commute to work, because time has become so ingrained in our minds that we do not question it; we simply obey it. In a similar vein, imagine how baffled First Nations people were when they first saw that American troops stopped at the Medicine Line, the Canadian border at the forty-ninth parallel, while they could pass through. To their eyes, it was as though the troops had to stop because they were possessed by magic. But the effect was not so much magic as it was magical thinking. Today we all obey these imaginary lines because the importance of borders has been ingrained. As Yuval Noah Harari puts it in Sapiens, “People are willing to do such things when they trust the figments of their collective imagination.” But these systems are no longer just imagined; like Pinocchio, the creations are becoming real.
If, in the past, the day-to-day march of orders and controls was accomplished through hegemony and the acquiescence of our beliefs, today our collective beliefs about how clocks and borders divide up the dimensions of the world have been formalized into a grid by which our lives are plotted. This is the infamous “system” that is everywhere yet remains hidden from view. Here, our measurements of time and space have entered the real world as data, and our digital doppelgängers are placed under constant surveillance; we have become the “dots” on the grid, leaving a digital trail behind. In this virtual world, all of us are being followed. But if there is one element missing from the grid, it is our own flesh and blood. To truly integrate our bodies with the system, we too must become data. Biometrics provides that digital leash, allowing our real, physical bodies to be monitored and controlled in the real, physical world.
The Eyes with Their Own Mind
FOR THE POLICE, searching for Mr. Ao was like searching for a needle in the proverbial haystack. Over fifty thousand people were buzzing with excitement at Jacky Cheung’s concert in the city of Nanchang in East China, and there, huddled in the crowd, Ao and his wife felt safe. But not long after the couple arrived, facial recognition cameras began scanning the pop concert arena. Moments later, Ao was spotted and arrested. The newspapers reported that he was taken away and charged with committing unspecified “economic crimes.”