Black Code: Inside the Battle for Cyberspace
In 2002, Mark Klein, a twenty-year veteran technician with AT&T, was working at an IXP in San Francisco. He became suspicious after noticing some unusual activity in a “secure room” marked 641A. Klein was working in an adjacent area and had been instructed to connect fibre-optic cables to cables exiting from the secure room. He was not allowed to enter the room, and the people there were not the type of workers with whom Klein enjoyed lunch and coffee breaks. They kept to themselves and seemed to have special privileges. Later, Klein learned from his colleagues that similar operations were observed by engineers at other AT&T facilities across the United States.
Klein’s suspicions eventually led to a class action lawsuit by the Electronic Frontier Foundation (EFF) against AT&T, alleging that the company had colluded with the National Security Agency (NSA) outside of the rule of law. As it turned out, inside room 641A was a data-mining operation involving a piece of equipment called Narus STA 6400, known to be used by the NSA to sift through large streams of data. The choice of location was significant. Because of the complex routing arrangements that govern the flow of traffic through cyberspace, many smaller ISPs sublease their traffic through AT&T – a globe-spanning “Tier 1” telecommunications company – and a large proportion of global communications traffic flows through its pipes. The AT&T-operated IXP in San Francisco is one of the world’s most important chokepoints for Internet communications.
The IXP is a chokepoint not only for international traffic; it handles a large volume of domestic U.S. communications as well. The NSA is prohibited from collecting communications of American citizens, and the data-mining operation at the AT&T facility strongly suggested that this prohibition was being ignored. The EFF class action lawsuit took AT&T and another IXP operator, Verizon, to task for their complicity with what turned out to be a presidential directive instructing the NSA to install the equipment at key IXPs in order to monitor the communications of American citizens. In 2008, as the lawsuit dragged on, the Bush administration took pre-emptive action by introducing a controversial amendment to the Foreign Intelligence Surveillance Act (FISA), giving telecommunications companies retroactive immunity from prosecution if the attorney general certified that surveillance did not occur, was legal, or was authorized by the president. This certification was filed in September 2008, and shortly thereafter the EFF’s case was dismissed by a federal judge citing the immunity amendment. (Presidential candidate Barack Obama surprised many of his supporters by backing the FISA Amendments Act, and his administration has vigorously blocked court challenges against it ever since.) Although the full scope of the NSA’s warrantless wiretapping program (code-named “Stellar Wind”) is classified, William Binney, a former NSA employee who left the agency in protest, estimates that up to 1.5 billion phone calls, as well as voluminous flows of email and other electronic data, are processed every day by the eavesdropping system Klein stumbled upon.
IXmaps, a research project at the University of Toronto, raises awareness about the surveillance risks of IXPs, particularly for Canadians. The project uses trace-routing technology to determine the routes that discrete bits of information (or “packets”) take to reach their destination over the Internet. In one example, IXmaps detailed the route of an email destined for the Hockey Hall of Fame in downtown Toronto and originating at the University of Toronto a few miles away. The email crossed into the United States, was peered at an IXP in Chicago, and was probably exposed to one of the NSA’s warrantless surveillance systems rumoured to be located at the facility. Known as boomerang traffic, this type of cross-border routing is a function of the fact that there are eighty-five IXPs in the U.S. but only five in Canada. Routing arrangements made by Canadian ISPs and telecommunications companies routinely pass traffic into the U.S. and back into Canada to save on peering costs, subjecting otherwise internal Canadian communications to extraterritorial monitoring.
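The mechanics behind such a trace are simple enough to sketch. What follows is a minimal, illustrative traceroute in Python, in the spirit of what IXmaps automates (it is not the project’s own code): it assumes the third-party scapy library, root privileges for raw sockets, and a placeholder destination host.

    from scapy.all import IP, ICMP, sr1  # requires "pip install scapy" and root

    def trace(dst, max_hops=30):
        """Return the address of each router on the path to dst."""
        hops = []
        for ttl in range(1, max_hops + 1):
            # A router that decrements the TTL to zero answers with an ICMP
            # "time exceeded" message, revealing its own address.
            reply = sr1(IP(dst=dst, ttl=ttl) / ICMP(), timeout=2, verbose=0)
            if reply is None:
                hops.append("* (no reply)")
                continue
            hops.append(reply.src)
            if reply.type == 0:  # ICMP echo reply: the destination was reached
                break
        return hops

    # "example.org" is a placeholder; geolocating each hop, as IXmaps does,
    # would reveal whether a Toronto-to-Toronto email boomerangs through
    # an exchange point in Chicago.
    for i, hop in enumerate(trace("example.org"), start=1):
        print(i, hop)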
• • •
One of the long-standing myths about cyberspace is that it is highly resilient to disruption. For those of us who have laboured over Internet downtimes, email failures, or laptop crashes, this may seem like a fanciful idea. But the resiliency of cyberspace does have some basis in the original design principles of the Internet, whose architecture was constructed to route information along the most efficient available path and to avoid disruption in the event of a natural disaster (or nuclear attack). This resiliency was demonstrated in the aftermath of Hurricane Sandy in October 2012, which devastated the U.S. eastern seaboard and caused mass power outages, including the loss of local Internet and cellphone connectivity. The network-monitoring company Renesys showed that the storm had collateral impacts on traffic as far away as Chile, Sweden, and India – but mostly in a positive sense: traffic destined for New York City that would have failed as a consequence of the storm was manually rerouted along alternative paths by savvy network engineers.
However, there are also many characteristics of cyberspace that demonstrate fragility and a lack of resiliency: Hayastan Shakarian’s accidental severing of an underground cable in Georgia, to name one. It may come as a surprise that the same type of cables that Shakarian unearthed traverse the world’s lakes and oceans, and bind cyberspace together in a very material sense. Undersea cables are one of the links that connect today’s cyberspace to the late Industrial Revolution. The first such cables were laid in the mid-nineteenth century to facilitate telegraph traffic over long distances. Early designs were prone to failure and barely allowed the clicks of a telegraph exchange to be discerned across small bodies of water like the English Channel, but over time innovations in electronics and protective cable sheathings allowed the undersea cable industry to flourish. (This growth led to a dramatic increase in international telephone calls, and created a new market for the sap of gutta-percha trees, which was used to coat and protect the cables until the mid-twentieth century.) Although international telecommunications have been supplemented with microwave and satellite transmissions, a surprisingly large volume of data still traverses the world through cables crossing the Atlantic and Pacific oceans, and major bodies of water like the Mediterranean Sea.
Due to the staggering costs involved, companies often share the same undersea cable trenches, and sometimes competing companies even share the same protective sheathing. This makes those trenches highly vulnerable to major disruption. In a May 2012 article published on the website Gizmodo, provocatively titled “How to Destroy the Internet,” the author details the physical elements of the Internet that could be easily targeted. He provides a link to a document alphabetically listing every single cable in the world and its landing stations. While there are hundreds of cables, the total is not astronomical – probably far fewer than most people would expect for a network as vast as the global Internet. Among them is ACS Alaska-Oregon Network (AKORN), with its landing points in Anchorage, Homer, and Nikiski, Alaska, and Florence, Oregon; the Gulf Bridge International Cable System, with its landing points in Qatar, Iraq, Bahrain, Saudi Arabia, Oman, Iran, the United Arab Emirates, Kuwait, and India; and, at the end of the long list, Yellow/Atlantic Crossing-2 (AC-2), which connects New York City to Bude in Cornwall, U.K. The author goes on to explain how many of the cables’ onshore landing stations sit exposed, sometimes “lying out on the sand like an abandoned boogie board,” and how the cables could be severed with a few swings of an axe. Severing cables in this way at landing stations in only a few select locations – Singapore, Egypt, Tokyo, Hong Kong, South Florida, Marseilles, Mumbai, and others – could wreak havoc on most of the world’s Internet traffic.
The 2006 Hengchun earthquake, off the coast of Taiwan, affected Internet access throughout Asia, and in 2008 two major cable systems were severed in the Mediterranean Sea. The cause of the 2008 cuts remains unknown; some experts speculated that a ship’s dragging anchor did the job, but a review of video surveillance taken of the harbour during the outage period showed no ship traffic in the area of the severed cable. Others suggested it could have been a minor earthquake, causing a shift in the ocean floor, but seismic data didn’t support this conjecture. Whatever the cause, such cuts to cables are fairly routine: even in their trenches, undersea cables are pushed to and fro by currents and constantly rub against a rough seafloor. In the case of the 2008 Mediterranean incident, the damage was severe: there were disruptions to 70 percent of Internet traffic in Egypt and 60 percent in India, and outages in Afghanistan, Bahrain, Bangladesh, Kuwait, the Maldives, Pakistan, Qatar, Saudi Arabia, and the United Arab Emirates. Nearly 2 million users were left without Internet access in the U.A.E. alone. Connections were not restored until a French submarine located the severed cables and brought them to the surface for repair.
Prior to the introduction of fibre optics, undersea cables were occasionally wiretapped by attaching instruments that collect the radio-frequency emissions leaking outside the cables. During the Cold War, both the United States and the Soviet Union built special-purpose submarines that would descend on cables deep in the ocean and attach inductive coils to collect these emissions. In his book Body of Secrets, historian James Bamford describes in detail Operation Ivy Bells, an early 1970s mission in which the NSA deployed submarines in the Sea of Okhotsk to tap a cable connecting the Soviet Pacific Naval Fleet base in Petropavlovsk to its headquarters in Vladivostok. Specially trained divers from the USS Halibut left the submarine in frigid waters at a depth of 120 metres and wrapped a tapping coil around the undersea cables at signal repeater points, where the emissions would be strongest. Tapes containing the recordings were delivered to NSA headquarters, and were found by analysts to contain extraordinarily valuable information on the Soviet Pacific Fleet. Several other submarines were later built for such missions, and deployed around the Soviet Union’s littoral coastline and next to important military bases. When fibre-optic technology (which does not leak radio-frequency emissions outside the cable) was gradually introduced, the utility of such risky operations diminished. However, some intelligence observers speculate that U.S. and other signals intelligence agencies have capabilities to tap undersea fibre-optic cables by cutting into them and collecting information through specially designed splitters.
• • •
Like undersea cables, satellites illustrate the fragile nature of cyberspace. In 2009, a defunct and wayward Russian satellite collided with a functioning Iridium communications satellite in low Earth orbit at a speed of over 40,000 kilometres per hour. The collision caused a massive cloud of space debris that still presents a major hazard. NASA’s Earth observation unit tracks as many as 8,000 pieces of space debris ten centimetres or larger that pose risks to operational satellites. (There are many smaller objects that present a hazard as well.) The Kessler Syndrome, put forward by NASA scientist Donald Kessler in 1978, theorizes that there will come a time when such debris clouds will make near-Earth orbital space unusable. Although undersea fibre-optic cables provide the bulk of transit for global communications, they cannot sustain the entire load. A scenario such as the Kessler Syndrome, were it to come true, would end global cyberspace as we know it. Scientists have very few realistic solutions for cleaning up space debris.
Space is also an arena within which state intelligence agencies exercise power over the Internet. Although the Apollo missions were publicly justified on the basis of advancing human curiosity and science, the first missions into space actually had specific military and intelligence purposes. Since the 1960s, the superpowers have been developing globe-spanning satellites that are used for optical, infrared, thermal, and radar reconnaissance purposes. The Americans built a fleet of specially designed satellites whose purpose is to collect signals intelligence (sigint). Some sigint satellites operate in geostationary orbit 36,000 kilometres from the Earth’s surface, and are used to zero in on radio frequencies of everything from microwave telephone signals to pagers and walkie-talkies. Such geostationary sigint satellites deploy huge parabolic antennas that unfold in space once the satellite is in position, with the signals being sent to NSA listening stations located in allied countries like Australia (Pine Gap) and Germany (Bad Aibling). Because the satellites hold a fixed position high above the Earth, and radio signals travel in a straight line, radio frequencies can be collected efficiently and with little degradation. (Other sigint satellites take unusual orbits and can reportedly hover over regions of interest for longer periods and at lower altitudes.) The NSA also operates sigint collection facilities at ground stations whose mission is to collect transmissions from civilian communications satellites. Typically, these enormous interception terminals, which look like giant angled birdbaths, are located in secure areas close enough to terrestrial transmission points to function properly. For example, one of the key signals intelligence stations in Canada is at Canadian Forces Station Leitrim, just south of Ottawa, strategically positioned to intercept diplomatic communications moving in and out of the nation’s capital.
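Incidentally, the 36,000-kilometre figure is not arbitrary: a satellite appears fixed over one spot on the equator only if its orbital period matches one Earth rotation, and Kepler’s third law then pins down the altitude. A quick back-of-the-envelope computation (here in Python, using standard physical constants) recovers it:

    import math

    GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
    T = 86_164               # one sidereal day (one Earth rotation), seconds
    EARTH_RADIUS = 6.371e6   # mean Earth radius, metres

    # Kepler's third law for a circular orbit: a^3 = GM * T^2 / (4 * pi^2)
    a = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
    altitude_km = (a - EARTH_RADIUS) / 1000
    print(f"geostationary altitude ~ {altitude_km:,.0f} km")  # ~ 35,800 km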
Signals intelligence gathering is highly secretive, but it is a world we should all get to know better. Originally, the objects of sigint operations were other states’ military and intelligence agencies: ballistic missile-test telemetry, or operational instructions sent by high-ranking Politburo members. As the Cold War came to a close, however, this bipolar conflict atomized into a multitude of national security threats, some of which emanate from transnational terrorist groups and organized crime, and the scope of sigint operations became much broader and more widely dispersed across global civil society. As the volume of data flowing through global networks explodes in all directions, and as the tools to undertake signals intelligence become more refined, cheaper, and easier to use, the application to cyberspace is obvious.
• • •
Although cyberspace is often experienced as an ethereal world separate from physical reality, it is supported by a very real infrastructure, a tangible network of code, applications, wires, and radio waves. Behind every tweet, chat message, or Facebook update, there is also a complex labyrinth of machinery, cables and pipes buried in trenches deep beneath the ocean, and thousands of orbiting satellites, some the size of school buses. In addition to being complex and fragile, this physical infrastructure contains a growing number of filters and chokepoints. Pulling back its layers is like pulling back curtains into dark hallways and hidden recesses, which, it turns out, are also objects of intense political contests.
There is another component of cyberspace, separate from its physical infrastructure, that is also growing by leaps and bounds and becoming a critical part of the domain: the data. Information related to each and every one of us (and everything we do) is taking on a life of its own. It, too, has become an object of geopolitical struggle. Every call we make, every text and email we send, increasingly everything we do as we go about our daily lives, is recorded as a data point, a piece of information in the ever-expanding world of “Big Data” that is insinuating itself deeper and deeper into our lives and the communications environment in which we live.
3.
Big Data: They Reap What We Sow
From August 31, 2009, to February 28, 2010, German citizen Malte Spitz had virtually every moment of his life tracked – every step he took, where he slept and shopped, flights and train trips he booked, every person he communicated with, every Internet connection he made. All of his movements and communications were cross-checked against open-source information that could be found about him, including his Twitter, blog, and website entries. The surveillance net around him was total, and all of this information was dutifully archived. In short, someone, somewhere, knew Malte Spitz better than he knew himself.
Who was behind it? Was it the Bundesamt für Verfassungsschutz, Germany’s formidable domestic intelligence agency responsible for monitoring threats to the German state? Did they plant a bug on him? Tap his phone lines? What did Spitz do to warrant such attention? Was he a criminal? A terrorist? A long-lost member of the 1970s-era Baader-Meinhof gang? None of the above.
Malte Spitz is a Green Party politician with a clean record. Deutsche Telekom, Germany’s largest cellphone company, collected the data on him through his mobile phone, but it was Spitz himself (along with Germany’s leading newspaper, Die Zeit) who collated it on an interactive map. He did so to demonstrate to the public the volume of data mobile carriers routinely collect about their users. Spitz asked Deutsche Telekom to send him all of the information they had on him. After several persistent appeals and the threat of a lawsuit, the company finally complied, sending Spitz a CD containing 35,830 lines of data. “Seen individually, the pieces of data are mostly inconsequential and harmless,” wrote Die Zeit, “[but] taken together, they provide what investigators call a profile – a clear picture of a person’s habits and preferences, and indeed, of his or her life.”
• • •
On a daily basis, most of us experience a dynamic and interactive communications ecosystem that only two decades ago was the stuff of science fiction. And today, after perhaps a decade of near total immersion, it is almost impossible for most people in the West to imagine going back to a world before instant access and 24/7 connectivity, a bustling tableau of images, text, and sounds always at our fingertips. As with any such wholesale social change, we should expect unintended consequences, not all of them desirable. Past experiences with the printing press, telegraph, radio, and television tell us that new media environments shape and constrain the realm of the possible, favouring some social forces and ideas over others. The world of “big data” is no exception.
Computer engineers understand big data as data sets that grow so large that they become awkward to work with or analyze using standard database management tools. I like to think of big data in metaphorical terms: as endless digital grains of sand on an ever-expanding beach, produced as we act in cyberspace. Big data comes from everywhere: from space satellites used to gather climate information to lunchtime jokes on social media sites; from digital pictures and videos posted online to transaction records from grocery stores; from signals emitted by our mobile phones to information buried in the packet headers of our emails. Every day, 2.5 quintillion bytes of data are created, and 90 percent of the data in the world today was created in the past two years. According to Dave Turek, IBM’s VP of exascale computing, from the beginning of recorded time up until 2003, humans created five “exabytes” of information (an exabyte = 1,000,000,000 gigabytes). In 2011, Turek estimates, we produced that same amount of information every two days. IBM predicts that in 2013 we will be producing five exabytes every ten minutes. And it only grows. To take just one example: the Square Kilometer Array (SKA) telescope complex, currently under development by a consortium of countries and set to deploy in Australia in 2024, will produce one exabyte of data every day, roughly twice the volume of daily global Internet traffic in 2012.
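Those figures are easier to grasp with the units lined up. The back-of-the-envelope check below (a Python sketch; the constants are simply the estimates quoted above, not independent measurements) restates the growth rates in exabytes per day:

    EXABYTE = 10**18                 # bytes; equals 1,000,000,000 gigabytes
    print(EXABYTE // 10**9)          # -> 1000000000 gigabytes per exabyte

    daily_2012 = 2.5e18              # "2.5 quintillion bytes" created per day
    print(daily_2012 / EXABYTE)      # -> 2.5 exabytes per day

    pre_2003 = 5 * EXABYTE           # everything created up to 2003 (Turek)
    per_day_2011 = pre_2003 / 2      # the same amount every two days in 2011
    print(per_day_2011 / EXABYTE)    # -> 2.5 exabytes per day

    per_day_2013 = pre_2003 * (24 * 60 // 10)  # five exabytes every ten minutes
    print(per_day_2013 / EXABYTE)    # -> 720.0 exabytes per day

On these numbers, the projected 2013 rate is nearly three hundred times the 2011 rate, which is what makes the “every ten minutes” projection so striking.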