
Tubes: A Journey to the Center of the Internet


by Andrew Blum


  Across the Thames was the white moonscape roof of the Millennium Dome, sitting precisely on the prime meridian—like a cosmic affirmation of its importance. The East India station itself is a hundred yards short of the Eastern Hemisphere. The big Internet buildings line up along an empty plaza, looking like a showroom of giant chef’s stoves, each steelier than the next. They have no signs, which is a shame given that their occupants’ names are something Ballard might have invented: Global Crossing, Global Switch, Telehouse. I saw no pedestrians and little traffic, only the occasional white van marked with a telecom company logo, or a red double-decker bus idling at its terminus. The streets themselves take their names from the spices that filled the East India Company docks that once stood here: Nutmeg Lane, Rosemary Drive, Coriander Avenue. But the only tangible remnant of that past was a leftover piece of the brick wall that surrounded the docks. Beside a man-made pond overhung with weeping willows a bronze statue of two angelic figures struck a vacantly hopeful note—a Winged Victory of the data center.

  The neighborhood was born as a network hub in 1990, when a consortium of Japanese banks opened Telehouse, a steely slab tower specially designed for their mainframe computers. An aerial photograph from the time shows it sprouting from a wasteland, with only the Financial Times Print Works next door (since turned into an Internet building). The banks came partly because the greater Docklands’ status as an enterprise zone brought significant financial incentives, meant to spur its redevelopment following the departure of shipping to the deepwater ports down the Thames. But they really came for a more familiar reason: the site sat on top of London’s main communications trunk routes, the fiber running like an underground river beneath the A13 motorway. What was true in New York was true here as well: people go to where things are. And Telehouse was just getting going.

  Almost as soon as the building was completed, the City was rocked by a series of terrorist bombings at the hands of the Irish Republican Army, leading the banks to scramble to install backup desks for “disaster recovery.” Telehouse soon filled with empty trading halls, each desk a mirror to one in the City. Those first robust pieces of telecom infrastructure seeded the building for what came next: first, the deregulation of the British telecommunications system. And then the Internet. Being outside the influence of British Telecom made Telehouse the ideal place for the new competitive telephone companies to physically connect their networks. All those phone lines attracted one of the first British Internet providers, Pipex, which located its “modem pool” there: a few dozen book-sized boxes, bolted to a plywood frame and each linking a single telephone line into a shared data connection. Pipex was taking advantage of the building’s telecommunications infrastructure coming and going, funneling the local phone lines into the international data links—which meant, at the time, a transatlantic circuit back to MAE-East. From that, the familiar recombinant growth of the physical Internet kicked in. The informal decisions of a handful of network engineers to build on its infrastructure had a profound impact on the future shape of the Internet.

  Telehouse’s position as the center got its semiofficial sanction in 1994, when the London Internet Exchange, or LINX, was established there, using a hub donated by Pipex, and installed next to its existing modem pool. At the time, a network could join the exchange only if it had “out-of-country” connectivity—which in practice meant its own link to the United States. The rule was famously snooty, enough to inspire the legendary quip that the LINX was run like a “gentleman’s country club.” But it did have an important, if unintended, consequence: the larger Internet providers began using Telehouse to resell their international connections to smaller providers. If you weren’t a big enough network to cross the Atlantic yourself (and therefore be allowed to exchange traffic across the exchange), then you could at least connect to someone who did by renting a rack in Telehouse and installing your equipment. The last step was the most physical. “You could come into Telehouse and get the connection by dragging a piece of fiber across the floor,” Nigel Titley, one of the LINX’s founders, recalled to me. When, before the end of 1994, BTNet—British Telecom’s upstart Internet service—leased a two-megabit line across London and installed a router right next to Pipex’s, it was clear that Telehouse had fully arrived. When new transatlantic fiber-optic cables began going in the water a few years later, it was obvious where they would terminate. The Eastern Hemisphere had a new center of the Internet. At Telehouse it all came together, an infinite mesh of small phone companies, commodities traders, pornographers, trading platforms, and website hosts, congealed into a global brain, with almost as many neurons.

  Today everyone connected to the Internetworking community in London has a machine in Telehouse—and therefore a key. Almost every network engineer I contacted in London offered to show me inside. Telehouse had grown into a multibuilding compound surrounded by a high steel fence. A gate slid open to allow in cars, but I entered through a full-body turnstile, unlocked by a guard watching closely from inside a booth. Security was intense. In 2007, Scotland Yard broke up an al-Qaeda plot to destroy the building, from the inside. Evidence gleaned from a series of hard drives seized in a raid on Islamic radicals showed that the plotters had conducted intense surveillance of Telehouse, as well as of a complex of gas terminals on the North Sea. “Major colocation companies such as Telehouse are strategically important organizations at the heart of the internet,” Telehouse’s technical services director told the Times of London.

  I crossed a narrow parking lot to a gleaming two-story reception hall, with glass walls on three sides and large ficus plants in the corners. I was met there by Colin Silcock, a young network engineer at the London Internet Exchange, who had offered to show me one of its cores—the descendant of Pipex’s original box. We stepped into a pair of side-by-side glass tubes, each barely wide enough for a single person, with front and back doors that rotated open like a bank’s security drum, and a wobbly rubber floor that floated free of the side walls—a scale that weighed you entering and exiting the building, to make sure you weren’t leaving with any heavy (and expensive) equipment. As we stood there trapped for a long silent moment, waiting for the unseen computer to finish verifying our respective mass and identity, Silcock shot me a surprised look through the rounded glass. I had let out a burst of uncontrolled laughter, a loud snorting guffaw. I couldn’t help it: I was inside a tube!

  But as we went deeper into the building, Telehouse’s high-tech bells and whistles dropped off, and a creakier reality came into focus. Where the neighborhood outside felt utterly controlled—scrubbed, anti-Dickensian—inside Telehouse the atmosphere tended to be more wayward. The original Telehouse building, now known as Telehouse North, had in the years since been joined by two new ones, each larger and more sophisticated than the first: Telehouse East, which opened in 1999, and Telehouse West, which opened in 2010. The three read like a short history of the Internet’s architecture. The original owed a debt to the high-tech style made famous by the Pompidou Center. It had steel sun shades and a machinelike character. Inside, it was decidedly worn out, with shredded gray carpet, yellowing white walls, and huge bundles of unused copper cables flooding out of broken ceiling panels. The middle-aged building was neat and spare—inside, a study in linoleum. The youngest had a windowless façade patterned with steel panels, like pixels. It smelled of paint and spearmint, its rooms trafficked by technicians rolling dollies piled with brand-new equipment. A radiating system of catwalks and stairwells connected the buildings. It reminded me of a hospital, with heavy fire doors and archaeological layers of signage and building hardware scarring the walls. But instead of doctors and nurses, there were network technicians, nearly all men, studiously groomed with short cropped hair and goatees, looking like they might be about to leave for a nightclub, or perhaps had come straight from one. The parking lot exhibited their taste for tricked-out cars, and they carried bulky and unusual smartphones and large laptop backpacks. Nearly universally, they wore black T-shirts and zip-up hooded sweatshirts, handy for spending long hours on the hard floor of the server rooms, facing the dry exhaust blast of an enormous router.

  As if going back in time, Silcock and I entered Telehouse North via a pedestrian bridge with an exposed sheet metal ceiling and scummy windows overlooking the car park. We followed the path of a ladder rack filled with purple cables—the only splash of color in the pallid environment. The hallway was littered with the cardboard box shibboleths and “caution” sawhorses laid across broken floor tiles. A guard sat in a straight-backed chair reading a paperback spy novel. Through a door window I saw empty desks, a vestige of this building’s role as a disaster recovery space. But most rooms were filled with aisles and aisles of high racks, stuffed with the same variety of equipment I saw in Palo Alto and Ashburn, Frankfurt and Amsterdam. In the corners were huge bundles of wires spilling from the ceiling, sinewy and broad like jungle tree trunks. Much of it was decommissioned; a popular joke has it that there’s a fortune at Telehouse in copper mining. The streets outside were a rare stretch of London that felt empty, staid, and binary; but this virtual world within seemed tossed and chaotic. It was a surprisingly shoddy piece of Internet. I understood what one engineer had meant when he described Telehouse North as “the Heathrow of Internet buildings.” But a fantastically important one. The building’s status as one of the most connected buildings on the planet forgave the broken pieces of flooring. At this point, it is what it is—and almost impossible to change. It would be like complaining that the streets of London were too narrow.

  We finally arrived at the London Internet Exchange’s hotel-room-sized space, cluttered and homey, crowded with the detritus of the long hours the engineers spend in there. Blue Ethernet cables hung like neckties on a series of hooks, alongside the coats. Silcock gave me a little tour, identifying the different pieces of equipment and filling in some of the history of the exchange. It was getting to be lunchtime, I was hungry, and I almost left without noticing it. But tucked away at the end of a narrow aisle of equipment, blinking innocently away, was another of those refrigerator-sized machines: a Brocade MLX-32, from a mirror-walled building in San Jose, California. Silcock propped his laptop on a chest of tools and looked up its live traffic numbers. Moving across the switch at that instant were three hundred gigabits of data per second, out of a total of eight hundred gigabits across the London Internet Exchange as a whole. Deep in the heart of Telehouse I could hear Par Westesson’s voice in my ear, as clear as if he were on the telephone. A gig is a billion, he said. A billion bits made of light.

  6

  * * *

  The Longest Tubes

  The underwater telecommunications cable known as SAT-3 sweeps down the Atlantic coast of Africa from the western edge of Europe, linking Lisbon, Portugal, to Cape Town, South Africa, with stops along the way in Dakar, Accra, Lagos, and other West African cities. When it was completed in 2001, it became the most important link for South Africa’s five million Internet users, but a horribly insufficient one. SAT-3 was a relatively low-capacity cable with only four strands of fiber, while the biggest long-distance undersea cables might have up to sixteen. Worse, its meager capabilities were further reduced by the needs of the eight countries SAT-3 connected before arriving in Cape Town. South Africa was the bandwidth equivalent of an attic shower. The country faced a widely discussed “bandwidth crisis,” with low usage caps and exorbitantly high prices.

  This troubled Andrew Alston more than anyone. As chief technology officer of TENET, South Africa’s university research network, Alston had been a slave to SAT-3 since its completion, purchasing ever-larger amounts of bandwidth to serve the growing needs of the entire academic system. By 2009, Alston was paying nearly $6 million a year for a 250-megabit connection.

  Then a new cable, SEACOM, arrived. It ran up the eastern coast of Africa, stopping in Kenya, Madagascar, Mozambique, and Tanzania, before branching to Mumbai and through the Suez toward Marseille. Alston signed on as a charter customer with a ten-gigabit connection—forty times the bandwidth he had on SAT-3, at the same price. But the “circuit” had very specific geographic terms: it linked the cable’s landing point in the coastal village of Mtunzini, ninety miles from Durban, straight to Telehouse in London, where TENET had existing connections to more than a hundred other networks. That left Alston to complete the final link between Mtunzini and Durban, where his nearest router lived. Cabling everything up and configuring the fiber-optic equipment took forty straight hours. At the end of it, he was sitting cross-legged and a little delirious on the floor beside his equipment when a light indicated that the connection was active—the whole ten thousand miles to London. “This was probably 4:30-odd in the afternoon, and—pujah!—I could see both ends of the link,” he recalled. He tried to run a few tests but quickly maxed out his equipment; the capacity far exceeded what his computer could artificially generate. He had his finger on a gusher.
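  The arithmetic behind that jump is worth spelling out. What follows is a minimal back-of-the-envelope sketch in Python, using only the figures quoted above (roughly $6 million a year, 250 megabits on SAT-3, ten gigabits on SEACOM) and assuming the annual cost stayed about the same on both cables:

      # Rough arithmetic from the figures quoted above; the exact contract
      # terms are not given, so this only checks the stated ratios.
      annual_cost_usd = 6_000_000   # approximate yearly spend, assumed similar on both cables
      sat3_mbps = 250               # SAT-3 connection
      seacom_mbps = 10_000          # SEACOM connection (ten gigabits)

      print(seacom_mbps / sat3_mbps)        # 40.0 -> "forty times the bandwidth"
      print(annual_cost_usd / sat3_mbps)    # ~24,000 dollars per megabit per year on SAT-3
      print(annual_cost_usd / seacom_mbps)  # ~600 dollars per megabit per year on SEACOM

  In other words, the effective price per megabit fell by a factor of forty overnight.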

  He told me this story over the phone from his office in Durban, while I sat in my own office in Brooklyn. The phone line was crisp, the two hemispheres and fifteen thousand or so miles of cable between us amounting to only the slightest discernible delay. But I was aware enough of the distance to be even more shocked by the bald physicality of what he had described. We all deal constantly with the abstraction of an Internet connection that’s “fast” or “slow.” But for Alston the acceleration came with the arrival of an unfathomably long and skinny thing, a singular path across the bottom of the sea. Undersea cables are the ultimate totems of our physical connections. If the Internet is a global phenomenon, it’s because there are tubes underneath the ocean. They are the fundamental medium of the global village.
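  That “slightest discernible delay” can itself be roughed out. Here is a minimal sketch, assuming light in glass fiber travels at about two-thirds of its speed in a vacuum; real-world latency adds routing and equipment overhead on top of this floor:

      # Propagation delay over ~15,000 miles of fiber, ignoring routers and repeaters.
      distance_km = 15_000 * 1.609            # roughly 24,000 km of cable
      speed_in_fiber_km_s = 300_000 * 2 / 3   # ~200,000 km/s, light slowed by the glass

      one_way_ms = distance_km / speed_in_fiber_km_s * 1_000
      print(round(one_way_ms))        # ~121 ms one way
      print(round(one_way_ms * 2))    # ~241 ms round trip -- noticeable, but only just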

  The fiber-optic technology is fantastically complex and dependent on the latest materials and computing technology. Yet the basic principle of the cables is shockingly simple: light goes in on one shore of the ocean and comes out on the other. Undersea cables are straightforward containers for light, as a subway tunnel is for trains. At each end of the cable is a landing station, around the size of a large house, often tucked away inconspicuously in a quiet seaside neighborhood. It’s a lighthouse; its fundamental purpose is to illuminate the fiber-optic strands. To make the light travel enormous distances, thousands of volts of electricity are sent through the cable’s copper sleeve to power repeaters, each the size and roughly the shape of a bluefin tuna. One rests on the ocean floor every fifty or so miles. Inside its pressurized case is a miniature racetrack of the element erbium, which, when energized, gooses along the photons, like a waterwheel.
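  Those spacing figures imply a striking amount of hardware on the seabed. A toy calculation, using the fifty-mile spacing above and the roughly ten-thousand-mile Mtunzini-to-London span mentioned earlier (actual spacing varies by cable design):

      # Repeater count implied by one amplifier every ~50 miles of cable.
      cable_length_miles = 10_000     # approximate Mtunzini-to-London run
      repeater_spacing_miles = 50     # one erbium-doped amplifier roughly every fifty miles

      repeaters = cable_length_miles // repeater_spacing_miles
      print(repeaters)   # ~200 tuna-sized repeaters, all powered through the copper sleeve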

  It all struck me as wonderfully poetic, an ultimate enjoining of the unfathomable mysteries of the digital world with the even more unfathomable mysteries of the oceans. But with a funhouse twist: for all the expanse these cables spanned, they were skinny little buggers. There wasn’t all that much to them. The cables spanned oceans and then landed at incredibly specific points, tying up to a concrete foundation inside a manhole near the beach—a far more human-scaled construction. I imagined them like elevators to the moon, diaphanous threads disappearing to infinity. In their continental scale, they invoked The Great Gatsby’s image of an expanse “commensurate to [man’s] capacity for wonder.” Our encounters with this kind of geography typically come with more familiar images, like a ribbon of interstate, a length of train track, or a 747 parked expectantly at the airport gate. But undersea cables are invisible. They feel more like rivers than paths, containing a continuous flow of energy rather than the occasional passing conveyance. If the first step in visiting the Internet was to imagine it, then undersea cables always struck me as its most magical places. And only more so when I realized their paths were often ancient. With few exceptions, undersea cables land in or near classic port cities, places like Lisbon, Marseille, Hong Kong, Singapore, New York, Alexandria, Mumbai, Cyprus, or Mombasa. On a daily basis it may feel as if the Internet has changed our sense of the world; but undersea cables showed how that new geography was traced entirely upon the outlines of the old.

  For all that magic, my journey to see the cables began in an office park in southern New Jersey. The building was true Internetland—unmarked, shiny, near the edge of the highway, with apparently no one around except the FedEx guy. It belonged to Tata Communications, the telecom wing of the Indian industrial conglomerate, which in recent years had made a strong push to be a major competitor among the Internet’s global backbones. In 2004, Tata paid $130 million for the Tyco Global Network, which included almost forty thousand miles of fiber-optic cable spanning three continents, including major undersea links across both the Atlantic and Pacific Oceans. The system was a beast. Tyco was best known for manufacturing cables, not owning them, but as part of the corporate largesse-turned-malfeasance under CEO Dennis Kozlowski—who was convicted of grand larceny and securities fraud and sent to prison in 2005—Tyco spent more than $2 billion building a global network of its own, at an unprecedented scale. The piece of the network known as TGN-Pacific, for example, consisted of a fourteen-thousand-mile loop from Los Angeles to Japan and back to Oregon—two full crossings of the mighty Pacific. Finished in 2002, it had eight fiber pairs, double the number of its competitors. From an engineering standpoint, the Tyco Global Network—rechristened the Tata Global Network—was grand and beautiful. But financially the project was an unmitigated disaster, perfectly timed for the 2003 low point of the technology industry. As the Englishmen who dominate the undersea cable industry liked to say, the capacity they’re selling is too often “cheap as chips.”

 
