Tubes: A Journey to the Center of the Internet

by Andrew Blum


  At Equinix, Troyer’s job was connecting—in a social way—with people (like his former self) who manage large Internet networks and are always looking for more places to bring them, without relying on any new middlemen. It suited him, as something of an extrovert in a crowd of introverts. He’d be right at home selling television time or mutual funds, or something else similarly abstract and expensive. But occasionally he’ll switch modes from jocular salesman to net-geek, offering a soliloquy of technical protocols and operating specifications, his jaw clenching with the effort of precision. Even the most social of networking guys knows the geek that lies within. No doubt he isn’t alone at Equinix’s Silicon Valley offices, which he commutes to from his home in San Francisco. For a period of time after leaving the company, Adelson commuted from New York to his job at Digg, sharing a crash pad with Troyer in the Mission District. The Internet is a small world.

  And—or so it seemed that morning in Ashburn—a secure one. Getting inside Ashburn required an elaborate identification process. Morgan, the director of operations, had previously registered a “ticket” for my visit in his system, which the guards behind bulletproof glass then checked against my driver’s license. Morgan then punched a code into a keypad beside a metal door and placed his hand on a biometric scanner, which looked like an air dryer in an airport bathroom. The scanner confirmed his hand belonged to him, and the electronic lock clicked open. The three of us shuffled into an elevator-sized vestibule—affectionately called the “man-trap”—and the door locked behind us. This was a favorite feature of Adelson’s, dating back to the initial Equinix vision. “If I’m going to close that deal with a Japanese telco, and I need to impress them, I need to be able to take a tour group of twenty people through this building,” was how he explained it. And it better look “cyberrific,” to use Adelson’s favorite term. The man-trap wasn’t only to control access into and out of the building but also (it seemed) to induce a moment of frisson. For a few long seconds the three of us looked up at the surveillance camera mounted in a high corner, flashing tight-lipped smiles to the hidden guards. I took a moment to admire the walls, hung with illuminated wavy blue glass panels made by an artist in Australia. All the early Equinix data centers have the same ones. Then, after that long, dramatic pause, the airlock doors opened with a hearty click and a whoosh, and we were released into the inner lobby.

  It too was cyberrific. Its high ceiling was painted black, like a theater, and disappeared in darkness. Spotlights left small pools of illumination on the floor. “It’s a bit like Vegas,” Troyer said, “no day or night.” Inside there was a kitchen, snack machines, a bank of arcade-style video games, and a long counter, like at an airport business center, with power and Ethernet plugs where engineers could set up shop for the day. The stools were nearly all occupied. Many customers ship their equipment ahead and pay for Equinix’s “smart hands” service to have it “racked and stacked,” as they say. But then there are those guys affectionately known as “server huggers,” who either by choice or necessity spend their days here. “They’re locals, like Norm in Cheers, pulling up his bar stool,” Troyer said, nodding in the direction of a large guy in jeans and a black T-shirt, hunched over his laptop. “But this is not a resort destination.” Beside the kitchen area was a glass-enclosed conference room with Aeron chairs and red speakerphone buttons embedded in the table. A group of half a dozen men and women in business suits had spread out files and laptops on the boardroom table, hard at work auditing the building’s systems for a customer, likely a bank. Beside the conference room was a curved, fire-engine red wall. They call it “the silo,” a name that evokes ICBMs more than grain. It was Equinix’s trademark architectural feature—its golden arches.

  Adelson loved that idea: that an engineer responsible for a global network would feel at home in Equinix facilities everywhere. There are about one hundred Equinix locations around the world and all of them carefully adhere to brand standards, the better to be easily navigable by those nomads in endless global pursuit of their bits. Ostensibly, Equinix rents space to house machines, not people; but Adelson’s strikingly humanist insight was that the people still matter more. An Equinix building is designed for machines, but the customer is a person, and a particular kind of person at that. Accordingly, an Equinix data center is designed to look the way a data center should look, only more so: like something out of The Matrix. “If you brought a sophisticated customer into the data center and they saw how clean and pretty the place looked—and slick and cyberrific and awesome—it closed deals,” said Adelson.

  Troyer, Morgan, and I passed through a gate in a steel mesh wall, and it was like stepping into a machine, all rush and whir. Data centers are kept cold to compensate for the incredible heat emitted by the equipment that fills them. And they’re noisy, as the sound of the fans used to push around the cold air combines into a single deafening roar, as loud as a rushing highway. We faced a long aisle lined with darkened steel mesh cages, each with a handprint scanner by the door—similar to the PAIX, but far more theatrical. The blue spotlights created a repeating pattern of soft glowing orbs. Everyone at Equinix confesses to aiming for some visual drama, but they’re also quick to point out that the design has a functional purpose as well: the mesh cages allow air to circulate more freely than in small enclosed rooms, while the dim spot lighting assures a level of privacy—preventing competitors from getting a close look at what equipment you’re running.

  The Equinix buildings in Ashburn (and everywhere really, but especially here) aren’t filled with rows and rows of servers, themselves filled with enormous hard drives storing web pages and videos. They’re mostly occupied by networking equipment: machines in the exclusive business of negotiating with other machines. A company like Facebook, eBay, or a large bank will have its own big storage data center—perhaps renting space next door, inside those DuPont Fabros aircraft carriers, or in a building all its own hundreds of miles away, where electric power is cheap, and there’s enough fiber in the ground to keep the company connected. Then a company will “tether in” here, running a fiber-optic connection to this distribution depot, and spray its data out from a single cage. (This is exactly what Facebook does—in multiple Equinix facilities around the world, including Palo Alto.) The heavy-duty storage happens in the boonies, in the warehouse, while the wheeling and dealing—the actual exchange of bits—happens here, in the Internet’s version of a city, deep in our version of the suburbs, where hundreds of networks have their offices (or cages) cheek by jowl.

  I could see the physical embodiment of all those connections above us, where rivers of cable obscured the ceiling. When two customers want to connect to each other, they’ll request a “cross-connect,” and an Equinix technician will climb a ladder and unspool a yellow fiber-optic cable from one cage to the other. With that connection in place, the two networks will have eliminated a “hop” between them, making the passage of data cheaper and more efficient. For the Equinix technicians, laying cables is something of an art form, with each individual type placed at a certain layer, like a data center mille-feuille. Closest to our heads were yellow plastic raceways, the size and shape of rain gutters, typically made by a company named ADC. They come in an erector-set system of straightaways, “downspouts,” and connectors, sold at varying widths, depending on how many cables you need to run. The “4 x 6 System,” for example, can hold up to 1,200 of the 3 mm yellow patch cables, while the “4 x 12 System” can manage 2,400 of them. Equinix buys the raceways in such quantities that the company occasionally requests custom colors—clear plastic, or red, in contrast with the stock yellow. The oldest cables are on the bottom of the pile. “It’s almost like an ice core,” Troyer said. “As you dig down you’re going to see sediment from certain time periods.” Given the monthly fee charged for each “cross-connect,” this is the bread and butter of Equinix’s business. The bean counters see each one as monthly recurring revenue. The network engineers see vectors. The data center techs see the sore back they’ll get reaching up to the ladder racks to run the cables. But in the most tangible way possible, these cables are the Inter in Internet: the space in between.
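
  The economics of a cross-connect are easy to see in miniature. The sketch below is a toy model, not anything Equinix runs: a handful of made-up networks form a graph, and adding a direct edge between two of them removes the intermediate hop through a transit carrier. The network names and the hop-counting helper are invented for illustration.

```python
# Illustrative only: a toy graph of made-up networks showing why a
# cross-connect matters. An edge means two networks exchange traffic directly.
from collections import deque

def hop_count(links, src, dst):
    """Breadth-first search for the fewest hops between two networks."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, hops = queue.popleft()
        if node == dst:
            return hops
        for neighbor in links.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, hops + 1))
    return None  # no path at all

# Hypothetical networks: two customers reach each other only via a transit carrier.
links = {
    "content_network": ["transit_carrier"],
    "eyeball_network": ["transit_carrier"],
    "transit_carrier": ["content_network", "eyeball_network"],
}
print(hop_count(links, "content_network", "eyeball_network"))  # 2 hops, via the carrier

# A technician unspools a yellow patch cable between the two cages: a direct edge.
links["content_network"].append("eyeball_network")
links["eyeball_network"].append("content_network")
print(hop_count(links, "content_network", "eyeball_network"))  # 1 hop, carrier bypassed
```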

  One layer nearer the ceiling, above the yellow fiber, is the “whalebone,” a more open style of cable organizer that indeed looks like the rib cage of a large seagoing mammal. It holds the copper data cables that are physically thicker, stronger, and cheaper than the yellow fiber-optic ones but can carry much less data. Above that is stainless-steel racking for AC power; then a thick black metal frame for DC power; then thick green electrical grounding cables, each layer visible above the other like branches in a forest. Finally, up top near the black ceiling is the “innerduct”: ridged plastic tubing through which run the thick fiber-optic ribbons operated by the telecom carriers themselves. This was where Verizon, Level 3, or Sprint would have its cables. Unlike the yellow patch cords that each contain one strand of fiber, the innerduct might have as many as 864 fibers, tightly bonded together to save space. This is Ashburn’s spinal cord—the stuff that Adelson fought to bring into the building in the first place—and it accordingly occupies the safest place, out of harm’s way near the ceiling. “That stuff’s important for us,” Troyer said. “Bundled fiber coming in is what gives us our value.”

  We followed the path of the innerduct to the center of the building, an area known as “Carrier Row.” Concentrating the big guys in the middle is practical: it limits the likelihood of having to make a six-hundred-foot run from one corner of the building to the other. But it’s also symbolic: these are the popular kids standing in the center of the party, with everybody on the edge craning to see.

  We came to a cage with its lights on. Troyer is professionally discreet about what companies have equipment here, but he happily talked through the anatomy of a typical installation. In the near corner of the parking-spot-sized space was the “DMARC,” short for “demarcation point,” an old telecom term to describe the place where the phone company–owned equipment ends and the customer’s begins. It worked the same way here. The heavy-duty metal and plastic rack, the size of a circuit breaker panel in a large house, was the physical switchboard where Equinix handed off communications cables to the customers. It was the industrial-sized equivalent of a telephone jack: a passive, or “dumb,” device, a solid object whose job was to keep the cables neat and organized and make it easy to plug them in. From the DMARC, the cables then ran through overhead trays to the main bank of racks.

  Data center racks are always 19 inches wide—a dimension so standardized that it’s a unit in and of itself: a “rack unit” or “RU” measures 19 inches wide by 1.75 inches tall. The heart of the operation here was a pair of Juniper T640 routers, clothes-dryer-sized machines designed to send massive quantities of data packets off toward their destinations. These two were likely set up so that if one failed, the other would immediately step in to pick up the slack. Troyer counted the 10-gigabit ports on one of them, each with a blinking green light and a yellow cable sprouting from it. There were seventeen of them. Working together, they could move a maximum of 170 gigabits per second of data—the kind of traffic a regional cable company like Cablevision might put up, serving the aggregate needs of its three million customers. Serious computational power was required to make the innumerable logical decisions to send so many pieces of data out the correct door, having checked them against an internal list of possibilities. That power in turn generated serious heat, which in turn required hair-flattening fan levels to keep the machine from cooking itself. The machines roared with the effort. We all squinted our eyes as the hot air blew at us through the mesh cage wall.
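
  The arithmetic behind that port count is simple enough to restate. The snippet below only reworks the figures already mentioned, seventeen 10-gigabit ports and roughly three million subscribers; the per-subscriber share is a naive ceiling that ignores overbooking and protocol overhead.

```python
# Back-of-the-envelope arithmetic for the router Troyer counted (illustrative figures).
PORTS = 17                # 10-gigabit ports observed on one router
PORT_SPEED_GBPS = 10      # nominal capacity of each port

max_throughput_gbps = PORTS * PORT_SPEED_GBPS
print(f"Ceiling: {max_throughput_gbps} Gbps")  # 170 Gbps

# Rough sense of scale: assume the three million subscribers of the Cablevision
# comparison all share that ceiling at once (ignores overbooking and overhead).
SUBSCRIBERS = 3_000_000
per_subscriber_mbps = max_throughput_gbps * 1_000 / SUBSCRIBERS
print(f"~{per_subscriber_mbps:.3f} Mbps per subscriber if everyone were active")
```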

  Beside the big routers was a rack holding a couple of single-RU servers. These were too small to be doing any serious work actually “serving” web pages or videos. Most likely they were merely running software to monitor this network’s traffic—like robotic techs in lab coats, scratching notes on a clipboard. Below those servers was some “out of band” equipment, meaning it was connected to the rest of the world via an entirely separate path from the main routers—perhaps even by an old telephone modem, or occasionally through a mobile data connection, like a cell phone, or both. This was the fail-safe. If something went horribly wrong with the Internet (or more likely just this piece of it), this network’s minders could telephone into the big Junipers to fix it, or at least try. You never want to rely on your own broken lines. They always had another option, though: the bulky power strip on the floor, which sprouted not only big electrical cords but also an Ethernet cable connecting it back into the network. Just as you might unplug your connection at home and plug it back in to get it working, this power strip did the same thing, remotely. An engineer could turn the power on and off at a distance—never a bad troubleshooting tip, even for these half-million-dollar boxes. But it didn’t always work. Sometimes the techs had to show up in the flesh to yank the plug.
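
  The escalation path described here can be sketched as a simple decision sequence. The code below is purely conceptual, with every function a hypothetical placeholder rather than a real management tool: try the normal in-band path first, fall back to the out-of-band modem or cellular path, then to a remote power cycle through the networked power strip, and finally send a human.

```python
# A conceptual sketch of the escalation path described above. Every function
# is a hypothetical placeholder, not a real management tool or API.

def reachable_in_band() -> bool:
    """Pretend check: can we reach the router over its normal network path?"""
    return False  # simulate the outage that put a tech on the floor

def reachable_out_of_band() -> bool:
    """Pretend check: does the separate modem or cellular path answer?"""
    return True

def power_cycle_via_smart_strip() -> bool:
    """Pretend remote power cycle through the networked power strip."""
    print("Toggling the outlet off and on...")
    return True

def recover_router() -> str:
    if reachable_in_band():
        return "log in over the normal path and fix it"
    if reachable_out_of_band():
        return "dial in over the out-of-band path and fix it"
    if power_cycle_via_smart_strip():
        return "wait for the router to reboot and check again"
    return "send someone to yank the plug in person"

print(recover_router())
```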

  Equinix Ashburn sees upward of twelve hundred visitors a week, but I wouldn’t have guessed that walking around. The size of the place, the twenty-four-hour clock, and the nocturnal predilections of network engineers made it feel empty. As we walked the long aisles, we’d occasionally see a guy sitting cross-legged on the floor, his open laptop plugged into one of the giant machines—or perhaps sitting in a broken task chair, its backrest askew. It was uncomfortably noisy, the air was cold and dry, the perpetual half-light disorienting. And if a guy is on the floor in the first place, it’s likely because something has gone wrong—a route is “flapping,” a networking card has “fried,” or some other mishap has befallen his network. He is mentally wrestling with the complicated equipment and physically uncomfortable. As we passed one tired-looking guy sitting in a small pool of light, like a troll, Troyer shook his head in sympathy. “Another poor schlep sitting on the floor.” He yelled down the aisle: “I feel your pain!”

  We passed the narrow room that stores the batteries that instantly provide backup power if the utility lines should fail. They were stacked up to the ceiling on both sides like drawers in a morgue. And we passed the generator rooms that take over from the batteries within seconds. Inside were six enormous yellow dynamos, each the size of a short school bus, each capable of generating two megawatts of power (together providing the ten megawatts this one building needs at full capacity, with two extra for good luck). And we passed the 600-ton chillers used to keep the place cool: giant steel insects of looping pipes, each pipe the diameter of a large pizza. For all the high-tech machines and uncountable quantities of bits, Equinix’s first priority is to keep the power flowing and the temperature down: it’s the company’s machines that keep the other machines going.
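
  The generator-room numbers work out neatly, and the short calculation below simply restates them: six two-megawatt dynamos against a ten-megawatt building load, leaving the two extra megawatts mentioned above.

```python
# Generator-room arithmetic, using the figures given in the text.
GENERATORS = 6
MW_PER_GENERATOR = 2
BUILDING_LOAD_MW = 10  # what this one building needs at full capacity

total_capacity_mw = GENERATORS * MW_PER_GENERATOR
headroom_mw = total_capacity_mw - BUILDING_LOAD_MW
print(f"Backup capacity: {total_capacity_mw} MW")    # 12 MW
print(f"Headroom over full load: {headroom_mw} MW")  # the two extra megawatts
```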

  Most of what I’d seen so far could have been in a data center anywhere. The equipment had arrived in crates with wooden skids, in cardboard boxes with CISCO stamped on the side, or on the back of a massive tractor-trailer, WIDE LOAD emblazoned on its bumper. But finally we came to a room where that wasn’t true, and which I was excited to see above all else. Inside, the expanse of the planet—and the particularity of this place on it—was more explicit. The plastic name plaque on the door said FIBER VAULT 1. Morgan unlocked it with a key (no hand scanner here) and flipped on the lights. The small space was quiet and hot. It had white walls and linoleum floors, marked with a few scratches of Virginia clay. In the middle of the room was a wide-gapped steel frame, like three ladders fitted together side by side. Heavy-duty plastic tubes stuck out of the floor and rose to waist height, a half dozen on both sides of the rack, open on the top, each wide enough to fit your whole arm in. Some of these tubes were empty. Others erupted with a thick black cable, perhaps a fifth as wide as the tube. Each cable was marked with the telecommunications carrier that owned it, or used to, before it was bought or went bust: Verizon, MFN, CenturyLink. The cables were attached to the wide frame in neat coils, and then looped up to the ceiling where each reached the highest level of ladder racking—the carrier innerduct. This was where the Internet met the earth.

  There are different kinds of connection. There are the connections between people, the million kinds of love. There are the connections between computers, expressed in algorithms and protocols. But this was the Internet’s connection to the earth, the seam between the global brain and the geologic crust. What thrilled me about this room was how legible it made that idea. We are always somewhere on the planet, but we rarely feel that location in a profound way. That’s why we climb mountains or walk across bridges: for the temporary surety of being at a specific place on the map. But this place happened to be hidden. You could hardly capture it in a photograph, unless you like pictures of closets. Yet among the landscapes of the Internet, it was the confluence of mighty rivers, the entrance to a grand harbor. But there was no lighthouse or marker. It was all underground, still and dark—although made of light.

 
