I hadn’t realized the extent of that growth until I stumbled across an online map, a homemade Google mash-up made by someone identified only as “Jan,” which indicated with colored pushpins, as if they were coffee shops, the locations of nearly a hundred data centers in the Netherlands. A green pin meant an AMS-IX location, red showed private carrier-owned buildings, and blue indicated a data center that had fallen out of use. If I zoomed out to the scale of the country as a whole, the pins blanketed the screen, all leaning in the same direction, like windmills. It struck me as a startling example of the Netherlands’s transparency: here, pulled together in one place on the open web, was the same information that WikiLeaks had deemed sensitive enough to bother leaking. And yet nobody seemed to care. The map had been there for two years already, apparently unmolested.
It also clarified a broader point that I’d been circling for months. It showed the Internet’s small-scale geography, with the data centers clustered into defined Internet neighborhoods, like the industrial parks surrounding Schiphol Airport, the “Zuidoost” just southeast of the city center, and the academic area known as Science Park Amsterdam. A similar map of the area around Ashburn or of Silicon Valley would certainly show as many individual locations. But compared with those American-scaled suburban expanses, the Netherlands was remarkably compact. It raised the possibility of a novel way of seeing the Internet, of gleaning its sense of place: a data center walking tour.
I was beginning to realize that I’d spent cumulative weeks behind electronically locked doors, engaged in long conversations with the people who made the Internet work. But these interactions had all been planned, considered, and tape-recorded. Permission had been granted by corporate higher-ups. Badges had been issued, visitor logs signed. But it often felt like I was wearing blinders. I had been so focused on the trees I had almost forgotten about the forest. I was always rushing through parking lots, diving headlong toward the “center.” In nearly every case, the format of the day limited lingering. I had little time for any idle contemplation.
In Amsterdam, I was due to meet Witteman and visit the core switch at the heart of the Amsterdam Internet Exchange—the Dutch version of what I’d seen in Frankfurt. But the data center map seemed the perfect excuse for a more open-ended way of seeing the Internet, something more like a sojourn than another guided tour. The challenge was that the Internet was a difficult place to just show up. Data centers and exchange points don’t have visitor centers, in the way of a famous dam or the Eiffel Tower. But in Amsterdam the Internet was so thick on the ground—there was so much of it—that even if I couldn’t knock on doors and expect to be invited in, I could at least stroll by a couple dozen Internet buildings in a single afternoon, a piling up that would answer the question of what the Internet looked like in a new way. Architecture expresses ideas, even when architects aren’t involved. What was the physical infrastructure of Amsterdam’s Internet saying?
There’s a wonderful essay by the artist Robert Smithson called “A Tour of the Monuments of Passaic.” In his warped mind, the industrial wastelands of Passaic, New Jersey, become as evocative as Rome, with every inch worthy of aesthetic attention. But Smithson’s insistence on playing the straight man ends up making the whole account wildly surreal. The New Jersey swamps become a place of wonder. “Outside the bus window a Howard Johnson’s Motor Lodge flew by—a symphony in orange and blue,” Smithson writes. The big industrial machines, quiet on that Saturday, are “prehistoric creatures trapped in the mud, or, better, extinct machines—mechanical dinosaurs stripped of their skin.” His point is that there is value in noticing what we normally ignore, that there can be a kind of artistry in the found landscape, and that its unconventional beauty can tell us something important about ourselves. I had the hunch this approach would work with the Internet, and with the help of that Dutch data center map I could set out to confirm it.
I enlisted a fellow wanderer—a professional Internet watcher, albeit of a more conventional stripe. Martin Brown worked for Renesys, the Internet routing table analysts, and he had recently moved to ’s-Hertogenbosch, the small Dutch city known by most as Den Bosch, or “The Forest” (indeed!), where his wife had a new job with a pharmaceutical company. A former programmer and now a full-time watcher of the Internet’s routing table, Brown was an expert in its inner workings; in particular, I’d admired his study of Cogent and Sprint’s “de-peering” event. Yet Brown said that while he’d been inside a handful of data centers and exchange points, he’d never stopped to really look at them—certainly not in this way, at least. We made plans to meet for an urban hike, a journey of eight miles or so, beginning at a subway stop a few minutes’ ride from the city center and ending substantially farther out in the near suburbs.
Our first data center was visible from the elevated train platform: a menacing concrete bunker the size of a small office building, with worn-out blue window trim, spreading out along a canal connecting to the Amstel River. The late-winter day was gray and damp, and there were houseboats tied up at the edge of the still water. My map indicated that the building belonged to Verizon, but a sign on the door said MFS—the vestigial initials of Metropolitan Fiber Systems, the company that Steve Feldman worked for and that ran MAE-East, and which Verizon had acquired years before. There was clearly no rush to keep up appearances; it seemed, rather, that its new owners preferred the building to disappear. When Brown strolled up to the front door and peered into the darkened windows of the lobby, I nearly shouted at him, like a kid about to enter a haunted house. I admit I was skittish. The security barriers, frosted glass, and surveillance cameras made the point that this was not a place that welcomed outside interest, much less any visitors, and I was eager to avoid having to explain to anyone what it was exactly we were doing (never mind what I had been doing all along), regardless of whatever map this place was on. The building said “back off.”
We doubled back across the canal toward the data center that contained one of the AMS-IX cores. It was owned by euNetworks, which, like Equinix in Ashburn, was fundamentally in the business of renting space; the building had a sign by the door, and a friendly receptionist was visible through the lobby’s glass walls. But when Brown and I walked around the back, its true starkness became clear: a block-long blank wall of gray brick at the base and corrugated steel above. Adding to the feeling of mystery was the rusting hulk of an old Citroën truck parked on the otherwise empty street, looking like a prop from Mad Max. We circled it, greedily taking pictures. (Smithson: “I took a few listless, entropic snapshots of that lustrous monument.”) It had become an evocative scene: the lonely block, the too-blank brick and steel of the data center, the line of surveillance cameras, and—above all—the knowledge of what was happening inside those walls. This wasn’t any old blank building but among the most important Internet buildings in the world. And this was getting fun.
We marched onward along a wide bike path, dodging middle schoolers pedaling home from soccer practice, and scampered across a couple wide intersections. After one false ID of an office building (too much glass to be a data center), we looped around an extra block to see an unmarked steel shed that seemed surprisingly sturdy. There were again no signs, but the map told me it belonged to Global Crossing, the big international backbone owner that has since been purchased by Level 3. Earlier in the week I’d been inside Global’s facility in Frankfurt, and I could see the family resemblance. They were built at the same time, at the direction of a single engineer barnstorming across the continent. This was a piece of the Internet of the highest order, a key node of what Global liked to call its “WHIP,” for “world’s heartiest IP network.” The corrugated steel and security cameras were the first clue, but more than that was the quality of the building’s construction. A plumbing supply warehouse might be the same scale and the same materials, and might even have a camera or two. But this was definitely trying hard not to look like anything—the architectural equivalent of a police detective’s generic sedan. Internet buildings are conspicuous in their quiet; but when you learn how to recognize them, they seem quietly conspicuous.
As we notched the miles, we learned their tells: the steel generator enclosures, the manhole covers out front forged with network names, the high-grade surveillance cameras. Trudging along was physically satisfying; we weren’t sniffing the air with our smartphones looking for wireless signals, but divining more tangible clues.
We crossed beneath an elevated highway into a neighborhood of narrow lanes set alongside another canal, where there was a group of a half-dozen data centers interspersed among car dealerships. The data centers were bigger. A corrugated steel building with a narrow band of windows running the length of its second floor belonged (according to the map) to Equinix. Compared with the tilt-up concrete shells in Ashburn, it may as well have been designed by Le Corbusier, with its ribbon windows and paneled façade. But by any reasonable standard it was utterly unremarkable, a place we would have walked right by if we hadn’t been walking straight toward it. We paused to admire it, and Brown sipped water from a canteen, as if we were on a mountain trail. A duck—green head, yellow beak, orange feet—waddled up beside us. We were chilled and weary. (Smithson: “I began to run out of film, and I was getting hungry.”) I was tired enough at this point—not merely from our walk, but from a whole jet-lagged week in the stale air of the Internet—that the reality hit me hard: I was traveling the world looking at corrugated steel buildings. I had learned what the Internet looked like, generally speaking: a self-storage warehouse. An unusually pretty one, though.
The next day I visited Witteman at the AMS-IX offices. On the wall behind his desk was a homemade mash-up of the movie poster from 300, based on the bloody comic book epic about the battle of Thermopylae. The original poster had read “Tonight we dine in hell,” and showed an enraged, bare-chested Spartan baring his teeth. Witteman’s version kept the soldier but Photoshopped the blood-dripping text to read, “We are the biggest!” I had a hunch who in this fantasy represented the Persians. While Frankfurt’s exchange projected a polished character, AMS-IX seemed to strive for a thoughtful informality, a philosophy that extended to its offices in a matched pair of historic town houses near the center of the old city. The young, international staff ate lunch together every day, cooked by a housekeeper and served family style at a table overlooking the back garden. There was a homeyness to AMS-IX that I hadn’t yet encountered in the Internet. Rather than the network being the realm of conspiracy theories and hidden infrastructure, the exchange embodied a spirit of transparency and individual responsibility. And as it turned out, that feeling extended to its physical infrastructure.
Before lunch, Witteman and I collected Hank Steenman, AMS-IX’s technology guru, from his office across the hall. The three of us climbed into the AMS-IX jalopy, a beat-up little minivan filled with old coffee cups, and headed toward the core switch, located in one of the data centers Brown and I had walked by. There was a bike rack outside and a welcoming, light-filled lobby, with framed network maps on the walls. We walked down a wide hallway lined with doorways painted bright yellow and past a room used by KPN, filled with racks painted in its signature green. AMS-IX had its own large cage in the back. The yellow fiber-optic cables were perfectly coiled and bound. The machine they plugged into looked familiar. Very familiar. It was a Brocade MLX-32—the same model used in Frankfurt. Alas, the Internet’s sense of place did not extend to the machinery. “So here’s the Internet!” Witteman teased. “Boxes like this. Yellow cable. Lots of blinking lights.”
That evening, when I got back to the Rembrandtplein, a busker was singing like Bob Dylan, and tourists and revelers gathered around. Couples sat smoking on benches. A stag party stormed by, kicking up a commotion. Amsterdam was so many things. But all I could think about was what would happen if you sliced a section through the streets and buildings: the broken walls would glow with the prickly sparkle of all those severed fiber-optic cables, another kind of red light: the rawest material of the Internet—and, even more than that, of the information age.
5
* * *
Cities of Light
Back in Austin, at NANOG, I’d met a guy named Greg Hankins with the unfortunate job title of “Solutioneer.” He mixed with the peering crowd, was quick to pay for drinks, and seemed to be a member in good standing of the traveling circus of network engineers, peering coordinators, and Internet exchange operators. He was especially close with Witteman and Orlowski. But he didn’t run a network or work for an Internet exchange. Hankins was employed by Brocade, a company that made—among just a few other things—the MLX series of routers. These were machines the size of refrigerators and the cost of trucks, utterly essential to the inner workings of the Internet. In Frankfurt and Amsterdam I’d seen Brocade’s most powerful model—the MLX-32—running full tilt. But I’d also seen it, or similar machines made by Brocade or one of its competitors, like Cisco or Force10, in nearly every other Internet building I’d been in. When I hadn’t actually spotted the big router inside a locked cage, I saw its shipping box, littering the darkened corridors of the data center like the scat of a native bear. These were the cardboard shibboleths of the physical Internet, the clearest sign that a building was on the network in a serious way. But as much as that, I liked the way that routers were the basic building blocks of the Internet. They scaled: the twenty-dollar box I bought at Radio Shack was a kind of router, and so was Leonard Kleinrock’s original IMP. They were and are the Internet’s first physical pieces.
But what did I really know about what went on inside? I’d learned about the geography of the Internet, about where it was. But I didn’t know much about what it was. At home, everything was copper: the wire coming from the backyard, the cables on my desk, the last vestigial telephone cords on the landline. But in the heart of the Internet, it was all fiber—thin glass strands filled with pulses of light. So far I’d been reassured that on the Internet there’s always a distinct physical path, whether a single yellow fiber patch cord, an ocean-spanning undersea cable, or a bundle of fibers several hundred strands thick. But whatever went on inside the router was invisible to the naked eye. What was the physical path in there? And what might that tell me about how everything else connected? What was the reductio ad absurdum of the tubes?
The Internet was a human construction, its tendrils spreading around the world. How was all that stuff shoehorned into what was out there already? Did it seep under buildings or along “telephone” poles? Did it take over old abandoned warehouses or form new urban neighborhoods? I didn’t want a PhD in electrical engineering, but I hoped what was going on inside the black box and along the yellow wires could be ever so slightly, well, illuminated. Hankins was perpetually on the road and couldn’t stop. But he had a guy in San Jose who could tell me something about the power of light.
Brocade’s headquarters was in a mirror-windowed building in the shadow of the San Jose airport, in Silicon Valley. I was met in the lobby by Par Westesson, whose job was to string together Brocade’s most powerful machines to simulate the largest Internet exchange points—and then to break them and figure out a way to make them better. “We’ll pull out a fiber or power down one of the routers while traffic is flowing through,” Westesson said. “That’s a typical day for me.” I got the impression he didn’t like it when things didn’t work. Born in Sweden, he wore a neatly pressed checked tan shirt and brown chinos, and his blue eyes were dulled from time spent under fluorescent lights and amid the dry air of the laboratory upstairs. The lab was a room the size of a convenience store, busy with technicians standing in twos and threes in front of double-screened displays, or rustling through bins of fiber-optic cables and spare parts. The blinds were drawn against the sun. Westesson invited me to treat the place like a petting zoo.
I could take apart one of these machines—with no risk of harming the live Internet. The biggest and dumbest of a router’s four basic parts is the “chassis,” the file-cabinet-like enclosure that gives the machine its grossest physical structure, like the chassis of a car. Slightly smaller and smarter is the “backplane,” which in an MLX-32 is a steel plate bigger than a pizza, etched with copper traces like a garden labyrinth. Fundamentally a router’s job is to give directions, like a security guard in an office building lobby. A bit of data comes in, shows its destination to the guard, and says, “Where do I go?” The guard then points in the direction of the correct elevator or stairway, which is the backplane: the fixed paths between the router’s entrances and exits. The third key element is the “line cards,” which make the logical decision about which way a bit should go; they’re like the security guard. Finally there are the “optical modules,” which send and receive optical signals and translate them to and from electrical ones. A line card is really just a multiposition switch—hardly different from the input selector on a stereo. An optical module is a light—a bare bulb switching on and off. What makes it miraculous is its speed.
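To make that division of labor a little more concrete, here is a minimal sketch in Python rather than router firmware, of the "security guard" decision a line card makes: given a destination address, the most specific matching route wins, and that choice names the exit the backplane will carry the bit to. The forwarding table, port names, and addresses below are invented for illustration; a real router does this lookup in specialized hardware at enormous speed, but the underlying idea, longest-prefix matching, is the same.

```python
# Toy model of a line card's forwarding decision. Illustrative only; not Brocade firmware.
# Hypothetical destination prefixes map to hypothetical exit ports; longest matching prefix wins.
import ipaddress

FORWARDING_TABLE = {
    ipaddress.ip_network("10.0.0.0/8"): "backplane-exit-1",
    ipaddress.ip_network("10.20.0.0/16"): "backplane-exit-2",
    ipaddress.ip_network("0.0.0.0/0"): "backplane-exit-uplink",  # default route
}

def choose_exit(destination: str) -> str:
    """Answer the bit's question, Where do I go?, with an exit port."""
    addr = ipaddress.ip_address(destination)
    matches = [(net, port) for net, port in FORWARDING_TABLE.items() if addr in net]
    best_net, port = max(matches, key=lambda m: m[0].prefixlen)  # most specific route wins
    return port

print(choose_exit("10.20.5.7"))   # backplane-exit-2 (the /16 beats the /8)
print(choose_exit("192.0.2.1"))   # backplane-exit-uplink (only the default route matches)
```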
“So a gig is a billion,” Westesson said nonchalantly. He held in his palm an optical module of a type known as an SFP+, for “small form-factor pluggable.” It looked like a pack of Wrigley’s gum made of steel, felt as dense as lead, and cost as much as a laptop. Inside was a laser capable of blinking on and off ten billion times per second, sending light through an optical fiber. A “bit” is the basic unit of computing, a zero or a one, a yes or no. That pack of gum could process ten billion of them per second—ten gigabits of data. It inserted into a line card like a spark plug. Then the line card slid into the chassis like a tray of cookies into the oven. When “fully populated,” an MLX-32 could hold well over a hundred optical modules. That meant it could handle one hundred times ten billion, or a thousand billion bits per second—which adds up to the unit known as a “terabit,” about what was flowing through that MLX-32 in Frankfurt on the Monday afternoon I visited.
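For anyone who wants to check that arithmetic, a back-of-the-envelope sketch follows; the module count is the round number used above, not a spec-sheet figure.

```python
# Rough capacity arithmetic for a fully populated chassis (illustrative round numbers).
modules = 100                       # optical modules, per the round figure above
bits_per_second_each = 10 * 10**9   # one SFP+ module: ten billion bits per second
total_bits_per_second = modules * bits_per_second_each
print(f"{total_bits_per_second:,} bits per second")           # 1,000,000,000,000
print(f"= {total_bits_per_second / 10**12} terabits per second")  # 1.0
```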