Electric Universe


by David Bodanis


  Our universe has something else—and this gives rise to a third possibility.

  When Turing went on his long runs, he often passed along hilly country paths, or even raced directly on the sandy beaches with which Britain, being an island, is so well endowed. Those sands and hills are composed in large part of the element silicon, as indeed is much of the surface of our planet—Mount Everest is largely made of silicon.

  Radio technicians had been irritated by silicon for a long time. Unlike almost everything else, it didn’t fit into either of the two accepted categories: it wasn’t a metal that always conducted electricity, nor was it like glass or diamond, which never do. It was different, confusing. Much of the time, a fragment of silicon in a circuit would seem to act like an ordinary, run-of-the-mill insulator. That was fine. If you led a wire into silicon, any current traveling through the wire would skid to a halt when it reached that barrier.

  What bothered radio technicians was that silicon didn’t always act like this. Sometimes a chunk of silicon that you thought would be an insulator would somehow undergo what seemed to be an unseen twisting and stretching inside, and then it would no longer be a good, dependable, happy-to-stay-at-home insulator. Instead it would now act like a conductor, carrying floods of electrons along. It was neither one nor the other. It was a “semiconductor.”

  Silicon was so fickle that when the great research division at Bell Labs began efforts to create nonmechanical switches, one of its first directives—as wise as Disney firing Jeffrey Katzenberg just before he produced Shrek—was to cancel all research on silicon. Luckily, Bell Labs was also very big, and directives in large companies are easy to sidestep. There was at least one researcher there, Russell Ohl, who’d been intrigued by silicon’s changeable nature for years. He put fragments of it in the circuits of radio receivers, which he then placed in his son’s baby carriage. Then he’d stroll around New York, getting the pleasure of measuring exactly when the silicon did or didn’t conduct electricity, while also giving his son a good airing. He was enthusiastic about the prospect that this sensitive, fickle substance could someday be useful. After Ohl’s son outgrew the carriage, the happy silicon excursions ended, but Ohl still wanted to take the research further in the lab. When Bell tried to cancel his work, he finagled a way around the company’s directives and kept a silicon group going.

  By 1946 and then 1947, what was happening inside silicon was finally becoming clear, thanks to that early work of Ohl and others. Silicon will sometimes form perfect crystal lattices, like a dizzying M. C. Escher drawing in which a three-dimensional scaffold stretches on to infinity. But perfection is hard to find on our planet. When silicon is found in the wild, or melted and then cooled down in a factory, it’s more likely that tiny cracks and gaps will appear in the perfect scaffold. A few intruder atoms such as phosphorus readily slide into those cracks in the lattice, and they bring intruder electrons with them. Nice, vulnerable extra electrons.

  If the only thing those electrons did was travel along swiftly, the silicon would simply have become another switch that was always “on.” But Ohl and the others knew enough of quantum mechanics to realize that the electrons which shared the spaces inside a silicon lattice could affect and sometimes slow each other, even over distances that on the atomic scale were huge. When just the right number of intruder electrons were put in there, these strange effects could be adjusted to suddenly make it impossible for the electrons to leap along in the way that would transmit an electric current. But with a slightly different adjustment from outside, produced by Bell researchers peering down at their lab-bench creations, poking and prodding and adding different substances or applying—ever so carefully—different force fields, the bizarre “slowdown” effects would end, and the electrons could be let free to quickly speed forward again.

  The chemistry was too hard for any single individual to handle, and Ohl was getting out of his depth. The resources of Bell Labs began to be shifted to Walter Brattain, a quiet experimentalist who’d been a rancher in Oregon, and John Bardeen, an even quieter theoretician from Wisconsin. (Bardeen was so quiet, and young-looking, that when he’d asked older students at the University of Wisconsin to play billiards for money they’d invariably taken up his offer. He was just as quietly polite when he took their money after the games—he was a superb player, and one of the best hustlers the campus had known.)

  Now, in the fourth-floor offices of an innocuous research block in Murray Hill, New Jersey, the two friends Brattain and Bardeen used insights from Ohl and from chemistry teams at Purdue; they used translated documents recounting abortive German wartime research on semiconductors; they used their knowledge of quantum mechanics and of new chemical fabrication techniques, and they never let up. In October 1947 they were getting the first signs of success, and by December 1947 they were certain. They could make electrons start flowing, and they could also make them stop. They’d built the atom-level on/off switch Turing had sought.

  It was one of the great discoveries of modern times. In all of human history, mankind’s labor has been held back by the awful force of friction. Hoes scrape and drag along the ground. The slaves who built Egypt’s pyramids spent almost all their energy in overcoming friction within their shoulder and leg muscles, or between the huge slabs they were moving and the ground underneath. Steam engines and car engines and even the fastest airplane’s jet engines also waste tremendous energy on overcoming friction. But these silicon rocks can shift electric currents through in one direction or another, and the rock itself doesn’t have to move, swiveling bodily like a big metal switch to one side or the other. That would be too slow and cumbersome. The rock can simply sit there, Buddha-like, and transform internally, allowing streams of electrons to be shunted down veins of modified ore inside it as needed.

  If Turing wanted to send an electric current through, but only when a particular decision was made, he would just have to lead that current up to one of these special ore veins. At first the current would wait there, dripping electrons uselessly, unable to cross. But let that ore vein be transformed by the subtle techniques Brattain and Bardeen had developed, and everything changes: the “tunnel” through the ore vein opens, and the signal can roar along.
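  To see why such a gated vein is all a computer-builder needs, here is a minimal sketch in Python (mine, not anything from Bell Labs or Turing) of an idealized transistor: current crosses only when a control signal has transformed the channel. Wire two of these switches in series and you get a NAND gate, and NAND gates alone are enough, in principle, to build every decision circuit in a computer.

    # Idealized switch: current crosses only when the control ("gate")
    # signal has transformed the channel into a conductor.
    def transistor(gate: bool, current_in: bool) -> bool:
        return current_in and gate

    # Two such switches in series give a NAND gate: the waiting
    # current gets all the way through only when both controls are on.
    def nand(a: bool, b: bool) -> bool:
        supply = True  # the current waiting at the vein
        through = transistor(b, transistor(a, supply))
        return not through

    # Everything else follows from NAND alone:
    def not_(a):    return nand(a, a)
    def and_(a, b): return not_(nand(a, b))
    def or_(a, b):  return nand(not_(a), not_(b))

  No lever swivels and no contact clatters; a signal at the gate decides whether the waiting current may pass.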

  A new technology was born, which meant that someone had to come up with a new name. The general label wasn’t too hard. When you’re working to control large mechanical objects, it’s natural to say you’re working on “mechanics.” Here, controlling individual electrons, the field had already received the understandable name of “electronics.” But what should the key device this technology used be called? This was a worrying moment, for engineers are often a menace when it comes to words. There were discussions, and votes, and one proposal was the not-quite-graceful “surface-state amplifier,” while another was the even harder-to-pronounce “iotatron” (from the Greek letter iota, signifying small).

  Mercifully, though, John Pierce, one of the lab’s engineers, was brought in; he dabbled in science-fiction writing and had a gift for words. He focused on the central insight. When the ore veins inside the silicon were “on,” lots of electric current could cross. When they were “off,” there was high resistance to any current crossing. This meant that the device transferred a resistance, whence his more euphonious proposal:

  “We have called it the Transistor, T-R-A-N-S-I-S-T-O-R,” Bell’s research director explained, at the press conference launching it on Wednesday, June 30, 1948. “It is composed entirely of cold, solid substances.”

  The first applications came easily. The Bell companies had long had a tradition of making devices to help the deaf—a legacy from Alec’s love for Mabel—and it was natural to use transistors to make them smaller. (This was helped by the fact that John Bardeen himself was married to a woman who was hard of hearing.) Hearing aids are a bit like phones that have to be carried around. But ordinary telephones were clunky machines, dating from the Victorian era, with big wires and switches. They needed many trillions of bouncing electrons to transmit even a dim whisper. With transistors inside, however, it was possible to rely on far smaller batteries, for fewer electrons needed to be shunted around.

  Computer designers learned what was going on, including Turing in England, as with his 1948 letter from Jack Good. In America the ebullient Grace Murray Hopper (who had cut her teeth on Harvard’s Mark I before moving to Remington Rand) took such confidence from knowing that improved microscopic switches were coming that it seems to have helped her in developing the world’s first “compiler”—an indispensable part of modern computers, which translates the programmer’s instructions into the arcane listing of switch positions inside the computer. She had followed women’s basketball for years, and often noted how a forward pass took place: the person throwing the ball imagined the spot where she expected a teammate to catch it, and began the throw even before her target was there. Hopper liked explaining in later years how she’d used that image in the logic of her earliest compilers, to send out instructions that would be waiting in advance of the actual switch-shifting the computer was undertaking.
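  Hopper’s actual compiler is long gone, but the principle she described survives in every compiler since. The following toy Python sketch (the opcodes and statement syntax are invented here for illustration, not Hopper’s) translates one arithmetic statement into a listing of machine-style switch-setting instructions, laying each one out in advance of the machine that will execute it, like the ball thrown to where the teammate will be:

    # Toy compiler: turns a programmer's statement into low-level
    # instructions. Opcodes and syntax are invented for this example.
    OPCODES = {"+": "ADD", "-": "SUB", "*": "MUL"}

    def compile_statement(stmt: str) -> list[str]:
        # Expects statements of the form "c = a + b".
        target, expr = (part.strip() for part in stmt.split("="))
        left, op, right = expr.split()
        return [
            f"LOAD  {left}",              # fetch the first operand
            f"{OPCODES[op]:<5} {right}",  # combine it with the second
            f"STORE {target}",            # file the result away
        ]

    # The listing waits, complete, before the machine takes a step:
    for instruction in compile_statement("c = a + b"):
        print(instruction)

  Run it and three instructions appear, ready before any switch inside the machine has shifted.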

  But when Hopper did her pathbreaking work on compilers in 1952—four years after the Bell press conference—she still had no workable transistors to use. The team that had been so successful in 1947 had broken up. Partly it was because Bell executives wanted solid, reliable components for their continent-wide phone switching system. Their goal was to have parts that would fail only once every twenty years, yet the first transistors failed almost every day. (An entire batch was once ruined when technicians touched it after opening a door: their hands transferred enough copper atoms from the doorknob to destroy the silicon’s ideal mix of extra atoms which could slow or energize electrons as needed.) Bardeen and Brattain were discouraged by the lack of support.

  They also suffered from being under the ostensible supervision of William Shockley at Bell, and Shockley was a man with very strange views. When he’d first met Bardeen’s wife, Jane, he’d told her that his own children were inferior to him. Jane had demurred, thinking that her hearing aid had failed. But no, Shockley explained, it was a fact, and the reason was that his own wife was genetically inferior to him as well.

  When Bardeen and Brattain built their working transistor, Shockley was beside himself. How could these men have gotten there first! Bardeen wasn’t the one who was supposed to come up with great ideas. The fact that Brattain, from the cattle ranch in Oregon—and so a cowboy! a hick!—had been part of the discovery made it even worse. Shockley tried to take credit for the work. At the 1948 press conference he hogged the microphone; when an electronics magazine came to take a photo of the great transistor discoverers, he pushed Bardeen and Brattain aside and sat himself at the lab desk they had used. Although he improved their initial ideas considerably, that wasn’t enough; he wanted everyone to believe he had, basically, done it all. Bardeen left, and then others, and with no one remaining to yell at, soon Shockley left as well.

  Although this meant no transistors were mass-produced in time to save Turing’s life, it did have a remarkable benefit. For Shockley was such a good liar that when he left Bell Labs to create his own fortune in the apricot groves and scattered factories of a valley south of San Francisco, many of America’s most skillful engineers and physicists wanted to work with him. He was, after all, the man on the cover of Electronics magazine, peering into a microscope at his desk at Bell Labs; he had been fêted at the press conference announcing the transistor; there were rumors that a Nobel Prize might soon be his. He also regularly pointed out how stupid the staff were at every competing company. What young engineer wouldn’t want to travel to this semi-rural valley and become rich?

  They came, they saw Shockley in action, and they fled. But while the engineers he’d forced out of Bell Labs dispersed across the entire country, those he forced from his new company (modestly named after himself) liked the California sun so much that they didn’t go far. Shockley became a vast centrifuge, an inadvertent innovation machine. The bright people his reputation attracted quickly bonded with one another when they realized how awful he was, and kept those bonds when they were flung out to create their own firms nearby.

  He managed to lose not just Robert Noyce, who co-created the modern technique for printing vast numbers of transistors on individual chips, but also Gordon Moore, who co-founded Intel, the most successful of all companies for fabricating those chips. Noyce became a millionaire, Moore probably a billionaire, while Shockley—never making any money, unable to get the ingrates who flocked to him to recognize his genius—kept on repelling more ambitious bright engineers, who joined forces with his competitors. The apricot groves had a new name: Silicon Valley was born.

  And the world changed again.

  New technologies can transform a society, and from that valley came new technologies galore. Without transistors and the fast-switching computers they made possible, we’d have no cell phones or weather satellites or spy satellites; no CAT scans or MRI or GPS; no cruise missiles or smart bombs; no solar cells or digital cameras or night-vision goggles; no laptops; no spam, but also no e-mail and no Internet; no widespread credit cards or cash dispensers or scanners or spreadsheets. There’d have been no mapping of the human genome; no plasma-screen TVs; no CDs or LEDs or DVDs or iPods or in-flight screen entertainment. There’d be no billionaires named Gates or Jobs; no—and how this will date us—Amazon.com or eBay or Google or Pixar.

  At first it was easy to keep the world the way it had been, and just add on a few of the new items. But small gadgets can have unexpected effects. The first transistor radios went on sale in the 1950s, and the quantum effects they harnessed used so little power that their batteries were small. This meant kids could carry them around, which meant they no longer had to listen to the same music their parents did. Teenagers more and more formed their own subculture, and a new market for popular music was born. With cheap electric guitars and low-cost amplifying speakers—also made possible by silicon—small groups could match the volume of big bands. Obscure start-ups could flourish. Elvis and then Motown and the Rolling Stones appeared.

  Transistor technology on its own didn’t create rock and roll. There were lots of other trends at work—a big surge of young people coming of age in the postwar baby boom; a strong feeling after World War II that racism was unacceptable (which led to civil rights, and the sharing of black and white musical styles in Elvis’s first Memphis studio); ever more suburbs and affordable cars for kids to party in. But electronics accelerated all those trends, melding them in a way that might never have happened otherwise.

  The landscape changed. Huge chains of retailers could use computerized inventory control to fine-tune their offerings and to lower their costs in a way traditional stores couldn’t. Behemoths such as Wal-Mart began clomping across the landscape; malls became virtually indistinguishable from one another as they filled with the same trendy chain stores.

  Jobs changed. Executives found themselves checking their own spelling. Traditional manual occupations were sucked inside computer chips, and this changed the neighborhoods where those workers had lived. A man who works on the docks has clear rules at work, and passes on the notion of clear imperatives to his children. The children in that neighborhood might fight at school, but those fights also tend to have shared rules; teenage girls will be harassed, often roughly, but even that is usually limited by shared principles about dating, dress codes, and the like. Yet when the dock jobs go, replaced by computerized instructions for automatic cranes, children lose the role model of parents whom they see having to follow such rules. Solid blue-collar jobs have crashed since the 1970s. Neighborhoods left behind have fallen apart. Fresh mixes have been created.

  Even in richer areas, traditional notions of community began to fade. Radio and TV, when they began, sent out signals that flew equally in all directions, hence the word broadcasting. This encouraged simple national brands and simple large clusters of consumers. Even when mail-order catalogs were delivered, there were only a few sorts, sent in huge batches that couldn’t target niche customers. Computer switches, however, can quickly sort through many, many choices. This led to targeted direct mail (by the early 1960s), and soon after to increasingly specialized radio stations, cable television stations, and the like. People no longer had to respond as part of a group. There was more nomadism, and more starkly individual choices of where to live, whom to marry, how to worship, and when to vote. Strange things happened. An older generation, which accepted that exercise was something a few professional athletes might have to do, produced offspring who would go to large rooms, talk to nobody, and struggle to enhance their own utterly individual muscle tone.

  Democracy changed. Before the computerized satellite links of the early 1960s, ordinary people never expected to see vivid, real-time footage of foreign disasters, rebellions, or famines. (All they occasionally saw were brief extracts, edited into newsreels at the movies.) It was natural to defer to government leaders, who had their own superior sources of information—generally ambassadors or other emissaries, who communicated with them at relatively great expense by telex, telegram, or plane. But now? At the moment when a fresh television image rushes in from abroad, no one knows more than anyone else. A new mistrust of government was born—helped as always by other factors—and has persisted ever since.

  Turing’s progeny spurred one another on. Probably the last computers that could be entirely comprehended by one person were built in the late 1950s. But if you use a slide rule and a draftsman’s drawing tools to work out the connections needed in wiring up a thousand switches for a computer, you’re not going to go back to the slide rule when you want to try wiring up a million switches for a more advanced computer. You’ll simply use the first computer to do it for you. Computers have begotten computers of escalating internal power ever since.

 
