THE FARM
The university at the center of this sleepy outpost had been unusual from the very start. Opened in 1891 by Southern Pacific Railroad tycoon Leland Stanford and his wife, Jane, Stanford University was no liberal arts enclave like Harvard and Yale, founded to educate clergymen and the literate gentry. The Stanfords weren’t pointy-headed intellectuals, and their charge was pragmatic: their university’s purpose was “to qualify its students for personal success, and direct usefulness in life.” They admitted women. Tuition was free. Stanford wasn’t a place for the elite; it was for working-class strivers like Leland Stanford once had been himself. It was to be a place where anyone, no matter how humble, might one day become a tycoon too. “We have started you both on the same equality,” Jane Stanford told the men and women of the first entering class, “and we hope for the best results.”2
What’s more, the founders bequeathed the school thousands of acres of land with the proviso that the property could be leased, but never sold. A grand campus of sandstone and red tile—so different from the Gothic quads of the Ivy League—was only one piece of “the Farm,” which stretched west, north, and south into open fields and grass-covered hills where horses and sheep grazed and students and professors picnicked and hiked.
Before World War II, a very similar landscape had stretched north and south from Palo Alto along the fertile Santa Clara Valley. The region had been best known for its prodigious harvest of prunes, occasionally making national headlines during its annual “prune week” (slogan: “Eat five prunes a day and keep the doctor away”). Travel magazines wrote glowing features and local writers penned sentimental poems praising “the Valley of Heart’s Delight.”3
Plenty of fruit trees remained in the middle of the 1950s, but prunes were beginning to give way to bigger things. Dotted with military installations and airplane factories, the state of California was the number-one recipient of the giant swell of federal defense spending during and after World War II. During the war, soldiers streamed through San Francisco on their way to the Pacific Theater. Thousands more civilians had migrated west for work in the state’s shipyards and armaments factories. Many stayed for good, and the military installations and defense contractors webbed up and down the West Coast roared to life as the Cold War intensified. While Seattle, Los Angeles, and parts of the Bay Area specialized in building big—warplanes, battleships—the towns and cities that stretched south down the San Francisco Peninsula specialized in building small. Thanks to federal spending, the Valley of Heart’s Delight was quickly becoming the valley of sophisticated electronics and instrumentation.4
The postwar explosion didn’t come out of nowhere. A few Bay Area start-ups had long been in the business of making the delicate, high-tech components that gave larger computing and telecommunications machines their power: vacuum tubes, wireless transmitters, electromagnetic tape.
Stanford was a critical catalyst from the start. Seed capital from the university’s first president, David Starr Jordan, and other faculty investors started up a wireless radio company called Federal Telegraph in a Palo Alto bungalow in 1909. Stanford grad Charles Litton worked at Federal Telegraph, then left to start his own wireless company in a Redwood City backyard in 1932. Litton’s fellow ham radio enthusiasts Bill Eitel and Jack McCullough left a company founded by another Stanford alum to start a pioneering manufacturer of exquisite, expensive vacuum tubes needed to power radar systems. Faculty member William Hansen partnered with Sigurd and Russell Varian to invent the klystron—a foundational microwave technology—in a Stanford laboratory in 1937. The brothers commercialized their innovation a decade later as Varian Associates. Last but hardly least, Stanford alums Bill Hewlett and David Packard scraped together $595 to start an electronic instrumentation firm in a Palo Alto garage in 1939.5
The military had a prewar presence as well. In 1930, the heyday of giant gas-filled dirigibles, the Santa Clara Valley beat out San Diego to become home to a major U.S. Naval blimp station. The tipping point came when a group of local boosters pooled resources to buy the thousand acres the Navy needed. It also didn’t hurt that the man signing the authorizing legislation for the station, President Herbert Hoover, was a Stanford alumnus with strong local ties. The result was Moffett Field, a major hub of aviation and aerospace research sprawling along the newly constructed Bayshore Freeway, straddling the border of Mountain View and Sunnyvale. It opened in 1933; the National Advisory Committee for Aeronautics (NACA) opened a research center next door six years later.6
As tempting as it might be to read history backward, however, the San Francisco Bay Area was not unique. In the furiously industrious early decades of the twentieth century, you could find similar little clusters of young entrepreneurs in cities across the continent. High-tech products blossomed everywhere: automobiles in Detroit, biplanes in Dayton, cameras in Rochester, lightbulbs in Cleveland, radios in New York. Military installations dotted the landscape too. Yet Northern California quickly pulled away from, and ultimately surpassed, all the competition. The region did so because of extraordinary opportunity flowing its way in the 1950s—and because of extraordinary people who took advantage of that opportunity.
ARMY OF BRAINS
It began with the Bomb. To scientists and politicians alike, the technological mobilization of World War II—and its awesome and ominous centerpiece, the Manhattan Project—showed how much the United States could accomplish with massive government investment in high technology and in the men who made it. America’s wartime investment not only marshaled physics to produce the most fearsome and powerful weapon in human history, but it also catalyzed development of sophisticated electronic communications networks and the first all-digital computer—technologies that undergirded the information age to come.
Even before peace had been declared, leading scientists had started to make the case for continued public spending on blue-sky technological discovery. As Cold War chill set in after the hot war’s end, national security now depended on having the most sophisticated weapons in one’s arsenal. Government spending on research and development shot up further. Thus began an extraordinary market disruption that accelerated the growth of new companies, sectors, and markets.
The human catalyst for much of this change was an engineering professor with an odd name and a talent for connecting people and ideas: MIT’s Vannevar Bush, tapped by President Roosevelt to run the wartime Office of Scientific Research and Development (OSRD), an operation that mobilized thousands of PhDs and spent half a billion government dollars by war’s end. Bush also was co-founder of one of the century’s earliest high-tech start-ups, Raytheon, founded in 1922 to market gas rectifier tubes that provided an inexpensive and efficient power source for home radios.7
As the architects of the atom bomb labored in secrecy, Bush became the most prominent public face of the government’s research efforts. A 1944 Time cover story dubbed him the “General of Physics.” Yet the MIT man was more than a mere political operator or bureaucrat. He was an audacious, prescient technical thinker. In 1945, he published a long essay in The Atlantic that proposed a mechanized system for organizing and accessing knowledge. Bush called it the “memex,” a machine he cheerfully described as being designed for “the task of establishing useful trails through the enormous mass of the common record.” Generations since have hailed the memex as inspiration for the hypertextual universe of the World Wide Web.8
Bush’s more immediate impact, however, came from another of his networked creations: the OSRD’s great army of scientific men and university research laboratories, all mobilized at breakneck speed to provide the computational power needed to win the war. Making modern weaponry—from the millions of conventional weapons disgorged from the bellies of B-29s to the singular atomic bomb itself—was at its core a math problem, demanding thousands of lightning-fast calculations to determine a missile’s arc, a radar system’s nodes, a mushroom cloud’s spread. “More than a hundred thousand trained brains working as one,” was how The New York Times described Bush’s scientists at the time, and elite brains they were: having to mobilize a team quickly, the man from MIT drew on the people and institutions he knew best.9
One of those people entering this close and clubby world of scientific men was a genial but intense Californian who had been Bush’s very first doctoral student: Stanford’s Frederick Emmons Terman. Like so many who would later make their mark on the tech industry, Fred Terman was a faculty brat, the son of famed Stanford psychologist and IQ-testing pioneer Lewis Terman. (The elder Terman’s research into intelligence testing was only a shade removed from the stridently racist work of Stanford’s founding president David Starr Jordan, whose investigations of human “fitness” earned the university its now-notorious early reputation as a hub of eugenics research.) Fred Terman forged a wholly different path, heading east in the early 1920s to study electrical engineering at a place that at the time was still known as “Boston Tech.”10
After finishing his MIT degree in two years, Fred Terman returned home to become the hardest-working man on the Stanford faculty, working seven days a week and relishing every moment. He spent his brief moments of leisure playing competitive bridge. When once asked why he never took vacations, Terman replied, “Why bother when your work is so much fun?” At one point, he was the main advisor for half of the graduate students in his department. Terman also embarked on a lifelong project of encouraging the best of them to strike out on their own as entrepreneurs. Nine months before Hitler’s invasion of Poland, it was Terman who had persuaded two of his most beloved Stanford protégés, Hewlett and Packard, to start up their eponymous company across town.11
Although deeply loyal to his university and hometown, when Terman received Bush’s call to come back to Boston and join the grand project of war work, he didn’t hesitate. Destination: Harvard, where Terman headed up a lab devoted to “radar counter-measures.” The furious pace of war work was a perfect fit. “Everything is just ducky here,” wrote his wife, Sybil, to her sister, but “Fred is so busy, I don’t see how he lives.”12
Terman’s wartime experience was typical. Needing to move fast and produce immediately applicable results, Bush’s research operation worked largely through outsourcing, sending contracts for basic and applied research to university laboratories, and pulling in a team of experts from all over—even if it meant asking people to move across the country for the duration of the war. The prominent role of government was a significant culture shift for American universities, as was the scale of spending. Where before a department counted itself lucky to receive a private donation of a few thousand dollars for an industrial research project or a modest amount of foundation work, now it regularly received government grants many orders of magnitude larger.13
The bounty wasn’t evenly spread, however. Massachusetts institutions alone received one-third of all the money spent by OSRD. Most of that third went to one place: MIT’s Radiation Laboratory, or “Rad Lab,” tasked with developing a top secret system of radar technology to win the war. (The Rad Lab was offense, and Fred Terman’s lab was defense, developing technologies to jam enemy radar.) Thanks to the fat contracts going to the Manhattan Project architects affiliated with the University of California, Berkeley, California came in second. New York State, home to most of the nation’s largest electronics companies, followed closely on its heels. Everywhere else lagged far behind.14
BIG SCIENCE
A few months after Bush’s Time cover splashed over the nation’s newsstands, Franklin Roosevelt wrote an open letter to his “General of Physics” asking for formal recommendations of how the government might encourage research on a permanent basis. The language of the president’s request was as audacious as Bush’s emergent idea of the memex, and likely took editorial cues from the ebullient science advisor himself. “New frontiers of the mind are before us,” Roosevelt wrote Bush, “and if they are pioneered with the same vision, boldness, and drive with which we have waged this war we can create a fuller and more fruitful employment and a fuller and more fruitful life.”15
An already ailing FDR did not live to see the results of his pitch. In July 1945, Bush delivered his report to a new president, Harry Truman. Titled Science, the Endless Frontier, the brief took up Roosevelt’s high-flying, politically evocative language and ran with it. “The pioneer spirit is still vigorous within this nation,” wrote Bush. “It is in keeping with the American tradition—one which has made the United States great—that new frontiers shall be made accessible for development by all American citizens.” Scientific discovery could be America’s manifest destiny for the twentieth century, just as westward conquest had been for the nineteenth. The way to do it: a new agency run by scientific experts, a “National Research Foundation.” The next month, the detonation of atomic weapons over Hiroshima and Nagasaki gave Bush’s argument about technological capacity a grim and forceful proof point.16
From the very beginning, then, there was a cognitive dissonance in the way America’s postwar politicians and technologists talked about the world-changing upsides of high-tech investment—expanding the frontiers of knowledge, pushing out into the unknown, bettering society, furthering democracy—and the far more bellicose and disquieting reasons that this investment happened in the first place. Vannevar Bush talked about an “endless frontier,” but political leaders agreed to such lavish public spending on science and tech in order to fight endless war. Bush’s idea of a public agency devoted to funding basic research (now renamed the National Science Foundation, or NSF) became reality after Congress received the alarming news in 1949 that the Soviets had managed to build a bomb of their own, showing American leaders that the USSR clearly had a scientific capacity far greater than previously imagined. One year later, as the U.S.-Soviet conflict heated up with war in Korea, the Truman White House issued NSC-68, authorizing a spike in military spending—and in this age of physics-driven weaponry, this meant more money for science and technology. By the end of 1951, the U.S. had put more than $45 billion into military procurement.17
The “New Look” military strategy adopted by Dwight Eisenhower and his Secretary of State John Foster Dulles beginning in 1953 accelerated the shift toward advanced electronics, moving defense spending away from ground troops and conventional arms and toward ever-more-sophisticated weapons and the calculating machines that helped design them. Military planners reckoned that they’d need the electronics industry to increase its production levels five times over to accommodate national security demands.18
The significance of the push wasn’t just the considerable amount of money America’s political leaders agreed to start spending on science in the decade following the war—it was how they spent it. For the era also was the height of McCarthyite witch hunts, when bold, centralized government planning efforts smacked of socialism and authoritarianism. Thus, the National Science Foundation followed the precedent of Vannevar Bush’s OSRD: it did not conduct basic research itself, but allocated grants to university researchers through a highly competitive selection process. “Every idea,” wrote NSF officials in their first annual report, “must compete against all other ideas in the market place of science.”19
The same thing happened on the “D” side of R&D: the Army and Navy outsourced the job of designing and building high-tech weapons to private electronics and aerospace companies, reanimating industries that had boomed during wartime and slumped after V-J Day. Defense Department officials persuaded Congress to authorize bigger tax breaks for electronics plant construction, and bought companies the expensive machines needed to build military-grade equipment.
New technologies saturated every branch and activity of the Cold War military machine. From walkie-talkies soldiers carried in their pockets to radar systems strung across continents, electronic communications equipment powered nearly every aspect of the modern military. A single bomber now carried twenty different pieces of electronics, each costing as much as the whole plane had merely one decade before. Supersonic planes demanded sophisticated electronics to help their human pilots, because “the airplanes simply blast through space faster than the human mind can think,” as one aerospace executive put it. By 1955, thanks to the money being shoveled in the electronics industry’s direction, its revenue had reached $8 billion a year—making electronics the third-largest industry in the United States, behind only autos and steel.20
THE YOUNG AND TECHNICAL
The military-industrial complex ran on people power too. Ramping up R&D required thousands of physicists, engineers, mathematicians, and chemists with cutting-edge skills and a Terman-esque work ethic. Need far outpaced supply. Altogether, the nation’s universities had produced only 416 physicists and 378 mathematicians between 1946 and 1948. It was a classic chicken-and-egg problem. The military needed the best scientists. At the time, nearly all of them worked in universities. Lure these people away to work on defense projects, and frontier-busting basic research would suffer—as would the capacity of universities to produce more scientists. Pentagon spokesman Eric Walker observed dourly, “we are adding to a strategic national resource at a slow rate indeed.” By 1952, the NSF estimated that the U.S. was close to 100,000 scientists short. But military planners put a positive spin on it. “In a sense,” reckoned Truman’s defense mobilization chief, the former GE executive “Electric Charlie” Wilson, “shortages are a symbol of our progress.”21