The Innovators


by Walter Isaacson


  As the wrangling dragged on, the patent office confused things a bit further by ruling, in June 1964, on Kilby’s original application—and granting it. That made the priority contest all the more important. It was not until February 1967 that the verdict finally came, in Kilby’s favor. It had been eight years since he had filed for his patent, and now he and Texas Instruments were declared the inventors of the microchip. Except that didn’t end things. Fairchild appealed, and the Court of Customs and Patent Appeals, after hearing all the arguments and testimony, ruled in November 1969 the other way. “Kilby has not demonstrated,” the appeals court declared, “that the term ‘laid down’ had . . . or has since acquired a meaning in electronic or semiconductor arts which necessarily connotes adherence.”14 Kilby’s lawyer tried to appeal to the U.S. Supreme Court, which declined to take the case.

  Noyce’s victory, after a decade of back-and-forth and more than a million dollars in legal fees, turned out to mean little. The subhead on the small story in Electronic News was “Patent Reversal Won’t Change Much.” By this point the legal proceedings had become almost irrelevant. The market for microchips had exploded so rapidly that the businesslike folks at Fairchild and Texas Instruments realized that the stakes were too high to leave to the legal system. In the summer of 1966, three years before the final legal resolution, Noyce and his Fairchild lawyers met with the president and counsel of Texas Instruments and hammered out a peace treaty. Each company granted that the other had some intellectual property rights to the microchip, and they agreed to cross-license to each other whatever rights they had. Other companies would have to make licensing deals with both, usually paying a royalty fee that totaled about 4 percent of their profit.15

  So who invented the microchip? As with the question of who invented the computer, the answer cannot be settled simply by reference to legal rulings. The nearly simultaneous advances made by Kilby and Noyce showed that the atmosphere of the time was primed for such an invention. Indeed, many people around the country and world, including Werner Jacobi at Siemens in Germany and Geoffrey Dummer of the Royal Radar Establishment in Britain, had earlier proposed the possibility of an integrated circuit. What Noyce and Kilby did, in collaboration with teams at their companies, was figure out practical methods to produce such a device. Although Kilby was a few months earlier in coming up with a way to integrate components on a chip, Noyce did something more: he devised the right way to connect these components. His design could be mass-produced efficiently, and it became the general model for future microchips.

  There is an inspiring lesson in how Kilby and Noyce personally handled the question of who invented the microchip. They were both decent people; they came from tight-knit small communities in the Midwest and were well grounded. Unlike Shockley, they did not suffer from a toxic mix of ego and insecurity. Whenever the topic of credit for the invention came up, each was generous in praising the contributions of the other. It soon became accepted to give them joint credit and refer to them as coinventors. In one early oral history, Kilby gently grumbled, “It doesn’t fit with what I understand to be co-invention, but that’s become accepted.”16 But he, too, eventually embraced the idea and was ever afterward gracious about it. When Craig Matsumoto of Electronic Engineering Times asked him about the controversy many years later, “Kilby heaped praise on Noyce and said the semiconductor revolution came from the work of thousands, not from one patent.”17

  When Kilby was told that he had won the Nobel Prize in 2000, ten years after Noyce had died, among the first things he did was praise Noyce. “I’m sorry he’s not still alive,” he told reporters. “If he were, I suspect we’d share this prize.” When a Swedish physicist introduced him at the ceremony by saying that his invention had launched the global Digital Revolution, Kilby displayed his aw-shucks humility. “When I hear that kind of thing,” he responded, “it reminds me of what the beaver told the rabbit as they stood at the base of Hoover Dam: ‘No, I didn’t build it myself, but it’s based on an idea of mine.’ ”18

  MICROCHIPS BLAST OFF

  The first major market for microchips was the military. In 1962 the Strategic Air Command designed a new land-based missile, the Minuteman II; each missile would require two thousand microchips just for its onboard guidance system. Texas Instruments won the right to be the primary supplier. By 1965 seven Minutemen were being built each week, and the Navy was also buying microchips for its submarine-launched missile, the Polaris. With a coordinated astuteness not often found among military procurement bureaucracies, the designs of the microchips were standardized. Westinghouse and RCA began supplying them as well. So the price soon plummeted, until microchips were cost-effective for consumer products and not just missiles.

  Fairchild also sold chips to weapons makers, but it was more cautious than its competitors about working with the military. In the traditional military relationship, a contractor worked hand in glove with uniformed officers, who not only managed procurement but also dictated and fiddled with design. Noyce believed such partnerships stifled innovation: “The direction of the research was being determined by people less competent in seeing where it ought to go.”19 He insisted that Fairchild fund the development of its chips using its own money so that it kept control of the process. If the product was good, he believed, military contractors would buy it. And they did.

  America’s civilian space program was the next big booster for microchip production. In May 1961 President John F. Kennedy declared, “I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth.” The Apollo program, as it became known, needed a guidance computer that could fit into a nose cone. So it was designed from scratch to use the most powerful microchips that could be made. The seventy-five Apollo Guidance Computers that were built ended up containing five thousand microchips apiece, all identical, and Fairchild landed the contract to supply them. The program beat Kennedy’s deadline by just a few months; in July 1969 Neil Armstrong set foot on the moon. By that time the Apollo program had bought more than a million microchips.

  These massive and predictable sources of demand from the government caused the price of each microchip to fall rapidly. The first prototype chip for the Apollo Guidance Computer cost $1,000. By the time they were being put into regular production, each cost $20. The average price for each microchip in the Minuteman missile was $50 in 1962; by 1968 it was $2. Thus was launched the market for putting microchips in devices for ordinary consumers.20

  The first consumer devices to use microchips were hearing aids because they needed to be very small and would sell even if they were rather expensive. But the demand for them was limited. So Pat Haggerty, the president of Texas Instruments, repeated a gambit that had served him in the past. One aspect of innovation is inventing new devices; another is inventing popular ways to use these devices. Haggerty and his company were good at both. Eleven years after he had created a huge market for inexpensive transistors by pushing pocket radios, he looked for a way to do the same for microchips. The idea he hit upon was pocket calculators.

  On a plane ride with Jack Kilby, Haggerty sketched out his idea and handed Kilby his marching orders: Build a handheld calculator that can do the same tasks as the thousand-dollar clunkers that sit on office desks. Make it efficient enough to run on batteries, small enough to put into a shirt pocket, and cheap enough to buy on impulse. In 1967 Kilby and his team produced almost what Haggerty envisioned. It could do only four tasks (add, subtract, multiply, and divide) and was a bit heavy (more than two pounds) and not very cheap ($150).21 But it was a huge success. A new market had been created for a device people had not known they needed. And following the inevitable trajectory, it kept getting smaller, more powerful, and cheaper. By 1972 the price of a pocket calculator had dropped to $100, and 5 million units were sold. By 1975 the price was down to $25, and sales were doubling every year. In 2014 a Texas Instruments pocket calculator cost $3.62 at Walmart.

  MOORE’S LAW

  That became the pattern for electronic devices. Every year things got smaller, cheaper, faster, more powerful. This was especially true—and important—because two industries were growing up simultaneously, and they were intertwined: the computer and the microchip. “The synergy between a new component and a new application generated an explosive growth for both,” Noyce later wrote.22 The same synergy had happened a half century earlier when the oil industry grew in tandem with the auto industry. There was a key lesson for innovation: Understand which industries are symbiotic so that you can capitalize on how they will spur each other on.

  If someone could provide a pithy and accurate rule for predicting the trend lines, it would help entrepreneurs and venture capitalists to apply this lesson. Fortunately, Gordon Moore stepped forward at that moment to do so. Just as microchip sales were starting to skyrocket, he was asked to forecast the future market. His paper, titled “Cramming More Components onto Integrated Circuits,” was published in the April 1965 issue of Electronics magazine.

  Moore began with a glimpse of the digital future. “Integrated circuits will lead to such wonders as home computers—or at least terminals connected to a central computer—automatic controls for automobiles, and personal portable communications equipment,” he wrote. Then he produced an even more prescient prediction that was destined to make him famous. “The complexity for minimum component costs has increased at a rate of roughly a factor of two per year,” he noted. “There is no reason to believe it will not remain nearly constant for at least ten years.”23

  Roughly translated, he was saying that the number of transistors that could be crammed, cost-effectively, onto a microchip had been doubling every year, and he expected it to do so for at least the next ten years. One of his friends, a professor at Caltech, publicly dubbed this “Moore’s Law.” In 1975, when the ten years had passed, Moore was proved right. He then modified his law by cutting the predicted rate of increase in half, prophesying that the future numbers of transistors crammed onto a chip would show “a doubling every two years, rather than every year.” A colleague, David House, offered a further modification, now sometimes used, which said chip “performance” would double every eighteen months because of the increased power as well as the increased number of transistors being put onto a microchip. Moore’s formulation and its variations proved useful for at least the subsequent half century, helping to chart the course for one of the greatest bursts of innovation and wealth creation in human history.
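  Stated as a formula (a modern paraphrase for clarity; the symbols below are this paraphrase’s, not Moore’s own notation), the law is a simple exponential:

```latex
% Moore's prediction as an exponential. N(t) is the number of transistors
% that can be cost-effectively placed on a chip in year t, N_0 is the count
% in a baseline year t_0, and T is the doubling period.
\[
  N(t) = N_0 \cdot 2^{(t - t_0)/T}
\]
% Original 1965 statement:  T = 1 year
% Moore's 1975 revision:    T = 2 years
% House's variant applies the same form to chip "performance,"
% with T = 1.5 years (eighteen months).
```

  Even the slower 1975 rate compounds staggeringly: a doubling every two years works out to a factor of $2^{10} \approx 1{,}000$ every twenty years.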

  Moore’s Law became more than just a prediction. It was also a goal for the industry, which made it partly self-fulfilling. The first such example occurred in 1964, as Moore was formulating his law. Noyce decided that Fairchild would sell its simplest microchips for less than they cost to make. Moore called the strategy “Bob’s unheralded contribution to the semiconductor industry.” Noyce knew that the low price would cause device makers to incorporate microchips into their new products. He also knew that the low price would stimulate demand, high-volume production, and economies of scale, which would turn Moore’s Law into a reality.24

  * * *

  Fairchild Camera and Instrument decided, not surprisingly, to exercise its right to buy out Fairchild Semiconductor in 1959. That made the eight founders rich but sowed seeds of discord. The corporation’s East Coast executives refused to give Noyce the right to hand out stock options to new and valued engineers, and they sucked up the semiconductor division profits to fund less successful investments in more mundane realms, such as home movie cameras and stamp machines.

  There were also internal problems in Palo Alto. Engineers began defecting, thus seeding the valley with what became known as Fairchildren: companies that sprouted from spores emanating from Fairchild. The most notable came in 1961, when Jean Hoerni and three of the other defectors from Shockley left Fairchild to join a startup, funded by Arthur Rock, that became Teledyne. Others followed, and by 1968 Noyce himself was ready to leave. He had been passed over for the top corporate job at Fairchild, which ticked him off, but he also realized that he did not really want it. Fairchild, the corporation as a whole and even the semiconductor division in Palo Alto, had become too big and bureaucratic. Noyce yearned to shed some managerial duties and return to being close to the lab.

  “How about starting a new company?” he asked Moore one day.

  “I like it here,” Moore replied.25 They had helped to create the culture of the California tech world, in which people left established companies to form new ones. But now, as they were both hitting forty, Moore no longer had the urge to jump off the roof in a hang glider. Noyce kept pressing. Finally, as the summer of 1968 approached, he simply told Moore he was leaving. “He had a way of making you want to take a leap with him,” Moore said many years later, laughing. “So finally I said, ‘Okay, let’s go.’ ”26

  “As [the company] has grown larger and larger, I have enjoyed my daily work less and less,” Noyce wrote in his letter of resignation to Sherman Fairchild. “Perhaps this is partly because I grew up in a small town, enjoying all the personal relationships of a small town. Now we employ twice the total population of my largest ‘home town.’ ” His desire, he said, was to “get close to advanced technology again.”27

  When Noyce called Arthur Rock, who had put together the financing deal that launched Fairchild Semiconductor, Rock immediately asked, “What took you so long?”28

  ARTHUR ROCK AND VENTURE CAPITAL

  In the eleven years since he had assembled the deal for the traitorous eight to form Fairchild Semiconductor, Arthur Rock had helped to build something that was destined to be almost as important to the digital age as the microchip: venture capital.

  For much of the twentieth century, venture capital and private equity investing in new companies had been mainly the purview of a few wealthy families, such as the Vanderbilts, Rockefellers, Whitneys, Phippses, and Warburgs. After World War II, many of these clans set up firms to institutionalize the business. John Hay “Jock” Whitney, an heir to multiple family fortunes, hired Benno Schmidt Sr. to form J. H. Whitney & Co., which specialized in what they originally called “adventure capital” to fund entrepreneurs with interesting ideas who could not get bank loans. The five sons and one daughter of John D. Rockefeller Jr., led by Laurance Rockefeller, started a similar firm, which eventually became Venrock Associates. That same year, 1946, saw the birth of the most influential entry, one that was based on business acumen rather than family wealth: the American Research and Development Corporation (ARDC). It was founded by Georges Doriot, a former dean of the Harvard Business School, in partnership with a former MIT president, Karl Compton. ARDC scored big with a startup investment in Digital Equipment Corporation in 1957, which was worth five hundred times as much when the company went public eleven years later.29

  Arthur Rock took this concept west, ushering in the silicon age of venture capital. When he put together Noyce’s traitorous eight with Fairchild Camera, Rock and his company took a stake in the deal. After that, he realized that he could raise a fund of money and do similar deals without relying on one corporate patron. He had a background in business research, a love of technology, an intuitive feel for business leadership, and a lot of East Coast investors he had made happy. “The money was on the East Coast but the exciting companies were in California, so I decided to move west knowing that I could connect the two,” he said.30

  Rock grew up the son of Russian Jewish immigrants in Rochester, New York, where he worked as a soda jerk in his father’s candy store and developed a good feel for personalities. One of his key investment maxims was to bet primarily on the people rather than the idea. In addition to going over business plans, he conducted incisive personal interviews with those who sought funding. “I believe so strongly in people that I think talking to the individual is much more important than finding out too much about what they want to do,” he explained. On the surface, he wore the cloak of the curmudgeon, with a gruff and taciturn style. But those who looked at his face closely enough could tell from the light in his eyes and the hints of a smile that he enjoyed people and had a warm sense of humor.

  When he got to San Francisco, he was introduced to Tommy Davis, a talkative deal maker who was investing the money of the Kern County Land Co., a cattle and oil empire flush with cash. They went into business together as Davis and Rock, raised $5 million from Rock’s East Coast investors (as well as some of the Fairchild founders), and started funding new companies in return for a chunk of the equity. Stanford’s provost Fred Terman, still seeking to build his university’s ties to the growing tech boom, encouraged his engineering professors to spend time advising Rock, who took a night course in electronics at the university. Two of his first bets were on Teledyne and Scientific Data Systems, which both paid off handsomely. By the time Noyce called him about finding an exit strategy from Fairchild in 1968, Rock’s partnership with Davis had amiably dissolved (their investments had shot up thirtyfold in seven years) and he was on his own.

  “If I wanted to start a company,” Noyce asked, “could you find me the money?” Rock assured him it would be easy. What could better fit his theory that you place your money on the jockeys—that you invest based on your assessment of the people running the company—than an enterprise that would be led by Robert Noyce and Gordon Moore? He barely asked what they were going to make, and at first he didn’t even think they needed to do a business plan or description. “It was the only investment that I’ve ever made that I was 100 percent sure would succeed,” he later claimed.31

 
