The American Experiment

by David M. Rubenstein


  Transistors, microprocessors, computer software, personal computers, smartphones, search engines, social media platforms, and artificial intelligence, among other tech innovations and inventions, created infinite new ways to solve problems, get information, communicate, conduct business, and live.

  And the geniuses, entrepreneurs, and technology creators who produced this revolution became household names for their creativity, drive, wealth, philanthropy, and, in some cases, eccentricities and lifestyles. David Packard, William Hewlett, Robert Noyce, Gordon Moore, Andy Grove, Steve Jobs, Bill Gates, Jeff Bezos, Sergey Brin, Larry Page, Mark Zuckerberg, and Elon Musk, among others, became tech and business icons as they helped shape the modern world.

  Were they geniuses? Did they get lucky? What were their secrets? There is no simple answer, but Walter Isaacson provides perceptive insights into these questions in his book The Innovators.

  Walter is an extraordinary writer, while also having many other legendary intellectual skills. He has focused his best-selling books on individuals who could fairly be seen as geniuses—Henry Kissinger, Benjamin Franklin, Albert Einstein, Steve Jobs, Leonardo da Vinci, and, most recently, Jennifer Doudna (recipient of the 2020 Nobel Prize in Chemistry for the CRISPR gene-editing process).

  Those individuals, as well as the many others described in The Innovators, have several things in common. One is that they tended to build on what had already been discovered or invented—virtually nothing was created out of “whole cloth.” Another is that they often worked as part of a team. Lone, madcap geniuses tended to be an image rather than the reality.

  This interview took place virtually on October 5, 2020. I have interviewed Walter about almost all of his books. How he wrote so many of them while having a full-time job like chairman and CEO of CNN or president and CEO of the Aspen Institute is something I cannot really understand. He says he writes at night and is not distracted by television. He does not own one. If I had only known that was the secret to great writing.

  * * *

  DAVID M. RUBENSTEIN (DR): In your research and writing about great innovators like Leonardo da Vinci and Franklin, Einstein and Jobs, have you found any common traits? Do they tend to be the leaders, the loners, often associated with individual creativity and genius? Or is the reality more of a collaborative effort?

  WALTER ISAACSON (WI): There are two interrelated traits that are common to all the innovators I’ve written about. The first of these is curiosity—pure and passionate and playful curiosity about everything. Like Benjamin Franklin, as a teenager, going over to England for the first time and measuring the water in the ocean because he’s trying to figure out how the Gulf Stream works. Or Leonardo da Vinci, my favorite, who in his notebooks writes things in the margins like “Describe the tongue of the woodpecker.”

  Who wakes up in the morning and wants to know what the tongue of a woodpecker looks like? A curious person does. And that’s Leonardo.

  Einstein and Leonardo both wrote in their notebooks, “Why is the sky blue?” Now, we all see blue skies, but we forget to be curious about “Why is it blue?” And of course Steve Jobs had a voracious curiosity.

  The other thing about their curiosity was that it crossed all sorts of fields. Whether it was Steve Jobs being curious about calligraphy and coding or Leonardo being curious about art and anatomy, they wanted to know everything you could know about everything knowable.

  That curiosity leads you to be interested in all sorts of disciplines, which means that you can stand at the intersection of the arts and sciences. To me, that’s where creativity occurs—at that intersection. People can have a foot both in engineering and in the humanities or in science and the arts.

  So, a wide-ranging curiosity, one that allows you to see that patterns exist across nature and how those patterns ripple, whether it’s Leonardo da Vinci’s spirals of water becoming the curls of hair of the Mona Lisa or Steve Jobs understanding the beauty of calligraphy and ingraining that in his first Macintosh.

  DR: Is there something about the way the U.S. developed from its early days that encouraged innovation or creativity?

  WI: Ingrained in the DNA of our country is that the people who came here were either pioneers or refugees. They were the second and third children who had to strike out on their own. They were people escaping oppression and looking for freedom. They were going on an errand into the wilderness.

  They are people who are used to uprooting, changing their minds, and being part of a frontier. Whether that was the literal frontier that existed until around 1900 or things like the electronic frontier, people in the United States were more willing to uproot from their Old World and take the risks of embarking on an errand into the wilderness.

  DR: In recent decades, the science and technology worlds appear to be dominated by American companies. Is that because the U.S. seems to be more encouraging of innovation than, to mention one area, Europe?

  WI: It’s partly because the U.S. has less regulation and allows more freedom, and the fields in which there was less regulation were the ones where the most innovation happened. We think that the U.S. is the most innovative because we look at things like the Internet and computers and social networks and social media, which tended to be rather unregulated things. In fields that are more regulated, whether it be physical things like batteries and nuclear power or air travel or even, to some extent, pharmaceuticals, the U.S. doesn’t have quite the same advantage.

  Secondly, the U.S. is better at allowing people to take risks. Especially out in Silicon Valley, if you haven’t failed two or three times, nobody’s going to take you seriously. Whereas in Europe, if you fail once or twice, you’re probably not going to get your foot in the door looking for financial backing.

  DR: When did American leadership in such areas, like computers and semiconductors, really start?

  WI: It happens right after World War II. During the war, Germany, Britain, and the United States were all developing digital computers, mainly to calculate missile trajectories and to meet other wartime needs.

  As Leonardo da Vinci knew from working for the warlord Cesare Borgia and the duke of Milan, war tends to stimulate technology. And that’s what happened with computers.

  But the genius in America is that right after World War II, leaders like Vannevar Bush came up with the concept that we had to have science as the next frontier. They said that there was going to be a three-way partnership between universities, government, and corporate America, that the new types of labs for computers weren’t going to just be done in the government, the way the atomic bomb was done with the Manhattan Project. We were going to have places like RAND and Bell Labs, and the government was going to create the National Science Foundation to give grants to universities to do research.

  ENIAC was really the first general-purpose computer, invented by the War Department and the University of Pennsylvania during World War II. That spins out into a private company, which becomes UNIVAC and eventually Sperry Rand and Unisys. The ability to have that three-way partnership distinguished the United States from other countries.

  DR: What was the impact on innovation from returning World War II veterans who had technology training?

  WI: Let me give a personal account. Innovation in America is not just done by huge companies; it’s about thousands of foot soldiers in the progress of innovation.

  My father left Tulane his senior year, with six buddies from engineering school, to make sure they could join the navy before the end of World War II. They were trained in radar and supply chains and even refrigeration and sent to the South Pacific.

  When my father got back in 1947, he became an electrical engineer, because of that training. With his buddies, he invented new ways to air-condition department stores and movie theaters in New Orleans.

  And he was just one of tens of thousands of returning World War II veterans who got their technology training but also learned to take risks that you have to take during wartime. He became a great innovator as well as a small-business owner in New Orleans.

  DR: Did the growth of venture capital after World War II, and particularly from the 1960s onward, have an impact on fostering innovation?

  WI: When we think of great innovators we often think of scientists or engineers. But one of the most important inventions that happens in the 1960s is venture capital. It’s people like Arthur Rock, who had worked for investment banks in New York, who decides to go west and invest in new ventures.

  Up until then you had the Rockefeller family, Laurance Rockefeller and his siblings, doing things like Venrock. But you didn’t have firms that raised capital and said, “We’re going to bet on new entrepreneurs.” So, if you look at great innovators of the digital revolution, Arthur Rock and his successors really came up with a new invention, which is raising capital funds. Getting people to have equity stakes in new ventures that hadn’t yet started up. Being angels for entrepreneurs.

  Rock helps Robert Noyce and some of the rebels at Fairchild Semiconductor start what becomes Intel. That led to the birth of a venture capital industry, which was one of the ingredients that made Silicon Valley the cradle of innovation more than even Boston or New York.

  It almost echoes Florence five hundred years earlier, when the Medici family and others invented new ways of doing double-entry bookkeeping, so they could do debit and credit financing. That provided funding that helped the Renaissance flourish.

  DR: What were the big advances in computers in the post–World War II period?

  WI: The biggest advance was the realization that computers should be personal. In World War II, there were these huge computers, like Colossus that Alan Turing worked on in Bletchley Park in England, or ENIAC at the University of Pennsylvania. After the war, companies like Sperry Rand and Digital Equipment Corporation thought that computers were going to be huge machines owned by corporations and the government and maybe universities.

  What happened in the early 1970s, because of a confluence of forces, is that a group of people—hobbyists and hackers and rebels, computing-power-to-the-people types—decides, “Let’s make the computer personal.” That mind-set coincides with Intel creating microprocessors that allow people like Steve Wozniak and Steve Jobs to say, “We can make our own personal computers.”

  What distinguished the United States in its digital revolution from other places is that great entrepreneurs snatched computing power from the big corporations and turned it into personal computers. The advent of personal computers enabled creativity, entrepreneurship, and innovation in garages and garrets and dorm rooms around America from then on.

  DR: You think if they didn’t have garages, Silicon Valley would never have gotten anywhere?

  WI: Larry Page and Sergey Brin knew the Wojcicki sisters, who had two things: they had a garage, and they had a friend who was in the venture capital business. The program Page and Brin created, called PageRank, eventually became Google.

  DR: What led to the development of transistors? What was their impact?

  WI: During World War II, most of the computers used vacuum tubes, which only you and I are old enough to remember. They were like lightbulbs, and they burned out, and you had to replace them, and they were hot, and they used a lot of electricity.

  Right as the war was ending, people who had been engaged in the war effort, who had worked at Bell Labs, came back. They were given the task of figuring out how to replace vacuum tubes so that the Bell system could amplify phone calls from coast to coast without having all these vacuum tubes that would burn out.

  What allows Bell Labs to invent the transistor is that it was a place that mixed everybody from theorists, to practical engineers who had come back from the war, to experimentalists, to pole climbers with grease under their fingernails who strung phone lines, to people who understood how to turn something into a business. So in December 1947, a theorist like William Shockley, who understood and could visualize how electrons danced on the surface of semiconducting materials such as silicon, could pair with an experimentalist like Walter Brattain, who could take a chip of silicon and germanium and a paper clip and solder it together and put it underwater and see if it all worked.

  The transistor allows the digital revolution to happen, just like the dynamo or the steam engine allows the Industrial Revolution to happen. The invention of the transistor is the key thing, because the transistor is simply a tiny on/off switch.

  The digital revolution is based on the theory that information can be encoded as zeroes and ones—in other words, it’s on and off—and that you can build circuits that can manipulate this information and say “Yes, no, if this do that,” based on zeroes and ones. To make that work you needed an on/off switch that was tiny, and that’s what the transistor was.
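
  (A minimal illustrative sketch, in Python, of the idea described above: a transistor modeled as a simple on/off switch, and switches composed into logic gates that manipulate zeroes and ones. The function names are invented for illustration and do not come from the interview.)

```python
# Toy model of a transistor as an on/off switch, and of logic gates built
# from such switches. Names are illustrative only.

def transistor(control: int) -> int:
    """Idealized switch: conducts (1) when the control signal is on, else 0."""
    return 1 if control else 0

def nand(a: int, b: int) -> int:
    # Two switches in series pull the output low only when both are on.
    return 0 if (transistor(a) and transistor(b)) else 1

# Every other gate can be composed from NAND: circuits that manipulate
# information built out of nothing but tiny on/off switches.
def not_gate(a: int) -> int:
    return nand(a, a)

def and_gate(a: int, b: int) -> int:
    return not_gate(nand(a, b))

def or_gate(a: int, b: int) -> int:
    return nand(not_gate(a), not_gate(b))

if __name__ == "__main__":
    # "Yes, no, if this do that": decisions expressed in zeroes and ones.
    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b}  AND={and_gate(a, b)}  OR={or_gate(a, b)}  NOT a={not_gate(a)}")
```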

  DR: What led to the development of semiconductors? How did that speed up technology development?

  WI: One of the distinguishing things about U.S. antitrust law and patent law is that it incents a big corporation, like the Bell system, to take a patent but license it out rather freely so that it won’t be accused of an antitrust violation. Everyone from Fairchild Semiconductor to Texas Instruments, which originally was an oil field company, decides to license the transistor.

  They try to figure out how to make the transistor better. At what becomes Intel and also at Texas Instruments, they realized that you could etch many components, including transistors, on a single chip of silicon. That becomes the microchip, which is the next great advance in semiconductors.

  Underlying all of that is the same simple on/off switch. Semiconducting materials such as silicon can be juiced up in ways to become on/off switches.

  DR: What is a microprocessor? Is that the same as a microchip?

  WI: No. A microchip is what you get when you take a lot of transistors, say, and etch them on a chip. But at a certain point in the 1970s, Intel figured out a way to take this chip and to put together all of the components you might need for a circuit—transistors and resistors and capacitors—and etch them all on the same chip.

  The subtle but huge breakthrough is that instead of just making this as a special-purpose chip, like for a calculator for a specific company, Intel made it so that those chips could be reprogrammed. You could take a chip that had all these components on it and program it to do whatever you want.

  That becomes a microprocessor, which is the kernel of a computer. Back in the early days of computers, these processing systems were huge. But after Intel invents the microprocessor, it becomes the heart of a computer on a chip that you can put in the palm of your hand.
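
  (Another illustrative sketch, again with invented names: the contrast between a special-purpose chip that does one fixed job and a microprocessor whose behavior is determined by whatever program you load into it.)

```python
# Toy contrast between a special-purpose chip and a reprogrammable
# microprocessor of the kind described above. Names are illustrative only.

def calculator_chip(x: int, y: int) -> int:
    """Special-purpose chip: the one operation it performs is fixed in the design."""
    return x + y

def microprocessor(program, registers):
    """General-purpose chip: the same circuitry runs whatever program is loaded."""
    for op, dst, a, b in program:
        if op == "ADD":
            registers[dst] = registers[a] + registers[b]
        elif op == "MUL":
            registers[dst] = registers[a] * registers[b]
    return registers

# The fixed-function chip can only ever add.
print(calculator_chip(2, 3))

# The identical "hardware" reprogrammed for two different jobs.
print(microprocessor([("ADD", "r2", "r0", "r1")], {"r0": 2, "r1": 3, "r2": 0}))
print(microprocessor([("MUL", "r2", "r0", "r1")], {"r0": 2, "r1": 3, "r2": 0}))
```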

  DR: When did the first minicomputers begin to replace the large-scale computers that IBM had developed?

  WI: In the early ’70s, after Intel creates the idea of a microprocessor, people like Ed Roberts, who ran a hobbyist company in Albuquerque, say, “I can use these types of microprocessors and make a kit so that hobbyists can build a computer.” That becomes the Altair, the first personal computer. It was just done for hobbyists and hackers. Didn’t have much use.

  As soon as Bill Gates saw that on the cover of Popular Electronics, he and his friend Paul Allen said, “We’re going to create software for this Altair.” In the meantime, at the Homebrew Computer Club up near Palo Alto, people like Steve Wozniak and Steve Jobs were hanging out. They said, “We can use this tiny microprocessor from Intel and we’ll build our own computer.” And they built the Apple I and then the Apple II.

  So hackers and hobbyists, as well as these hippie-like Whole Earth Catalog–reading people, ranging from Steve Jobs to Ed Roberts, who wanted to take computing power away from the big companies and give it to the people, and then the peace movement, the free speech movement, the power-to-the-people spirit—they all jell in places like the Homebrew Computer Club in the early ’70s.

  And the hackers and the hobbyists and the Homebrew types all start building their own computers. Out of that comes the Apple I, the Apple II, and eventually the Macintosh, but also many other computers.

  DR: You mentioned Bill Gates. What led to his company becoming the dominant software producer for computers? There were many other companies producing software in the early days of the so-called software revolution.

  WI: Bill Gates had a singular insight, one of the most important, innovative insights in the business of technology, which is that it was not going to be about the hardware, it was going to be about software. And that eventually, whether you were Dell or Sperry Rand or IBM, the hardware would be pretty interchangeable, but whoever made the operating system software would take the lead in innovation.

  Early on, when computers were big machines owned by grand corporations, it was boys with their toys. The men made the computers and then they hired women like Grace Hopper and the six women who programmed ENIAC, thinking that programming was just a clerical thing that women could do. But when the inventors of ENIAC eventually create the company UNIVAC, they’re smart enough to hire Grace Hopper and the women who did the programming. And they create things like COBOL.

  There was a struggle between who was going to be in control, the hardware manufacturers or the software writers. Then Bill Gates, with help from his father, who was a great lawyer, figured out a way to write and adapt an operating system for a personal computer, and then not sell that software to IBM but instead give them a nonexclusive license to it. The software company, which becomes known as Microsoft, becomes more powerful than the hardware companies such as IBM and Dell and DEC.

  DR: Who really invented the Internet? The French had a predecessor called the Minitel. Why did that not take off around the world?

  WI: With all due respect to Al Gore, the Internet has many inventors. The reason the French system didn’t catch on is that it was a centralized system. One of the rules of innovation in the digital revolution is to empower the fringes and decentralize and distribute authority.

 
