Decades later, after the lithium-ion battery had put GPS-enabled, satellite-linked computers in every middle-class pocket, had begun to make the long-standing dream of an electric car a reality, and had been widely cited as the key enabler of a clean-energy future, Bob Hamlen told the story of his time at Exxon with an audible tinge of regret. “Now there was one thing,” Hamlen said. “I talked to Stan [Whittingham]—we looked back and we think to ourselves: Why on earth did we never mix the lithium with carbon? Which, as you know, is what made the current lithium-ion batteries feasible. Well, that’s easier said than done because it takes particular kinds of carbon, and it might not have been so easy to identify without some past experience …
“I have my own personal saying, which is that quantum jumps are only in the eyes of the uninvolved. If you look carefully at every advance that’s ever been made that gets reported as a big leap, you find that people involved have been doing this for a long time. I know a lot about it.”
3
THE WIRELESS REVOLUTION
After the failure of Exxon’s battery venture, only an entirely new approach to the rechargeable lithium puzzle could revive the technology. It would come out of the laboratory of John Bannister Goodenough, a physicist turned solid-state chemist who in the two decades after the demise of Whittingham’s battery would help invent the three major strains of lithium-ion battery in use today, including the one that made the gadget boom of the 1990s and 2000s possible.
Goodenough is now a professor at the University of Texas in Austin. Eighty-eight years old and, technically, long retired, he still conducts research and tends to a flock of graduate students. He is tall, but these days his height is mitigated by a hunch; when he walks, he relies on a four-footed metal cane. Regardless, he has a commanding presence, with feathery, owllike eyebrows and a booming laugh that erupts unexpectedly and then lingers just long enough to make you uncomfortable. He has a grandfatherly demeanor and likes to offer unsolicited lessons on the Meaning of It All: “Don’t forget your roots,” he told me. And later: “Life goes by so quickly.”
Goodenough was born in Germany in 1922, to American parents. He spent the second eleven years of his life in a small town outside New Haven, Connecticut, where his father was an assistant professor of history at Yale. He recalls his childhood with a mixture of nostalgia and deep black melancholy. The nostalgia is for his dog Mack, the wisteria vine on the side of the house, the apple trees, the windmill in the backyard. Yet he writes of a “deep hurt that plagued my early years,” in his slim, self-published spiritual biography, Witness to Grace. His parents were college-town gadabouts. His closest friend as a kid was Mack. He struggled to learn to read, and so he finds it miraculous that at the age of twelve he was accepted to Groton, the prestigious Massachusetts boarding school.
At Groton he came into his own. “I was never homesick,” he wrote. “I was happy my parents did not come to visit to intrude on my new life.” After Groton he enrolled in Yale, with the goal of finishing as much college as possible before being called to war, which, in 1938, seemed inevitable. He took a freshman chemistry class to satisfy his science requirement and to keep medical school an option. He wrestled with the responsibility of a young man living in an insane world. “I began to understand that any meaning to a life is not the accolade of others, but the significance and permanence of what we serve,” he wrote.
What seemed most permanent to him was science, and he decided to begin with the fundamentals, studying philosophy of science and physics. As he approached graduation, his mathematics professor encouraged him to join the Army Air Corps as a meteorologist rather than signing up to be a war hero in the marines. Before long, Goodenough was dispatching aircraft across the Atlantic, first from a station in Newfoundland and then from the Azores. He made it through the war without major incident and remained stationed in the Azores until, in 1946, a telegram arrived instructing him to report back to Washington. Back in the States, someone had happened upon an unspent sum of government money and applied it to “reintegrating a few promising scholars to civilian life.” Thanks to the recommendation of one of his professors at Yale, Goodenough became one of twenty-one officers chosen to study either physics or mathematics at Northwestern University or the University of Chicago.
Before leaving for the war, Goodenough had spent time reading Alfred North Whitehead’s Science and the Modern World, and he made a decision: “If I were ever to come back from the war and if I were to have the opportunity to go back to graduate school, I should study physics.” He hadn’t thought about this in years, but the new opportunity seemed like an omen. He enrolled in the graduate physics program at the University of Chicago, where he learned quantum mechanics from Enrico Fermi. He wrote his dissertation on solid-state physics, measuring the “internal friction of iron wires doped with carbon or nitrogen.” Upon enrollment years earlier, he had ignored the registrar’s words of encouragement: “I don’t understand you veterans. Don’t you know that anyone who has ever done anything interesting in physics had already done it by the time he was your age?”
Goodenough would go on to do plenty of interesting things, just not, strictly speaking, in the field of physics. After completing his Ph.D., he took a job at MIT’s Lincoln Laboratory, the federally funded defense lab that was home to several early milestones in computing, including the invention of TX-0, the first fully transistor-driven computer. At Lincoln, Goodenough found himself involved in research on random-access memory; his group’s assignment was to develop a magnetic material that could store data for the gymnasium-size vacuum-tube computers of the era. He quickly found a niche as an interdisciplinary fixer, a liaison between engineers, chemists, and physicists, all of whom were working toward the same goal but none of whom spoke the same language. He developed an intuitive understanding of the interatomic terrain where chemistry, physics, and engineering collided.
Around the same time Goodenough arrived at Lincoln, two scientists at another famous lab a couple hundred miles down the eastern seaboard were inventing the cellular telephone. It was 1947. The Bell Laboratories researchers were named Douglas H. Ring and William Roe Young. Their idea was to spread a large number of low-power base stations—essentially, radio transmitters and receivers—over a large geographical area, creating “cells” of radio coverage. When a user moves from one cell to another, the signal is handed off from base station to base station, an innovation that would allow the system to reuse frequencies, thereby increasing the number of signals the system can carry.
Nothing much came of the idea until the 1970s, when additional decades of research at Bell Labs, along with developments such as the commercialization of microprocessors and the advent of computerized switching stations that could control the handing off of calls between cells, begat the earliest experimental workable mobile phones, which two people could use to carry on a normal conversation without pushing buttons and saying “over and out.”
As the wireless telephone became more feasible, a race began between Motorola, king of the two-way radio, and AT&T, which in those days controlled 80 percent of the American phone market—a rivalry that would lead directly to the birth of the handheld mobile phone. On April 3, 1973, a Motorola researcher named Martin Cooper claimed victory when he made the first handheld mobile phone call while walking around the streets of New York City. Using his two-and-a-half-pound phone, he rang his rival at Bell Labs to let him know who had won.
On April 3, 1973, John Goodenough was still at Lincoln Lab, and though he wasn’t thinking about cell phones, he was getting restless. He was looking for a new way to apply his fundamental work in solid-state science to projects of societal importance. And in 1973, the most urgent project for a guy with his training was looking for new sources of energy. “It was obvious already in 1970 that our dependence on foreign oil was making the country as vulnerable as the threat of ballistic missiles from Russia,” Goodenough writes in his memoir. The only alternative energy sources that made sense to him were hydropower, geothermal, solar, wind, and nuclear. Hydropower was already widespread; geothermal was too geographically limited; he wasn’t a nuclear scientist. Process of elimination brought him to solar power. But he soon realized that solar power had the same problem as electric cars—no good way to store electricity—and so he turned his attention to energy storage.
At Lincoln, he worked on photoelectrolysis, fuel cells, and sodium-sulfur batteries, which he had been introduced to some years before when the Department of Energy asked him to help assess the potential of the Ford Motor Company’s 1967 breakthrough. Then in 1974, frustrated with the growing bureaucracy involved in government-funded research, he decided to leave Lincoln. He flirted with an opportunity to found a solar research institute in Iran using a $7 million grant from the shah. Before he committed to the Iran project, however, he was invited to make a less drastic move. A letter arrived from Oxford University encouraging him to try for an open position as the head of the university’s inorganic chemistry lab. He thought he had almost no chance. He was a physicist, after all; he had taken only two chemistry courses in his life. Somehow, though, he got the job.
When he arrived at Oxford in 1976, he turned to the energy-storage research that would dominate the rest of his career. He investigated methanol-air fuel cells and electrolysis before he turned to the design of new materials for lithium batteries. He knew about the titanium-based battery that Stanley Whittingham had developed at Exxon. He was also aware of its limitations. For safety reasons that the Linden, New Jersey, fire department would have understood, he decided against using a metallic lithium anode. Whittingham’s group had solved the combusting battery problem by switching to aluminum anodes, but Goodenough felt it necessary to move to an entirely new battery design.
Goodenough knew that a rechargeable lithium battery based on oxides (rather than sulfides such as the one Whittingham used) would reach a higher voltage. Deciding which oxide to focus on was difficult, but in 1978 he got a clue. An undergraduate—“I can’t remember the name of the girl, and I feel badly about that,” he said—wrote a thesis on the structure of the LiMO2 oxide, where M is a variable standing for any number of transition metals. It reminded Goodenough of research he had done back in the 1950s on lithium nickel oxides. He wondered how much lithium could be removed from it before it started to crumble. A Japanese physicist named Koichi Mizushima happened to be visiting from the University of Tokyo; Goodenough paired Mizushima with his postdoc, Philip Wiseman, a chemist, and gave them an assignment: Make variations on this compound using chromium, cobalt, and nickel, then see how much lithium you can take out before it becomes unstable. See what kind of voltage each substance would deliver in a practical lithium battery.
Mizushima and Wiseman pulled half of the lithium out, and two of the candidates were stable: good. And the voltage? An excellent 4 volts, a leap beyond the practical 2.4 volts that Whittingham’s battery delivered.
One of the keys to Goodenough’s breakthrough was that he turned the accepted logic about batteries—that you had to build them fully charged—on its head. By building a battery that began its life in the discharged state, he could use compounds that were completely stable in ambient air (that is, that didn’t have to be built in a dry room or an argon chamber) and get a higher voltage than was possible using the standard method of building a battery full of juice from the beginning. Yet Goodenough couldn’t find a single battery company in England, Europe, or America that wanted to license his invention. Not for any good reason. “Because it was unorthodox,” he said.
This reversal of orthodoxy is not, in fact, a problem. Today it’s the way the billions of batteries that power nearly all the world’s wireless gadgets are built. Which is why Goodenough likes to conclude the story of the lithium-cobalt-oxide cathode, the innovation that was essential for getting the lithium-ion battery into the marketplace, with this: “And that started the wireless revolution.”
Between Martin Cooper’s first cell-phone call in 1973 and Goodenough’s publication on lithium cobalt oxide in 1980, the cellular telephone had gone nowhere. Part of the problem was that the FCC took about a decade to regulate the frequencies the companies would use to carry calls. The major phone companies, for their part, thought that hardly anyone even wanted mobile phones, and certainly not enough people to justify building an entirely new infrastructure.
But that was just in the United States. By the end of the 1970s and the first years of the 1980s, small cell-phone networks were going up in Japan, Bahrain, Saudi Arabia, and all of the Scandinavian countries. In August 1981, Mexico City got North America’s first network. On August 24, 1982, the U.S. telecom landscape changed dramatically when the Justice Department broke up AT&T and established a rule that each market had to have two carriers. One of those carriers would be the local landline phone company. The other could be just about any entrepreneur who won the lottery in which the FCC was raffling off licenses to chunks of spectrum.
Finally, on October 13, 1983, in Chicago, the newly created Baby Bell called Ameritech Mobile Communications launched the Advanced Mobile Phone Service (AMPS), the first commercial cellular network. From a car outside Soldier Field, Ameritech’s president, Robert Barnett, made the first call, to Alexander Graham Bell’s grandson in Berlin. The New York Times called AMPS the “nation’s first commercial cellular mobile radio service, a computerized telephone technology that supporters say will vastly expand the capacity, and improve the quality of, car telephone service.” Some cities already had car-phone networks, but those systems, called Improved Mobile Telephone Service, were crude relics based around a single high-power antenna that had access to only twelve channels. That meant that an antenna could support only twelve car-phone calls simultaneously. AMPS used the scheme that the Bell Labs researchers had envisioned three and a half decades earlier: an array of small base stations, low enough in power that they don’t interfere with one another, covers a series of geographical “cells”; as a user travels from cell to cell, the base stations, controlled by a computerized switching system, assign that moving call a new frequency as soon as it enters a new cell, with no perceptible disruption in the conversation. Good thing, because at approximately $3,000 for an Ameritech car phone, a $50-a-month service fee, and rates ranging from 24 to 40 cents a minute, these calls were too expensive to be interrupted.
In 1984, the first handheld mobile phone went on sale—the production version of Motorola’s shoe-box-size DynaTAC 8000X, which ten years after being first unveiled in prototype form could be yours for only $3,995. The FCC had approved the phone only the year before, the same year Motorola first deployed its DynaTAC cellular network in Baltimore and Washington, D.C. The DynaTAC eventually became an ironic cultural icon thanks to the 1987 movie Wall Street. Michael Douglas, strolling along the beach in the Hamptons, talking into a cell phone the size of a toaster? That was the Motorola DynaTAC, the first of its kind.
Just as those first enormous phones went on sale, batteries began to make news again, though Goodenough’s lithium-cobalt technology still hadn’t found a taker. The early 1980s gadget boom, led by Sony’s Walkman, drove a major spike in consumption of disposable alkaline batteries. The Walkman in particular was a power hog, because its batteries had to power a motor that turned the reels of cassette tapes. The Walkman’s popularity was a gift to the battery industry. “We are really an adjunct of the microelectronics revolution,” the president of Rayovac told Forbes in 1982.
The Walkman and the gadgets it inspired were not, however, good for Japanese landfills. In the early 1980s in gadget-obsessed Japan, pollution from batteries became an urgent, high-profile environmental issue. A New York Times piece from 1984 makes it sound as if Japan was drowning in mercury-laden disposable batteries: “Mercury, a toxic metal used in most batteries, is starting to seep into the soil around garbage dumps … The leakage has raised fears that Japan is slowly being contaminated by the dry cells that power its calculators, cameras, portable stereos and watches.”
Mercury contamination wasn’t the only problem the Japanese had been having with batteries—there was also the “swallowing issue.” In growing numbers, infants had been swallowing ever-shrinking batteries, and, as the Times put it, “a swallowed battery can burn holes in the intestines and cause inflammations.” The director of Japan’s Poison Control Center told the newspaper that they suspected that there were some “5,000 cases of battery ingestion in Japan each year.” Together, the two issues caused serious problems for the battery industry. “Usually, nobody’s interested in batteries, but suddenly we’ve become very famous,” a spokesperson for the Japan Battery and Appliance Industries Association said.
The Japanese quest for new, nontoxic battery technology, combined with Japanese mastery of the consumer-electronics industry, began a reorientation of the battery industry, one that assured that when rechargeable batteries one day became vitally important—when all this was a matter of energy security rather than digital watches and fancier cell phones—Japanese companies would dominate. At the beginning of the 1980s, the American company Union Carbide, which owned Eveready, stood astride the world battery market. Globally, Panasonic was a powerhouse, but in the United States, Panasonic lagged behind the American players Duracell and Rayovac. Yet while Japanese companies spent the 1980s scrambling to develop the batteries of the future, the American companies largely stood still.
“At that time in the West there was really no interest in lithium batteries,” said Peter Bruce, who worked as a postdoc in Goodenough’s Oxford lab during the 1980s. The hot topic in solid-state chemistry in the 1980s was high-temperature superconductivity. With the energy crisis of the 1970s in the past, batteries were boring, and superconductivity—which was being portrayed in IBM television commercials of those years by a scientist in a lab coat levitating a piece of metal over a supercooled disc—seemed like magic. Both involved solid-state ionics, which made Bruce perfectly qualified to start a career in the sexiest science of the time. “That would have been the easy option, because there was a lot of funding around, and there was no funding in lithium batteries,” he said. But then he went to Japan for a month as a visiting professor, and while he was there he happened to visit a couple of electronics companies. “I was astonished to see their long-term planning and research to take lithium batteries from primary batteries—the kind you just use once and throw away—and to make these things rechargeable.” The visits reinforced what he already suspected: that no matter what, energy storage would one day be important. “There are certain things you just know are going to matter,” he said. “Energy is going to matter to society. You’re going to need energy. You’re going to need to move around with energy. These kinds of fundamentals don’t go away.”