Bottled Lightning

by Seth Fletcher

  By the middle of the nineteenth century, the battery found use outside the lab, primarily as a power source for the telegraph. As the battery steadily improved, its uses grew. In 1859, the French physicist Gaston Planté achieved a major breakthrough: the first practical rechargeable battery, a primitive version of the lead-acid cells we still use to start our gas-powered cars. In 1881, the French chemical engineer Camille Alphonse Faure came up with a practical method for manufacturing lead-acid batteries. Soon a shady bunch of European patent scavengers and stock manipulators were trying to get rich on Faure’s invention, inflating the small-scale equivalent of a nineteenth-century dot-com bubble, and temporarily giving the battery business a bad reputation. But that didn’t stop the spread of the new technology. By the beginning of the twentieth century, lead-acid batteries were widely used to power telegraphs, manage the electrical load in electrical-lighting substations, and support electrical streetcar networks. By then, many of them were also driving cars.

  At the beginning of the automobile age, cars powered by gasoline, electricity, and steam all shared the road, and none was an obvious winner. Actually, electric cars had a strong early advantage. They were clean, quiet, and civilized. Gas-powered cars were unreliable, complicated, loud, and dirty. They could be started only with a firm turn of the starting crank, and when the engine backfired, the kicking crank was extremely effective at breaking arms. When they weren’t breaking down or inflicting pain, however, gas-powered cars offered something that electric cars couldn’t—decent driving range, extendable within minutes with a tin of gasoline from the general store.

  Thomas Edison loved the idea of the electric car. Electric cars were a natural, stabilizing, money-generating appendage to the electrical network he had spent his career building. Widespread adoption of the electric car would help sustain his direct current (DC) standard, because charging a battery from an alternating-current (AC) network required an additional piece of equipment, an AC-DC converter. He knew that battery technology would determine whether electric cars would thrive or lose out to the rapidly improving gas-powered car, and he happened to be looking for a new conquest. He had already made, lost, and remade a fortune—already invented the stock ticker, the lightbulb, the phonograph, and the motion picture. He had just closed down a disastrous attempt at mining iron ore in western New Jersey. And so in 1898, he began studying the literature on battery research, the first step in a quest that would dominate the next eleven years of his life.

  The battery project was a departure for him. For years he had railed against “storage batteries,” as rechargeables were called. He saw them as catalysts for corruption, the tools of scam artists. Now he was committed to bringing the technology into a new, respectable age, and he was confident that he would succeed. “I don’t think Nature would be so unkind as to withhold the secret of a good storage battery, if a real earnest hunt were made for it,” he wrote to a friend. He had no idea what he was getting himself into.

  Edison’s goal was to create a new battery that would triple the capacity of the most advanced lead-acid batteries of his day. He wanted to surpass lead acid by ditching both the lead and the acid, finding new metals and electrolytes from which to build a battery that was not only more energetic but also longer-lived. Part of the reason for his choice of materials was his belief that an alkaline rather than acidic electrolyte would be necessary to build a lighter and longer-lived battery. But he was also competing against the market-leading Electric Storage Battery (ESB) Company of Philadelphia, which was owned by the New York tycoon William C. Whitney, and which controlled most of the patents on lead-acid batteries. Edison couldn’t chase them on their own well-established road. He would have to find a different approach.

  The romantic telling of this period of Edison’s life has the proudly anti-academic inventor scorning theory and, instead, systematically churning through every conceivably suitable substance—innumerable grades and forms of copper, iron, cadmium, cobalt, magnesium, nickel hydrate, along with any number of formulations of the electrolyte. As his biographer Matthew Josephson wrote, “The number of experiments mounted into the hundreds, then to the thousands; at over ten thousand, Edison said, ‘they turned the register back to zero and started over again.’ A year, eighteen months went by, and they had not even a clue.”

  In reality, he was not working blindly. He knew the literature. He was probably building on research conducted by scientists such as the Swedish chemist Waldemar Jungner, who had been doing his own pioneering work on alkaline batteries. Edison was also probably spying on his competition at ESB, which was racing to develop an improved lead-acid battery called the Exide.

  Because of the intensity of the competition with ESB, almost as soon as Edison chose a basic design for his battery he began promoting it. In 1902, he wrote an article for the North American Review reporting that his lab work had led him to “the final perfection of the storage battery,” a cell that used nickel and iron electrodes and a potassium-based electrolyte. He had his critics. In the magazine Outing, a writer named Ritchie G. Betts mocked Edison for promising “a featherweight and inexhaustible battery, or one which may, by the twist of a wrist or the pass of a hand, draw power, and be recharged from the skies or the atmosphere or whatnot, and lo! all problems are solved! The ideal automobile is at hand!” But the critical voices would be overwhelmed by a press infatuated with the myth of Edison, the Wizard.

  By 1903, Edison’s workers were dropping his nickel-iron batteries into cars and logging miles, and conducting primitive abuse testing by throwing batteries out of third-story windows of their Orange, New Jersey, lab. By the following year, they had pushed the battery to impressive new levels of capacity: 14 watt-hours per pound, 233 percent of the capacity of the lead-acid batteries of the day. It wasn’t quite the tripling Edison had set out to achieve, but it was close enough.

  Edison launched his Type E nickel-iron battery with a level of hype and overpromising that would do today’s most egregious vaporware vendors proud. It was a “revolutionary” new battery that would “last longer than four or five automobiles.” Predictably, Edison’s fans in the press were enthralled. The nickel-iron battery “revolutionized the world of power.” The “age of stored electricity” had arrived.

  The giddiness didn’t last long. Soon, the batteries began to leak. Many of them quickly lost as much as 30 percent of their capacity. And so Edison recalled the batteries he had trumpeted so loudly, went back to the lab, and set out to finish what he called his “damned problem.”

  Five years passed. Edison’s health deteriorated. It was, according to Josephson, a “prevailingly somber period.” It was a grim few years for the electric car as well. The gasoline engine was improving quickly: Ford launched its affordable, popular Model N in 1906, and Rolls-Royce released a six-cylinder gas car in 1907. The competition for Edison’s battery was growing tougher with each passing year.

  One of Edison’s employees solved the leakage problem with a rugged sealed container, but the performance still wasn’t what they had hoped for. Then, in 1908, they had a breakthrough. The following year, Edison wrote in a letter: “At last the battery is finished.” In July 1909, he released the second-generation A cell.

  This battery was a success. It was nearly indestructible and had a longer life span than its competitors, which made it particularly attractive to the owners of electric-truck fleets. Yet soon after the arrival of Edison’s A cell and ESB’s competing product, the Ironclad-Exide, Charles Kettering invented the automatic starter for gasoline engines, and that was effectively the end of the early electric passenger car. Before long, ESB began adapting its lead-acid Exides for the subordinate duty of turning over an internal combustion engine. Edison’s battery found work running lamps and signals in mines, trains, and ships. In World War I, it was used for telegraphy and in submarines. For the next several decades, as the gas-powered car became an emblem of the American dream and the electric car went into a long hibernation, Edison’s battery and its competitors moved into supporting roles for a petroleum-driven world.

  Back in 1908, two things rescued Edison’s battery. The first was the addition of nickel flake to the electrode. The second was lithium.

  In a patent application filed on May 10, 1907, Edison explained that adding two grams of lithium hydroxide to every 100 cc of electrolyte solution caused his battery’s capacity to spike by 10 percent and extended the amount of time the battery could hold a charge by a “remarkable” amount. Today we know that the lithium hydroxide most likely helped avert some detrimental, unintended chemical reactions that had been sapping away the battery’s strength. Edison, however, had no clue why it worked, and he probably didn’t care.

  Edison didn’t build anything resembling a true lithium battery. Lithium was the salt in his stew. But if nothing else, it was a poetic choice: a century later, after scientists have spent decades scouring the periodic table for better battery materials, we know that lithium is the best possible foundation for electrochemical energy storage. The universe hasn’t given us anything better.

  Lithium, which is now used for purposes as diverse as treating bipolar disorder and strengthening aircraft frames, is one of the three primordial elements, created during the first minutes after the big bang. The lithium atoms in our laptops and cell phones are among the oldest pieces of matter in the universe. Composed of three protons, three electrons, and (in its most common form) four neutrons, lithium is the third element on the periodic table, preceded only by hydrogen and helium. A metal, it is half the density of water and, in its elemental form, too reactive to exist in nature. Pure lithium is silvery-white and soft, like cold Camembert cheese, and must be stored in oil to prevent it from reacting with air or water.

  Like its heavier alkali-metal cousins sodium and potassium, lithium was first isolated in the early nineteenth century. In 1800, the Brazilian chemist José Bonifácio de Andrada e Silva, visiting a mine on the Swedish island of Utö, discovered crystalline minerals he named spodumene and petalite, both of which we now know are compounds of aluminum, silicon, and lithium. Seventeen years later, Johan August Arfwedson, a young Swedish chemist working in the lab of Jöns Jacob Berzelius, broke petalite down into a lithium salt, which earned him credit as the discoverer of the element. Berzelius anointed the new element, which Arfwedson was never able to isolate in its pure form, “lithium,” from the Greek lithos, for “stone.”

  By the mid-1800s, lithium salts were being used medicinally, first to treat gout and, later, all manner of illnesses. Lithium therapy became popular in the late nineteenth century because of the spread of the idea that illnesses ranging from gout to asthma to depression were caused by uric-acid imbalances, and that lithium, by dissolving uric acid, could help with them all. Soon lithium salts and lithiated beverages, products with brand names like Buffalo Lithia Springs Water, were being sold widely as curatives. A brewery in Wisconsin made Lithia Beer using spring water that was high in the element. The lithiated drink with the most lasting influence arrived in 1929, with the name Bib-Label Lithiated Lemon-Lime Soda. The Howdy Company of St. Louis marketed the soda, which contained lithium citrate, as a hangover cure. “It takes the ouch out of grouch,” went an early slogan. Before long the company founder changed the drink’s name to 7-Up Lithiated Lemon-Lime, and today we know its delithiated progeny as 7UP. (The latest ad campaign: “Ridiculously bubbly!”)

  Lithiated soda might have been dubious, but it was harmless. The next major medical application of lithium was far less benign. In the 1940s, some doctors began giving heart-disease patients lithium chloride as a substitute for their usual sodium-rich salt, and the result was a number of lithium overdoses, several deaths, and a wealth of data on how much lithium it takes to kill a human being. The timing was unfortunate. In 1949, the same year news of the lithium poisoning broke, the Australian psychiatrist John Cade reported dramatic results using safe doses of lithium salts to treat mania. Yet the toxic-overdose episode gave lithium such a bad reputation that the FDA wouldn’t approve lithium carbonate as a psychiatric medication until 1970.

  Lithium is now one of the most effective pharmaceuticals available for treating mental illness. Mood-stabilizing drugs such as Eskalith, Lithobid, Lithonate, and Lithotabs are indispensable for regulating bipolar disorder. Scientists still aren’t exactly sure how they work, but they do know that lithium affects neurotransmitters and cell signaling, and that it increases production of serotonin, the mood-elevating compound whose shortage is associated with depression. (Intriguingly, lithium also seems to stimulate brain-cell growth.) A study published in The British Journal of Psychiatry in 2009, which compared suicide rates and lithium levels in the drinking water of eighteen Japanese towns, found that “even very low levels of lithium in drinking water”—0.7 to 59 micrograms per liter, compared with the nearly 340 mg of elemental lithium delivered in the commonly prescribed 1,800 mg daily dose of pharmaceutical lithium carbonate—“may play a role in reducing suicide risk within the general population.” In an invited commentary published in the same issue, a Canadian psychiatrist suggested that lithium could one day be added to drinking water, just as fluoride is added to public water supplies to prevent dental disease. Right away, the theory that government eugenicists wanted to exercise mass mind control by lithiating the water supply spread across paranoiac websites.
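  For readers who want to check that dose conversion: lithium carbonate (Li2CO3) carries two lithium atoms per formula unit, so the elemental fraction follows from standard molar masses. The arithmetic below is mine, not the book’s:

\[
\frac{2 \times 6.94}{2 \times 6.94 + 12.01 + 3 \times 16.00} = \frac{13.88}{73.89} \approx 0.188,
\qquad
1800\ \text{mg} \times 0.188 \approx 338\ \text{mg of elemental lithium.}
\]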

  Despite the significance of lithium as a psychiatric tool, the pharmaceutical industry absorbs only a tiny fraction of the approximately 120,000 metric tons of lithium-bearing compounds that are mined, processed, and sold each year. The largest share goes into metal alloys, ceramics, and lubricating greases, along with various rarefied applications—devices that absorb excess carbon dioxide in the air aboard spacecraft and submarines, rocket propellant, and certain types of nuclear reactors. Because we’ve stopped building new weapons to replace the old ones, lithium no longer contributes to the manufacture of thermonuclear weapons. Isotopes of lithium did, however, trigger the largest thermonuclear device the United States ever detonated, the bomb that in the 1954 Castle Bravo test unleashed a blast twelve hundred times more powerful than the ones that hit Hiroshima and Nagasaki, and dusted a swath of inhabited South Pacific islands with radioactive fallout.

  Of all of lithium’s uses, however, the one with the most profound implications for the future—the application that has already affected the lives of billions of cell-phone-, laptop-, and iPod-using people, and the one that stands to change the way we drive and to transform the way we use energy—is in batteries.

  Think of electricity as a stream of electrons. The ideal tool for storing electricity squeezes the largest number of electrons into the smallest and lightest device possible. But you can’t just shove loose electrons in a can. To get an electron, you have to pry it loose from an atom. That means every electron you get out of a battery comes with baggage in the form of protons and neutrons, each of which is more than eighteen hundred times as massive as an electron. In the lead-acid 12-volt battery under the hood of your car, each usable electron comes tethered to a hefty lead atom—82 protons and 125 neutrons in the nucleus, for an atomic weight of 207.2. By contrast, each electron you snatch away from a lithium atom in your cell phone comes with a burden of only 3 protons and 4 neutrons; lithium has an atomic weight of 6.941, about one-thirtieth that of a lead atom.
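  A rough way to quantify that advantage is each metal’s theoretical specific capacity, the charge it can supply per unit mass. The formula and figures below are standard electrochemistry rather than anything from the book, assuming one electron exchanged per lithium atom and two per lead atom, as in their respective cells:

\[
Q = \frac{nF}{3600\,M}\ \text{mAh/g},
\qquad
Q_{\text{Li}} = \frac{1 \times 96485}{3600 \times 6.941} \approx 3861\ \text{mAh/g},
\qquad
Q_{\text{Pb}} = \frac{2 \times 96485}{3600 \times 207.2} \approx 259\ \text{mAh/g},
\]

where n is the number of electrons exchanged per atom, F is the Faraday constant (96,485 coulombs per mole), and M is the molar mass in grams per mole. Even crediting lead with two electrons per atom, lithium delivers roughly fifteen times as much charge per gram.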

  A lithium atom’s eagerness to shed its outer electron also means that it can be used as the basis for batteries that are more powerful and energy dense than those based on just about any other element. In essence, a battery is a high-energy chemical reaction that has been hijacked into providing useful results rather than a burst of flames. Lithium, recall, is too reactive to exist in nature in its pure form; combine the active ingredients of a lithium-ion battery’s two electrodes and, under the right conditions, you have an excellent high explosive. A battery, however, frustrates these violent tendencies. By putting an electrolyte bridge between those two electrodes, a battery keeps those bomb parts at a safe distance from each other, placing an explosion in suspended animation, creating a chemical system throbbing with energy that can be redirected and exploited.
  This system, used correctly, can help plug a gaping hole in our technological ecosystem—our pathetically primitive ability to store energy. As Bill Gates put it in a 2010 speech, all the batteries in the world can together store only ten minutes of our global electrical needs. In an era of grave concern about the future of energy, this is a fairly obscene weakness.

  Today we power our cars almost exclusively by burning the fossilized remains of prehistoric plankton, transforming the energy that holds those hydrocarbon molecules together into energy that moves us around town. And oil has many advantages: it’s powerful, versatile, and easy to store—we can simply put it in a barrel or a gas tank and let it sit. Yet oil’s many consequences (environmental degradation, greenhouse-gas emissions, the enrichment of dictators and sworn enemies of civilization), combined with the fact that we will eventually run out of affordable sources, make finding alternatives an obvious imperative.

  Of the alternatives, electricity is the cleanest and most flexible option. It’s piped into every home in the country. Mile for mile, it’s cheap compared with gasoline. It’s far more feasible than hydrogen, and in almost all circumstances it’s cleaner than ethanol. It can come from almost any source—natural gas, coal, nuclear, hydroelectric, solar, wind. Even when the electricity is generated by a coal-burning power plant, a mile driven on it still produces less carbon dioxide than a mile powered by gasoline.
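  A back-of-the-envelope check of that last claim, using round figures that are my assumptions rather than the book’s (roughly 1 kg of CO2 emitted per kWh of coal-fired generation, about 0.3 kWh consumed per electric mile, about 8.9 kg of CO2 per gallon of gasoline burned, and a 25-mpg car):

\[
\text{coal-powered electric mile: } 0.3\ \tfrac{\text{kWh}}{\text{mi}} \times 1\ \tfrac{\text{kg CO}_2}{\text{kWh}} \approx 0.30\ \tfrac{\text{kg CO}_2}{\text{mi}},
\qquad
\text{gasoline mile: } \frac{8.9\ \text{kg CO}_2/\text{gal}}{25\ \text{mi/gal}} \approx 0.36\ \tfrac{\text{kg CO}_2}{\text{mi}}.
\]

The margin is thin on pure coal; on a typical grid mix, which is far less carbon-intensive, the electric mile’s advantage widens considerably.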

  The problem is, electricity is hard to store, and that’s why the lithium-ion battery has attracted so much attention. It has already proved itself to be a powerful driver of modernity. Largely because of the arrival of the lithium-ion battery in the early 1990s, the cellular telephone first became ubiquitous and then transformed into a pocketable computer. Then it became a computer that connects wirelessly to the Internet. Then it became a computer, camera, MP3 player, GPS navigator, movie player, and all-around life planner and time waster, extending the reach of the information revolution into our pockets.

 
