The Age of Radiance

by Craig Nelson


  On June 26, 1995, the Hahn backyard was declared so toxic it required an EPA Superfund cleanup squad. David’s story made him a legend and led to a book, The Radioactive Boy Scout. One of that book’s biggest fans was eleven-year-old Taylor Wilson, who read the entire thing to himself out loud. “Know what?” Taylor told his parents. “The things that kid was trying to do, I’m pretty sure I can actually do them.” Taylor then spent much of his allowance that year on a radioactive collectible—a Fiesta dinnerware set too hot to eat from—and began experimenting. The family worried they might be facing a suburban Chernobyl, like the Hahns. “The explosions in the backyard were getting to be a bit much,” Taylor’s half sister Ashlee remembered.

  Taylor then decided to try for the ultimate nuclear dream: Starlight on Earth. If Eisenhower’s Atoms for Peace helped transform the Hiroshima fission bomb into nuclear power, what would happen if you tried doing the same with the Teller-Ulam fusion of Mike?

  A sun lies inside the sun, the core 10 percent where all solar power originates. Every second, that core, with a force equal to 96 million thermonuclear bombs, transmutes 4 million tons of matter into 385 million million million million watts—the light and the heat that sustain life. In 1967, Hans Bethe won his Nobel for working this out mathematically with “Energy Production in Stars,” and for over fifty years, scientists have been trying to create starlight on earth, to engineer a controlled fusion nuclear reactor.
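
  A quick check with Einstein’s relation, taking “tons” here to mean metric tons, shows how those numbers hang together:

  $$E = mc^{2} \approx \left(4\times10^{9}\,\mathrm{kg}\right)\times\left(3\times10^{8}\,\mathrm{m/s}\right)^{2} \approx 3.6\times10^{26}\ \text{joules every second},$$

  in line with the quoted 385 million million million million (about $3.85\times10^{26}$) watts.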

  The teenaged Taylor Wilson was struck by the byzantine problem of radiation oncology. Medical isotopes used for both diagnosing and treating various cancers have to be short-lived, to kill the bad cells without inflicting too much damage on the good, but this makes their distribution time-sensitive and tremendously costly. Instead of shipping isotopes to patients by private jet, Taylor thought, what if a reactor could be made small enough and safe enough to produce isotopes right in the hospital? With the help of others legally old enough to drink beer, Taylor at the age of fourteen built a reactor that slammed atoms into one another in a shimmering 500-million-degree plasma (not solid, not liquid, and not gas, but an electrically charged, gaslike state that can be shaped by magnetic fields into filaments and beams, best known as the inner glow of neon bulbs). The commercial versions cost less than $100,000 and can be rolled right into the patient’s room.

  But this is one of the few happy endings in a science that has been promising results since 1955 and achieving as much as Ed Teller’s Strategic Defense Initiative. Since 1993, Lawrence Livermore has spent over $5 billion on the National Ignition Facility, a stadium-size laser designed to generate power from fusion. As this book was going to press, the NIF announced a breakthrough: it had finally created a fusion reaction that generated more power than it took to initiate. But it is still a long road from that step to a reaction that generates enough energy to sustain itself and become a practical source of fusion power. Scientists in this business like to joke that “fusion is always twenty years away,” since that’s what they’ve been saying for the industry’s entire lifetime. Before trying to ignite fusion with lasers, physicists all over the world tried goliath magnets. At their 1985 Geneva summit, Reagan and Gorbachev agreed to merge their fusion energy R&D programs with France’s and Japan’s to create the International Thermonuclear Experimental Reactor (ITER), which, thirty years later, is expected to take another $30 billion and twenty years to work. If it ever does.
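
  Fusion researchers, though not the author here, usually state that remaining goal as an energy gain factor, written below in its simplest form with the input counted as the energy actually delivered to the fuel:

  $$Q = \frac{E_{\text{fusion out}}}{E_{\text{delivered in}}}, \qquad Q > 1\ \text{(breakeven)}, \qquad Q \gg 1\ \text{(what a working power plant would need)}.$$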

  When we hear that a nuclear plant has collapsed in catastrophic meltdown, we can’t help but imagine a China Syndrome, with a whole population infested with tumors, a vast territory rendered into nuclear desert, and offspring afflicted with never-before-seen birth defects. Beyond the heroic martyrdoms of power plant and emergency response workers, though, what actually happens after a nuclear plant disaster is so minor compared to our mythic fantasies that it is almost impossible to understand.

  Created after the bombing of Nagasaki to study radiation’s long-term biological effects, the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) worked for twenty-five years with the Chernobyl Forum (a joint effort of such UN agencies as the International Atomic Energy Agency, the World Bank, the World Health Organization, and the governments of Ukraine, Belarus, and Russia) to catalog the Lenin Station’s aftereffects. They concluded that fifty-seven people died during the accident itself, including twenty-eight emergency workers, and that from 1986 to 2002, about 6,848 neighboring children were diagnosed with thyroid cancer from drinking the radioactive-iodine milk of tainted cows. This latter tragedy could easily have been prevented with a working public health service in a normal functioning state—Chernobyl fallout has been detected in dairy products as far away as Oak Ridge, Tennessee—but when it was discovered, the USSR was collapsing. Eighteen of those children have died.

  Beyond that, there is no clear medical evidence of Chernobyl’s adverse impact on human beings, either in cancer rates or mortality rates or nonmalignant disorders . . . and Chernobyl was an accident far worse than nearly anything that could happen anywhere else in the world, as the Soviets did not use a significant containment dome, and their atomic fire, raging for two weeks, covered almost the whole of Europe in a radioactive cloud equivalent to four hundred Hiroshimas. The worst nuclear disaster in human history, then, turned out to be far less catastrophic than such other industrial horrors as the August 8, 1975, Banqiao Dam failure in China, which killed 171,000, or the December 2, 1984, Union Carbide pesticide plant leak outside Bhopal, India, which killed 3,787 and injured 558,125. UNSCEAR’s report concluded, “There has been no persuasive evidence of any other health effect in the general population that can be attributed to radiation exposure,” and anyone living there now “need not live in fear of serious health consequences from the Chernobyl accident.”

  A number of people who have closely studied this tragedy are convinced that UNSCEAR and the Chernobyl Forum have grossly underestimated the effects of the disaster, but they have nowhere near the underlying research data to prove it, and considering the social chaos brought by the fall of the Soviet Union, it may be impossible to ever get it. In one example, physicist Bernard Cohen estimated, “The sum of exposures [from Chernobyl] to people all over the world will eventually, after about fifty years, reach 60 billion millirems, enough to cause about sixteen thousand deaths.” Even if this were true, every year in the United States, around sixteen thousand people die just from the air pollution of coal-burning power plants.
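
  Converted into modern units, and taking Cohen’s own figures at face value, his estimate amounts to:

  $$6\times10^{10}\ \text{person-millirem} = 6\times10^{5}\ \text{person-sieverts}, \qquad \frac{16{,}000\ \text{deaths}}{6\times10^{5}\ \text{person-Sv}} \approx 0.027\ \text{deaths per person-sievert}.$$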

  So what will become of the Japanese people dosed by Fukushima’s radioactive clouds? Beyond the UN’s Chernobyl efforts, concrete scientific data on the health of human beings exposed to radiation is thin. The most solid and extensive studies arose from the hibakusha, the two hundred thousand survivors of Hiroshima and Nagasaki, whose health has been followed for over sixty years by a joint US-Japan effort, the Radiation Effects Research Foundation.

  The expectations at the start of that study were the expectations you might have—that the hibakusha would be grossly overrun with tumors, that their genes would mutate, that their descendants might be deformed. Instead, the results were startling. The foundation’s Evan Douple said that they had predicted cancer would be widespread but, in fact, “the risk of cancer is quite low, lower than what the public might expect.” Ninety-eight of 120,000 in one study group had died of leukemia attributable to radiation, while 850 of 100,000 in another had died from solid tumors—and these were not people who had to evacuate from a nuclear plant exhausting meltdown waste; these were people attacked with an atomic bomb.

  Radiologist John Moulder analyzed the data from another control group: “Of those fifty thousand people, about five thousand of them developed cancer. Based on what we know of the rest of the Japanese population, you would have expected about forty-five hundred of them. So we have five thousand cancers over fifty years where we would expect forty-five hundred. So we assume that those extra five hundred cancers were induced by the radiation.” Five hundred excess cancers out of a population of fifty thousand means a rate of 1 percent.
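
  Moulder’s arithmetic, restated in a single line using the figures he quotes:

  $$5{,}000\ \text{observed} - 4{,}500\ \text{expected} = 500\ \text{excess cancers}, \qquad \frac{500}{50{,}000} = 1\ \text{percent over fifty years}.$$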

  Additionally, despite the remarkable findings with fruit flies in the 1950s, there was no increase in inherited mutations. Hibakusha children, grandchildren, and great-grandchildren have all turned out just fine. As of 2011—sixty-six years later—40 percent of the Hiroshima and Nagasaki survivors were still alive.

  For Fukushima, then, the consensus is a 1 percent increase in cancer for TEPCO employees who worked at the site, and an undetectable increase for Daiichi’s citizen neighbors. Statistically, the rates for death and injury for real estate agents and stockbrokers remain higher than those for atomic plant workers. Even Daiichi’s horrific ocean flushing of toxic runoff, as sorrowful as it was, was minor compared to the Cold War radioactive waste dumped in thousands of now-rusting drums between 1946 and 1994. The Soviet Union alone is responsible for at least 2.5 million curies of ocean dumping, while America since 1946 has thrown 47,500 fifty-five-gallon drums into the waters of the Farallon Islands off San Francisco. In one of the few studied incidents, some of these have burst open, contaminating local sponges with plutonium.

  If this is the truth of nuclear plant meltdowns, then why, for everyone outside Japan, did the drama of Fukushima Daiichi so completely overshadow the incomprehensible tragedy of 3.11?

  Back in 1945, the world was introduced to nuclear power through images and news reports from Hiroshima and Nagasaki, and no matter how distant the technology inside your local utility’s containment dome is from Los Alamos and Edward Teller’s blackboard, those resonant images live on in our memories. Physicist James Mahaffey: “As they say in nuclear engineering circles, if the first use of gasoline had been to make napalm, we’d all be driving electric cars now.” Nuclear’s invisible powers, mythic history, and scientific mysteries add up to inspire in the general public a belief in magic—black magic. Reactors are exotic and strange, the stuff of fantasy and science fiction, mainstays of popular culture—which radiologist Fred Mettler thinks has bequeathed the public its “radiation biology lessons.” When it comes to the science of radiance, Mettler said, “Children in the United States are inundated with all kinds of nonsense on television from the time they are six months old.” Today, especially in the United States, nuclear is synonymous with evil. From Meryl Streep, scrubbed raw and naked and then murdered in Silkwood, to supernaturally incompetent Homer working for the villainous owner of a nuclear power plant on The Simpsons, atomic power is ominous and ever threatening.

  It’s hard to know which would horrify the nuclear science laureates profiled in this book more: their degraded reputations, or that their hard-earned science had been transformed into a collection of myths.

  17

  Under the Thrall of a Two-Faced God

  THE history of the nuclear dot on the letter i in a newspaper story read by a passenger on the atomic bus seems to endlessly repeat the two-faced miracles of X-rays and radium. An epoch that began in a poor man’s version of Frankenstein’s lab trying to explain what voltage does in vacuum tubes is ending with the real-world civil and military Frankensteins of utility meltdowns and nuclear arsenals. As the age of radiance draws to its close, its history is one of paradox, ambiguity, absurdity, and blessings with menace on a global scale.

  When I went online to find analogies for “two-faced miracle,” all I got was a mutant kitten. Then I remembered Janus, the lord of beginnings and transitions, gates and doors, Pollyannas and Cassandras, the past and the future, of harvests, marriages, births, change, and of time, a god so important Rome’s eleventh month was named in his honor: January. Janus in his two-faced glory reveals the beauty and reality of all our mixed feelings about our up-and-down atomic legacy. To use one analogy unknown by Google, when great writers engineer complicated, engaging characters, their men and women in various ways are evocative matrices of weak and strong, brave and scared, sacred and profane, admirable and despicable—just as we all are in real life. Perhaps the same two-sided beautiful Janus model needs to inform our thinking when the subject is nuclear, similar to Niels Bohr’s nostrum about big truths: “You can recognize a small truth because its opposite is a falsehood. The opposite of a great truth is another truth.”

  At the dawn of the Atomic Age, physicists were elevated in the public mind to the role of secular priests, their study of subatomic particles appearing to lead them, simultaneously, to spiritual and moral truths. Notably, Albert Einstein and Marie Curie became heroes to millions around the world, role models for a new era. Along with many civilians, E. O. Lawrence, Enrico Fermi, and both generations of Curies believed that their scientific discoveries would inexorably lead to benefits for all humankind. Instead, that sweet hope, along with their current reputations, has been battered by a history of thermonuclear-winter terrors and run-amok power plants. Following the dropping of the atomic bombs on Hiroshima and Nagasaki, the great pacifist Albert Einstein, who had nothing to do with fission beyond his letters to Roosevelt, was depicted on the cover of Time magazine against a mushroom cloud and his most beloved equation, E = mc², while Mme. Curie is today known for her breakthroughs as a woman, not as a scientist. The public’s distaste has grown so pronounced that European physicists created a PR organization, Public Awareness of Nuclear Science, to fix their image problem. Today’s prejudice against all things atomic is as naive as was the 1920s radium euphoria and the 1950s techno-utopians predicting nuclear-derived electricity as “too cheap to meter.”

  In the case of nuclear power, most of the world has come to a decision. What Three Mile Island, Chernobyl, and Fukushima all have in common is that the public relations disaster was far worse than the pollution’s health effects. In each incident and each nation postdisaster, one common casualty was the truth, at the hands of the nuclear industry, the local and national governments, and even the antinuclear activists. A founder of Physicians for Social Responsibility, Helen Caldicott, wrote in the New York Times that in the aftermath of Chernobyl “almost one million people have already perished from cancer and other diseases. The high doses of radiation caused so many miscarriages that we will never know the number of genetically damaged fetuses that did not come to term. (And both Belarus and Ukraine have group homes full of deformed children.)” Based on questionable and roundly criticized papers coming out of Eastern Europe, her assertions seem as threat-inflating as anything Teller told Reagan.

  Global Fission author Jim Falk: “People have come not only to distrust the safety of the technology but also the authority of those who have assured them so confidently that nuclear power is safe. In this sense people distrust the entire nuclear enterprise—not only its technology, but the public and private organizations, the political parties, and those often prestigious scientists who advocate and assist in the development of nuclear power.” With nuclear power calamities, we have learned through harrowing experience that you can’t trust the government, you can’t trust the industry, and you can’t trust the critics . . . or even your own fears. One psychiatrist, Robert DuPont, has spent years studying radiophobia: “On all four counts, nuclear power generates fear. It’s a cataclysmic accident that people are concerned about. It’s controlled by ‘them,’ the utilities or the government or the scientists or whoever it is that is perceived as the bad guys. It’s unfamiliar to most people, and most people feel they don’t really need nuclear power, that they can get their power from coal or oil or windmills or some other basis.”

  Physicist James Mahaffey: “Just the naked word radiation is enough to make us uncomfortable. . . . You can just be standing there, feeling nothing unusual, while being killed by it, never mind being actively hit with the meltdown or bomb. A major component of the paradox of nuclear power is that far more people die each year of radiation-induced disease from standing out in the sun than have ever died from the application of nuclear power.” Probably only one other word is as loaded with freight as nuclear: cancer.

  The atomic utility industry and its governmental affiliates have done such a poor job of both educating the general public and managing their crises that they will be driven out of business. Atomic utilities now require state-funded corporate welfare to build their plants, to insure them, and to nationalize them when disaster strikes. How many politicians can afford to bankroll reactors at every stage? How many parents want a burning radioactive pile anywhere near their young children? Unless some dramatic technological breakthroughs completely rework public opinion, not in your lifetime and probably not in your children’s lifetime but eventually, nuclear power will become so insignificant that it will be essentially meaningless. Certain countries, such as France and parts of the developing world, will continue with nuclear power, and we need some reactors to make radioisotopes. But otherwise, as politics and as business, nuclear power has stopped making sense.

  The same fate awaits nuclear arms, which like power plants won’t vanish entirely but will fade into insignificance. Consider the facts: The United States and the USSR spent $5.5 trillion on 125,000 nuclear weapons for the Cold War, with America alone throwing away an average of $35 billion a year. Yet having the Bomb did nothing to help Russia with its troubles in Afghanistan, or America with its nightmares in Korea and Vietnam . . . or for that matter, Britain with Suez, Israel with the Arabs, or France with Algeria. A Soviet general said after the fall of the USSR, “Hundreds of billions were spent to counterbalance the mutual fear of a sudden nuclear strike when—as we now know—neither side ever conceived of such a strategy because it knew what horrors it would visit on both.” Nearly every politician in charge of the codes—every president and premier since Eisenhower and Khrushchev—has admitted after leaving office that only a sociopath or a madman would launch atomic arms, even in retaliation for a nuclear first strike. The men who had their fingers on the nuclear button for all those decades never even came close to using what their $5.5 trillion had bought.

 
