Then, of course, there were accidents. The most famous of these occurred on March 28, 1979 at the Three Mile Island reactor #2 in central Pennsylvania. More than half of the core melted down after a series of human errors that compounded mechanical limitations. The crippled reactor never restarted, and the accident cost Metropolitan Edison large amounts in legal fees. Reactor #1 at the plant, while not affected by the meltdown, was shut down and did not restart for nearly seven years.
High construction costs, frequent shutdowns, and accidents were first and foremost safety issues, but they were economic ones as well. Utilities were forced to take out loans much greater than originally expected, then pass along these costs to ratepayers over a long period of time. These large loans tied up money that companies would otherwise have used for other development projects, or for making greater operational profits.
Utility companies weren’t the only ones having trouble with the high costs and safety issues posed by reactors. Banks that loaned money for new reactor construction became concerned as well. They could easily see that reactors cost much more than originally projected, took years to begin producing revenue, and were often shut down for repairs. As a result, funds from Wall Street for new reactor construction began to dry up, and many reactors that had been ordered were cancelled by utility companies. The view that nuclear reactors were “lemons” became predominant. New orders faded in the 1970s, and stopped completely in 1978. (Actually, according to the Nuclear Regulatory Commission, the last formal order submitted to the federal government for a new US nuclear reactor that was not subsequently cancelled took place in 1973.) The nation never had more than 112 reactors operating at one time, and since 1998 the number has been stuck at 104 – a far cry from the 1972 prediction by President Richard Nixon’s Atomic Energy Commission that 1,200 reactors would be operating in the country at the dawn of the twenty-first century. Over half the US reactors that were ordered were either cancelled or halted before construction was completed (see table):
Source: US Nuclear Regulatory Commission (http://www.nrc.gov)
The reactors that did start up caused major safety problems of their own. US nuclear reactors have generated about 66,000 metric tons of high-level waste. This waste consists of slow-decaying radioactive fission products created in the course of producing electricity, and it requires long-term storage. As of early 2011, the US has no long-term plan for permanently storing the waste. The attempt to build a permanent underground repository at Yucca Mountain, Nevada, which drew considerable criticism from the time it was first proposed in 1982, was never completed, and funding was cut off by the Obama administration in 2010. Thus, all high-level waste remains in temporary storage at each nuclear plant for the foreseeable future.
Another problem caused by reactors was the emission of radioactive chemicals into local air and water. Some of these emissions occurred during meltdowns like Three Mile Island. But even without a meltdown, every reactor routinely emits a portion of the radioactive particles and gases it produces. Sometimes these releases are deliberate; for example, reactors must go through the “refueling” process about every twelve to eighteen months, and part of refueling includes the release of radioactive waste into local water.
The radioactive chemicals emitted by reactors enter human bodies through breathing and through the food chain, after entering local water supplies, vegetation, and animals. Government agencies have set “permissible” limits for emissions and for levels in air, water, and food, and utility companies have monitoring systems to ensure that they are in compliance with the law. Government goes one step further, assuming that legally sanctioned emissions are harmless – without conducting any scientific studies. But for the past half century, the assumption that “permissible” doses pose no harm to humans has been questioned, and the battle continues today over whether these routine releases have increased disease rates in people living near nuclear plants.
The problems caused by reactors extend beyond those in operation. Twenty-three power reactors have closed permanently, along with a number of smaller research reactors. But shutdown doesn’t mean the necessary work is over. Federal law requires a complex series of steps, known as “decommissioning,” to retire a nuclear plant’s parts and secure its waste, a process that takes many years and considerable money to complete. Decommissioning can be a dirty process, posing health threats to workers and the local population. As with new reactor construction, decommissioning operations have run well beyond expectations in both time and cost.
Creating nuclear power has also spawned problems apart from the operation of reactors. One is known as “reprocessing.” As high-level radioactive waste began to pile up, government supported the concept of essentially recycling the waste so it could be reused as fuel for power reactors. The Atomic Energy Commission footed the bill for building a reprocessing plant at West Valley, just south of Buffalo, NY. But reprocessing was a total flop, a much dirtier process than anyone had envisioned. In 1972, after just six years of operation, reprocessing ceased at the West Valley site, leaving behind a failed promise and a legacy of enormous contamination of the local environment.
Generating nuclear power is far from just a matter of what happens at reactors. Producing electricity at nuclear plants is the culmination of a series of steps, beginning with mining uranium, and then continuing with uranium milling, enrichment, and refining at specialty plants around the nation. Each of these processes is a dirty one, fraught with health risks to workers and local residents. These same steps were used in preparing uranium for nuclear weapons production, so adding nuclear power production increased the threat posed by these processes.
Finally, another concern posed by the development of nuclear power is linked to its research component. In the mid-1950s, ways to create nuclear power were still very much in the development phase. Scientists worked to refine nuclear power production methods at a series of research facilities across the US. Some of these research reactors were located at universities, while others were at plants operated by private companies hired by government officials eager to support efforts to improve nuclear energy generation.
After just a generation, the craze to develop nuclear power faded. With no orders to build new reactors and with the halt of nuclear weapons production at the end of the Cold War, interest in the field waned. Fewer college students majored in nuclear engineering. Many of the research reactors at universities and private companies shut down, as did some nuclear power reactors.
Nuclear research did not wane, however, before some major problems had occurred. One of the sites designated to help develop ways to produce nuclear power was the Santa Susana Field Laboratory just outside of Los Angeles. Operated by Rocketdyne, a company chiefly concerned with developing liquid rocket engines, Santa Susana was home to ten small nuclear reactors, including the Sodium Reactor Experiment.
Things went terribly wrong at Santa Susana. The experiments resulted in four meltdowns in just a few years, including one in July 1959 that may well have been larger than any other in US history, including Three Mile Island. All of the Santa Susana reactors had shut down by 1964, having failed in their task of finding an advanced new method of producing nuclear power.
The mention of the Santa Susana experience brings up a corollary issue with nuclear power, beyond those of safety and economics. The industry, although supported generously with public tax dollars and other financial incentives, operates in secrecy. The general public, and even members of Congress, know very little about what transpires at atomic plants, especially on matters of environmental pollution and safety. This culture may be a carryover from top-secret nuclear weapons operations and/or a matter of not wanting to air “dirty laundry” in public. If the dangers of atomic energy were revealed, the public might become alarmed and Congress might withdraw its generous support.
For decades, nuclear safety issues have been hidden or minimized by those in charge. Yet the secrecy and deception have actually fueled the struggle to understand the industry’s successes and failures. Over time, more people have become suspicious and challenged the rhetoric that nukes are “cheap and clean.” These challenges gained strength as the end of the Cold War reduced the need to support a growing nuclear weapons arsenal. Eventually, this wall of secrecy and deception began to give way – although it still has not completely disappeared – and nuclear power’s negative side has been more thoroughly revealed.
As the twenty-first century began, the growing concern over the threat of global warming gave pro-nuclear factions the chance to revive their product. Because reactors did not directly emit greenhouse gases to produce electricity, they were portrayed as “green” and beneficial to the environment. An attempt was made to order new reactors for the first time in a generation. Popular support was sought in a series of efforts, but many Americans were either against or skeptical about new reactors. Financial leaders had long memories, and Wall Street rebuffed all appeals for funding new reactor construction. Nuclear industry leaders went to Washington to ask for government’s help, bolstered by enormous lobbying and campaign contributions, but found the going very slow. After the disastrous meltdowns at Fukushima early in 2011 reminded the world of the dangers of the atom, the struggle to revive nuclear power became even more difficult.
Nuclear power in the United States has become a major industry. It has also provoked large-scale protests from concerned citizens. It has generated a lengthy and emotional debate among experts on just how safe (or unsafe) the technology is to current and future generations. It has raised much concern about financial viability among utilities that operate reactors; among Wall Street financiers; among public policy makers; and among American citizens paying utility bills. It has deeply involved government in what was originally intended to be essentially a private enterprise. Early in the twenty-first century, more than fifty years after Eisenhower’s speech and the resulting push to develop nuclear power for energy, nuclear power remains shrouded in controversy.
An examination of the American experience with nuclear power yields a fair assessment that the negatives greatly outweigh the benefits. Problems of reliability, economics, and safety refute the early prophecies that nuclear power would prove to be cheap and clean. This book will also examine the culture of secrecy and deception practiced by those involved with the US nuclear power program since its inception in the 1950s, a culture that continues today, and show how the failure of nuclear power became more obvious as the barrier of secrecy was gradually dismantled and the truth revealed.
Tiny Atoms, Huge Risks
Long before nuclear reactors and atomic bombs, radiation was an integral part of life on Earth. The planet contains considerable radiation that is not man-made, commonly referred to as background radiation, which can be categorized into three types:
Cosmic radiation: charged particles from outer space that exist in the Earth’s atmosphere and enter human bodies through breathing. Levels of cosmic radiation are greatest at high altitudes, which has raised concerns for airline pilots and flight attendants, who are exposed to more of these rays than most other humans.
Terrestrial radiation: found in soil, rocks, water, air, and vegetation. Much of this type of radiation is found in the atoms uranium-238 and potassium-40. They decay very slowly, which limits their harmful effects. Each is taken up by the human body through breathing and the food chain.
Internal radiation: found in the body’s tissues, taken in through breathing and the food chain. Most of this type of radiation consists of radioactive forms of the elements carbon and potassium.
Thus, every human being is exposed to natural radiation on an ongoing basis. One footnote to this list of radiation is the chemical radon-222, which is formed when radium-226 (a terrestrial metal found in rocks) decays. Radon-222 is a gas, and although it decays and disappears quickly, it exposes humans to perhaps as much radiation as the three categories of natural radiation combined. A common practice in the US is for homeowners to measure radon levels in their homes and the surrounding soil, and to take remedial action when levels are excessively high.
Natural radiation poses health risks, regardless of whether it is cosmic, terrestrial, or in the body. However, a distinction needs to be made between types of radiation known as ionizing and non-ionizing. In all atoms, protons and neutrons make up the nucleus, with electrons circling it. In a non-radioactive or stable atom, there is a balance (an equal number) of protons and electrons. For example, hydrogen has one proton and one electron. Even in a radioactive atom, the numbers of electrons and protons are normally the same: radium-226 has 138 neutrons, 88 protons, and 88 electrons. But when radium-226 decays, the two protons and two neutrons it loses are ejected as an alpha particle, which eventually becomes an ordinary helium atom, and the newly created radon-222 atom – 136 neutrons and 86 protons – is momentarily left with all 88 of the original electrons. Ionizing radiation is defined as radiation with enough power to knock an electron from its normal orbit around the atomic nucleus, upsetting this balance between protons and electrons. The imbalance is important in understanding the especially destructive properties of man-made ionizing radiation from atomic weapons tests and nuclear reactors, described later in this chapter.
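For readers who wish to check the bookkeeping, the decay just described can be written as a standard nuclear equation (the notation below is a textbook convention added here for illustration, not part of the original text). The mass numbers (superscripts) and proton numbers (subscripts) balance on both sides:

$$^{226}_{88}\mathrm{Ra} \;\longrightarrow\; ^{222}_{86}\mathrm{Rn} \;+\; ^{4}_{2}\mathrm{He}, \qquad 226 = 222 + 4, \qquad 88 = 86 + 2.$$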
While stable atoms remain in the earth’s environment indefinitely, radioactive atoms disappear over time. Radioactivity is not a measure of mass, but of the rate at which radioactive atoms decay. Each type of radioactive chemical, or isotope, decays at a different speed, measured by a concept known as “half life,” the amount of time needed for half of the chemical to decay. Radon-222 has a half life of about four days. In other words, 50% of a quantity of radon-222 will remain after four days; 25% after eight days; 12.5% after twelve days; and so on. A general consensus is that a radioactive chemical essentially disappears after about ten half lives. Some radioactive chemicals have half lives measured in seconds, while others have half lives of billions of years.
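The decay rule described above can be expressed in a few lines of code. The short Python sketch below is an editorial illustration rather than part of the original text; it assumes the four-day half life for radon-222 given in the paragraph above and computes the fraction of a sample remaining after a given number of days:

    # Fraction of a radioactive sample remaining after some elapsed time,
    # given its half life: remaining = 0.5 ** (elapsed / half_life)
    def fraction_remaining(elapsed_days, half_life_days):
        return 0.5 ** (elapsed_days / half_life_days)

    # Radon-222, half life of roughly four days (figure taken from the text above)
    for days in (4, 8, 12, 40):
        percent_left = fraction_remaining(days, 4.0) * 100
        print(days, "days:", round(percent_left, 2), "percent remaining")

Run as written, this prints 50, 25, and 12.5 percent remaining after 4, 8, and 12 days, and about 0.1 percent after 40 days (ten half lives), matching the rule of thumb that a radioactive chemical essentially disappears after about ten half lives.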
Radiation exposures can be categorized into five types. Alpha radiation consists of fast-traveling particles that can be easily stopped by various kinds of matter, including human skin, which stops airborne alpha radiation. Beta radiation also consists of fast-moving particles (even faster than alpha) that can penetrate through air and into skin and other human tissues. Neutron emissions are a powerful form of radiation created when the atomic nucleus is struck by a particle and a neutron is fired from the atom. X-rays are products of fast-moving electrons that can race completely through the body and create pictures of tissues and bones on film. Finally, gamma radiation consists of waves of energy, similar to X-rays, that can penetrate matter such as the human body. Humans can be harmed by exposure to all of these.
Humans first used natural radioactive products several hundred years ago, as miners in what is now Germany and the Czech Republic dug for uranium in the ground. Uranium was first mined in the western US in the late 1800s. Today, about three-fourths of the world’s mined uranium is obtained (in relatively equal proportions) from the former Soviet Union, Canada, and Australia, with the remaining one-fourth distributed among a number of nations, including the US.
The beginning of man-made radiation can be traced to November 8, 1895, with the discovery of X-rays. German scientist Wilhelm Roentgen was conducting experiments in his laboratory at the University of Wurzburg using cathode ray tubes. He observed a faint light on a bench well outside the tube, which came as a great surprise. Roentgen later noted: “I have discovered something interesting, but I do not know whether or not my observations are correct.”
Subsequent experiments by Roentgen found that he had hit upon a new phenomenon not visible to the human eye that could penetrate not just the tube cover, but all substances – with the exception of lead. The new ray did not resemble anything that had been identified previously, so the term “X-ray” was applied – a term that is used universally to this day.
Roentgen’s paper, submitted the following month to a journal, was noticed by other scientists. The following February, Paris physicist Henri Becquerel built on Roentgen’s discovery by showing that uranium gave off rays of its own that could penetrate matter. Quickly, the concept of radioactivity had been established. Two years later, Marie Curie announced that the elements radium and polonium also held the same radioactive properties as did uranium.
Speedily, scientists envisioned multiple uses for the new technology. The primary purpose was medical, as human bones and tissues could literally be photographed to help doctors diagnose patient ailments. The first medical X-ray in the US was taken at Dartmouth College by brothers Gilman and Edwin Frost, who captured an image of a man with a broken wrist, just two months after Roentgen’s discovery. Manufacturers developed machines that soon became more sophisticated, and were mass produced. By the 1920s, X-rays were used to diagnose ailments of the brain, digestive tract, kidney, and bladder, as well as bones. All hospitals and many physician offices featured the new machines, which truly revolutionized the practice of medicine. Attempts were also under way to use the new technology to treat disease, not just diagnose it.
Despite all the attention that the benefits of X-rays were getting, the medical profession in these early years took little note of their health risks. A number of anecdotes indicate that scientists knew there was a risk from the very beginning. Becquerel, after carrying a tube containing radium in his pocket for a short time, found that the skin under the pocket became raw and irritated – and returned to normal after he removed the tube. Salesmen and radiologists died from radiation-related conditions, which raised concern even though studies to understand the extent of the damage were slow in the making.