In early twentieth-century America, standards for medical products were far looser than they are today. It was almost inevitable that some would capitalize on the euphoria over the fascinating new technology and peddle it as health-promoting. The idea, however well-intentioned, was beset by exaggerated claims. Probably the most common of these new-fangled ventures was the effort to promote radium as an elixir for all health problems and even a booster of good health. The chemical appeared in tonics, hair restorers, toothpastes, and even “Radium Water.” For a while, plenty of people – including the sick, who were looking to recover their health – fell for these false and dangerous claims. They used radium to improve their health, only to become sicker and hasten their own deaths. The health toll from the use of these products will never be fully known.
Industries other than medicine also quickly began using X-rays for various purposes. One of the more common such uses was in shoe stores, which employed a device known as a fluoroscope – one that exposed the body to a continuous stream of radiation rather than a single-shot X-ray. Fluoroscopes, which were also used by doctors and hospitals, were installed in thousands of US shoe stores in the 1930s, 1940s, and 1950s to help the shoe clerk spot any unusual malformations of the foot. People, often children, were exposed to X-rays in these machines for an average of twenty seconds, but sometimes much longer. Doses were relatively high, especially for children, but early advertisements played up the benefits of the “Shoe Fitting Fluoroscope”:
Guard their foot health carefully through carefully fitted shoes. To help insure better fit, leading shoe stores use the ADRIAN X-ray Machine. Whether the shoe clerk is an “old-timer” with twenty or more years of fitting experience or a “Saturday extra” who has been on the job only a few weeks, ADRIAN X-ray Machines help him give your child the most accurate fitting possible.
After decades of use, reports and studies about the hazards of fluoroscopes started to pile up. Medical groups began to set standards for the use of shoe-fitting fluoroscopes, and states subsequently began to ban them, particularly since they provided little benefit – foot malformations could be detected without these machines. The practice had essentially ended in the US by 1960.
Little attention was paid to the health hazards of radiation in industrial uses in the early twentieth century, but anecdotal evidence shows that there were indeed casualties. Probably the best-known and most blatant of these instances was that of the workers who painted watch dials with radium for the US Radium Corporation in New Jersey during the 1920s. Workers routinely licked the tips of their radium-laden brushes to keep them pointed and make their work more precise. Both employers and employees assumed that the amount of radium on the brushes was so small that it could not possibly cause harm to humans.
As time went on, dial painters began suffering and dying from a variety of illnesses involving the teeth, mouth, jaw, and other organs. The company contended that the health problems were a matter of substandard dental hygiene, refusing to admit that radium ingestion played any role. Some of the employees and their estates sued the company, but lost. They were set to appeal to the US Supreme Court when the company settled the suits to avoid admitting any wrongdoing.
Continued scientific experimentation with radiation led to the Manhattan Project, an all-out effort sponsored by the US government to develop the first atomic bomb. In December 1942, a team led by Enrico Fermi succeeded in creating a self-sustaining “chain reaction,” in which neutrons released by splitting uranium atoms go on to split still more; harnessed in a weapon, such a reaction could conceivably yield destructive power far beyond anything in human history. The perceived threat that scientists in Nazi Germany were racing toward the same weapon gave the project an urgency that resulted in a successfully tested bomb in the desert of New Mexico less than three years after the project began. Almost immediately after the initial test, bombs were used on civilian targets in the Japanese cities of Hiroshima and Nagasaki, resulting in enormous casualties, estimated at about 210,000 excess deaths within four months and about 350,000 excess deaths within five years.
The initial atomic bombs were based on a process known as nuclear fission, which was only possible after assembling enough uranium-235 atoms to produce a large-scale weapon. U-235 makes up only a small proportion of natural uranium (most is U-238); after the ore was mined, an extensive process of milling, conversion, enrichment, and fabrication, explained later in this chapter, was needed to create adequate amounts of the U-235 so critical to nuclear weapons.
Fission involves the splitting of U-235 atoms by bombarding them with neutrons (some atomic bombs split plutonium-239 atoms instead). As uranium atoms split, the neutrons they release strike other U-235 atoms, sustaining a chain reaction that generates extremely high heat. Breaking U-235 atoms apart also creates several hundred new chemicals, known as fission and activation products. They are not found in nature, but are formed by the rearrangement of protons, neutrons, and electrons from the original U-235 atoms. Some of these chemicals have become well known during the atomic era of the past sixty-five years, including iodine-131, cesium-137, and strontium-90.
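To get a rough feel for why a chain reaction escalates so quickly, consider a toy calculation (a sketch added here for illustration only, not a physics model from this book; the figure of 2.5 neutrons per fission is an assumed round number): if each fission releases a few neutrons and each of those triggers another fission, the number of fissions grows geometrically with each generation.

```python
# Toy illustration of an uncontrolled chain reaction: each fission releases
# a handful of neutrons, and each of those can split another U-235 atom.
# NEUTRONS_PER_FISSION is an assumed, illustrative value, not a figure
# taken from the text.

NEUTRONS_PER_FISSION = 2.5

fissions = 1.0
for generation in range(1, 11):
    fissions *= NEUTRONS_PER_FISSION
    print(f"generation {generation:2d}: roughly {fissions:,.0f} fissions")

# After 10 generations there are already roughly 9,500 fissions; after a few
# dozen more the count is astronomical, which is why so much heat is released
# almost instantaneously.
```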
The horrors of Hiroshima and Nagasaki are well documented. Many people closest to the blasts were literally vaporized by the bombs. But a considerable portion of the damage to the Japanese victims was a direct result of exposure to the extremely high heat and the fires it caused. The many fission and activation products created by the blasts also entered the environments of the two cities, but played a secondary role. However, because no atomic weapons have been used on human targets since 1945 (only tested in remote locations), the greater actual danger since then has been posed by the hundreds of fission and activation products formed when atomic weapons are exploded.
Fission products are radioactive forms of non-radioactive elements found in nature. For example, iodine is one of the elements listed in the periodic table in any basic chemistry book. Iodine-127 is a natural form of the chemical that is stable and non-radioactive. The element is key to the healthy development of the thyroid gland, a small butterfly-shaped organ wrapped around the windpipe. The thyroid is a kind of command center for physical and mental development, especially in infancy and childhood; hormones produced in the gland are critical to healthy development. Since the 1980s, every American newborn has been screened for levels of thyroxine, one of the crucial hormones produced in the thyroid gland. Any baby with low levels of the hormone is immediately placed on an artificial dose of thyroxine, to prevent conditions including dwarfism and mental retardation.
There are thirty-seven radioactive forms of iodine, including a number that are formed when U-235 atoms are split, whether in an exploding atomic bomb or an operating nuclear reactor. One of these is I-131, which has four more neutrons than the stable I-127. I-131 particles enter air, water, and, most importantly, the food chain. One common way Americans ingested I-131 from aboveground atomic bomb tests was by drinking milk contaminated when cows grazed on grass where the chemical had settled. Once in the body, I-131 particles quickly make their way through the stomach to the bloodstream and seek out the thyroid gland. There they attack tissue by emitting harmful beta particles, destroying and injuring healthy cells and reducing levels of important thyroid hormones. Exposure to I-131 and other radioactive forms of iodine has been linked in medical studies with higher risk of thyroid cancer, benign thyroid growths, hyperthyroidism (excess hormone levels), and hypothyroidism (lack of adequate hormone levels).
Ironically, I-131 is so effective at killing thyroid cells that it is used as a treatment for thyroid cancer and hyperthyroidism. The chemical has a half life of 8.05 days, which means it decays almost entirely within a few months. I-131 became part of the American vernacular in the 1950s, when fallout from aboveground atomic bomb tests in Nevada swept across the continental US and contaminated the food chain nationwide. Government officials found very high levels of the chemical in Utah milk, although no orders were ever given to destroy the milk. I-131 was also the chemical released in the greatest amounts by the 1986 meltdown at the Chernobyl nuclear plant in the former Soviet Union.
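To see what an 8.05-day half life implies in practice, a back-of-the-envelope calculation (an illustration added here, not taken from the text) shows the fraction of I-131 remaining after a given number of days, using the standard decay relation: the remaining fraction equals one half raised to the power of (elapsed days divided by 8.05).

```python
# Illustration of how quickly I-131 decays, using the 8.05-day half life
# cited above. The decay relation is: fraction = 0.5 ** (days / half_life).

HALF_LIFE_DAYS = 8.05  # physical half life of I-131

def fraction_remaining(days: float, half_life: float = HALF_LIFE_DAYS) -> float:
    """Fraction of the original I-131 still present after `days` days."""
    return 0.5 ** (days / half_life)

for days in (8.05, 30, 60, 90):
    print(f"after {days:5.2f} days: {fraction_remaining(days):.4%} remains")

# Roughly 7.6% remains after a month, about 0.6% after two months, and about
# 0.04% after three months, which is why I-131 effectively disappears within
# a few months of a release.
```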
Another element of particular interest in understanding man-made radiation effects is strontium. This element is a metal chemically similar to calcium, and thus its non-radioactive forms can help stimulate bone and tooth development. There are four stable, non-radioactive forms of strontium (Sr-84, Sr-86, Sr-87, and Sr-88) and nine radioactive forms, one of which is Sr-90. This isotope is created in nuclear weapons explosions and nuclear reactor operations, and is released into the environment in the form of particles. Sr-90 enters the human body through the food chain; over half of this intake comes from milk, with the rest from meat, vegetation, wheat products, and water.
Sr-90 particles behave like calcium once they enter the body. They quickly move from the stomach to the bloodstream and seek out bone and teeth. As all radioactive chemicals do, Sr-90 behaves like a “wild bull in a china shop,” firing off dangerous beta particles that damage and destroy healthy cells. The chemical can also penetrate the bone marrow at the center of bones, a critical part of the human anatomy: the white blood cells that form the “army” of the immune system, along with red blood cells, develop in the marrow. Thus, exposure to Sr-90 is a risk factor not just for bone cancer, but for leukemia and all other types of cancer. Its power has been harnessed by modern medicine as a treatment for bone cancer, due to its ability to quickly kill fast-multiplying cancerous cells in the bone.
Unlike I-131, Sr-90 decays slowly, with a physical half life of 28.7 years. The body rids itself of it somewhat more quickly (its biological half life), but the chemical still remains in the body for years. And unlike I-131, Sr-90 does not simply fade from the picture. Instead, it does what numerous radioactive isotopes do: it decays into a radioactive “daughter product,” in this case yttrium-90. Y-90 is also radioactive, and it seeks out the pituitary gland at the base of the brain. Because the pituitary gland is important for various brain functions, the health threat of Sr-90 is multiplied, even though the half life of Y-90 is just 2.7 days. A list of radioactive forms of strontium, with their half lives, follows; the half life of Sr-90 is 28.7 years, but all others are less than sixty-five days (some measured in minutes or hours):
Sources: US Nuclear Regulatory Commission, Radionuclides (10 CFR Part 20, Appendix B; list of isotopes); Holden NE, “Table of the Isotopes,” in Lide DR (ed.), CRC Handbook of Chemistry and Physics (85th Edition), CRC Press, 2004.
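The interplay between physical decay and biological elimination mentioned above can be made concrete with the effective half-life relation commonly used in health physics (the formula is standard, but the biological half life plugged in below is purely an assumed, illustrative value, since the text does not give one).

```python
# Effective half life combines radioactive decay with the body's own
# elimination of the chemical:
#     1 / T_effective = 1 / T_physical + 1 / T_biological
# The physical half life of Sr-90 (28.7 years) is from the text; the
# biological half life below is an assumed placeholder for illustration.

T_PHYSICAL_YEARS = 28.7
T_BIOLOGICAL_YEARS = 18.0  # illustrative assumption only

def effective_half_life(t_physical: float, t_biological: float) -> float:
    """Half life governing how fast the body burden of Sr-90 declines."""
    return 1.0 / (1.0 / t_physical + 1.0 / t_biological)

print(f"{effective_half_life(T_PHYSICAL_YEARS, T_BIOLOGICAL_YEARS):.1f} years")
# About 11 years with these numbers: retention measured in years,
# in sharp contrast to I-131's days-long half life.
```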
In the 1950s and 1960s, as aboveground atomic weapons tests continued, Sr-90 became a household word in the US. Scientists identified it as one of the deadliest of the several hundred fission and activation products. As far back as the early 1940s, the American team developing the first atomic bomb had considered a contingency plan in case the bomb could not be successfully exploded: dropping large quantities of Sr-90 from airplanes over German cities, so that it would infiltrate the water supply and food system, causing great harm to humans. A decade later, as Sr-90 levels in the food supply and in human bodies increased, so did concern. Many leaders, including the 1956 Democratic presidential candidate Adlai Stevenson, spoke out on the horrors of this exceptional poison:
This radioactive fall-out, as it is called, carries something that’s called strontium-90, which is the most dreadful poison in the world. For only one tablespoon equally shared by all the members of the human race could produce a dangerous level of radioactivity in the bones of every individual. In sufficient concentrations it can cause bone cancer and dangerously affect the reproductive processes.
The reference to reproductive processes is a key component in understanding radiation health risk. When radioactivity attacks a cell, it breaks the cell’s membrane and can enter the nucleus, where the DNA that dictates the human genetic code resides. Radioactivity is capable of breaking DNA strands; sometimes these breaks can be repaired by the body, but other times the damage is permanent. Damaged DNA can thus be transferred to future generations during reproduction, something not all pollutants are capable of. This type of damage was not just a theory, but a universally agreed-upon scientific principle. As far back as 1927, biologist and future Nobel Prize winner Hermann Muller published a pioneering paper documenting that X-ray exposure increased genetic mutations in fruit flies, not just in the irradiated flies but in the succeeding generation – a finding subsequently duplicated by other researchers. In 1947, soon after the atomic bomb was developed, British geneticist J.B.S. Haldane told a conference that the greater danger of atomic weapons was the genetic damage they passed down to future generations: “The killing of ten percent of humanity by an attack with atomic bombs might not destroy civilization. But the production of abnormalities in ten percent of the population by gene mutations induced by radioactivity may very easily destroy it.”
A final aspect of the threat posed by Sr-90 and other man-made radioactive chemicals is the much greater risk to the fetus and infant, compared to adults – a concept true of all pollutants. Nobel Peace Prize winner Dr. Albert Schweitzer made note of the vulnerability of young people, while singling out Sr-90 as an especially dangerous poison, in a 1957 broadcast “A Declaration of Conscience” that called for an end to aboveground atom bomb testing:
Strontium-90 is particularly dangerous and… present in large amounts in the radioactive dust… To the profound damage of these cells corresponds a profound damage to our descendants. It consists in stillbirths and in the births of babies with physical or mental defects.
From the earliest years of the twentieth century, scientific experts knew that radiation exposure carried health consequences. The more important question, however, was just how dangerous this relatively new technology was. In more specific terms, how many humans would be harmed or killed by exposure, and how did the risk vary with dose and with the kind of person exposed?
This question has proved difficult to answer, for several reasons.
It takes considerable time to plan, conduct, and review the epidemiological and medical studies needed to calculate doses and risks of radiation exposure. Animal studies were conducted beginning in the early years of the atomic era, but the animals were often fed high doses of radiation, or had it rubbed on their skin; the results at these doses are not always transferable to humans.
There was a built-in tendency to emphasize the positive aspects of radiation. The technology was revolutionary and exciting. Physicians could literally take pictures of the inside of the human body. Diseases could potentially be cured. Industry could do many things with radiation, such as making wristwatches glow in the dark. The ability of radiation to offer multiple benefits to society made it almost impossible to resist using it, even before a good understanding of its health risks was achieved.
Radiation takes multiple forms, and thus understanding its health risks is not a simple, one-dimensional concept. Radiation can include X-rays, natural radioactivity, and man-made fission products. Man-made radioactivity comprises hundreds of isotopes, making it difficult to understand relative contributions to disease. Types of exposure can vary; some are quick, single exposures such as X-rays, while others are protracted exposures, like fluoroscopy. Doses are often difficult to measure accurately. Some exposures are external (like X-rays) and some internal (ingested in food and water). Risks also vary according to characteristics of those exposed; for example, the fetus, infant, and child have a much greater sensitivity than an adult to the same dose.
Radiation effects may take years to be diagnosed after exposure, and thus it can take many years to achieve a true understanding of radiation risk, even when research is diligent and well organized. Some effects, such as child cancer risk after a fetus is irradiated, can be documented in the short term. But other effects may take years, even decades, to appear after exposure. Indeed, because radiation damages human DNA, the true effects may only be known when future generations are assessed by research.
Economic incentives also hindered the development of comprehensive research on, and understanding of, radiation risk. As mentioned, physicians and various industries made extensive use of the new technology; any evidence of resulting harm might bring product liability or other economic setbacks. In the case of medical uses of radiation, the expert researchers were the very users of the technology, creating a difficult situation in which the same group employing a technology must also monitor its safety – something humans find hard to do objectively.
The use of nuclear technology for military purposes erected a further barrier to a good understanding of radiation risk. Military strategists might be willing to be truthful about the atom bomb’s destructive power, but they were not likely to admit the risks of building or testing such a bomb, as doing so might slow progress toward the desired nuclear arsenal. In World War II and the Cold War, during which the bomb was born, national security was paramount, and health research was relegated to the back seat – or denied outright.
The body of knowledge about radiation health risks was improving by the mid-twentieth century. Professional organizations had begun to establish maximum exposure standards that, while not “safe,” could be viewed as manageable. As time passed, more technologically advanced machines tended to release less radiation. Some of the more outrageous practices, like peddling radium as a cure-all for health problems, had been discredited and had ceased. A number of professional research articles had appeared in the medical literature, documenting higher-than-expected morbidity and mortality rates for diseases such as cancer after radiation exposure.
But with the introduction of the atomic bomb, the already elusive goal of understanding radiation exposure’s effects became considerably more difficult. New questions arose. How many people actually died at Hiroshima and Nagasaki? What did they die of? How many would die in the future? How much radiation had they been exposed to? What was the exposure to workers who built the bomb, and persons living near these factories? How many of them would die in the future? What about fallout from atom bomb tests? How far did it travel? How much entered human bodies? What kind of risks would ensue in the future?