Strange Glow


by Timothy J Jorgensen


  NONE TOO SOON: WORKER PROTECTION ARRIVES

  The Schneeberg miners and the radium dial painters were the most spectacular early examples of many people being exposed to radiation in the workplace with disastrous consequences. Still, there had been a long chain of anecdotal reports preceding them: dermatitis of the hands (1896), eye irritation (1896), hair loss (1896), digestive symptoms (1897), blood vessel degeneration (1899), inhibition of bone growth (1903), sterilization in men (1903), bone marrow damage (1906), leukemia in five radiation workers (1911), and anemia in two x-ray workers (1912).37 Various cancers among medical radiologists would soon be added to this list.38

  The above reports, coupled with the findings related to the Schneeberg miners and the radium dial painters, underlined the need for radiation protection standards. The Roentgen Society, whose membership was primarily made up of medical x-ray workers, began pushing for safety standards as early as 1916.39 In 1921, the British X-ray and Radium Protection Committee endeavored to establish an occupational dose limit for radiation, which they ideally sought to define in terms of “a reproducible biological effect, expressed as much as possible as a measurable physical unit of radiation dose.”

  Considering the list of known biological effects of radiation, choosing a relevant endpoint was not easy. Even so, the health effect of most concern was cancer, and the medical community’s thoughts on the origins of cancer at that time were greatly influenced by the work of Rudolph Virchow (1821–1902), a physician and scientist who correctly postulated that cancerous cells originated from normal cells. This was a major advance in the understanding of cancer biology. Furthermore, Virchow had witnessed in his medical practice that nonmalignant lesions often turned into cancerous ones. Based on this observation, he had proposed that cancer arose from chronic irritation of normal tissue. And, in fact, there was much circumstantial evidence for this in the case of radiation. It was well known that the cancers radiation workers commonly suffered on their hands from exposure to x-ray beams typically were preceded by years of skin irritation (i.e., radiation dermatitis).

  Because irritation was thought to precede cancer, it followed that radiation doses insufficient to cause irritation would also be insufficient to cause cancer. Based on this principle, skin reddening (i.e., erythema) was adopted as a highly relevant biological endpoint for radiation protection purposes. If radiation exposures were kept low enough to prevent skin reddening, they should also be too low to cause cancers of the skin or any other tissues. Thus, skin erythema was chosen as the most relevant measurable and reproducible biological effect on which to base radiation protection standards.

  With the first part of their charge complete, the committee turned to the second issue. The biological effect needed to be “expressed as much as possible as a measurable physical unit of radiation dose.” This task seemed to be even more straightforward than the first. Since x-rays are a type of ionizing radiation, and the number of ionizations produced is directly dependent upon the amount of energy deposited by the radiation, it follows that one should be able to determine the dose simply by measuring the number of ionizations produced. The problem was simply how to measure the number of ionizations accurately.

  Here, too, that answer seemed straightforward. Since the ions produced are mostly electrons, they can produce an electric current. Measuring the electric current would thus be a good index of radiation dose. In fact, this method of measuring radiation dose was already under development. The device invented to perform the task was called a gas ionization chamber. When such a chamber of gas is placed in a radiation beam, the amount of gas ionization, measured as an increased flow of electrical current through the gas, reveals the radiation exposure level. And to honor the discoverer of x-rays, the unit for this radiation exposure measurement was named the roentgen.40
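
  To make the arithmetic behind such a measurement concrete, here is a minimal sketch (not from the book) of how a chamber reading converts to roentgens: exposure is simply the ionization charge collected per unit mass of air, using the modern definition of the roentgen as 2.58 x 10^-4 coulombs per kilogram of dry air. The chamber volume and charge values below are illustrative assumptions, not historical data.

    AIR_DENSITY_KG_PER_M3 = 1.293            # dry air at 0 deg C and 1 atm
    COULOMBS_PER_KG_PER_ROENTGEN = 2.58e-4   # modern definition of 1 roentgen

    def exposure_roentgens(collected_charge_coulombs, chamber_volume_m3):
        """Convert charge collected in an air-filled chamber to roentgens."""
        air_mass_kg = AIR_DENSITY_KG_PER_M3 * chamber_volume_m3
        return collected_charge_coulombs / air_mass_kg / COULOMBS_PER_KG_PER_ROENTGEN

    # Example: a one-liter chamber (0.001 cubic meters) collecting 6.7e-8 coulombs
    # of charge registers roughly 0.2 roentgen, the figure that became the 1931
    # daily occupational limit described below.
    print(exposure_roentgens(6.7e-8, 0.001))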

  With proper tools now in hand, scientists sought to determine the maximum tolerable dose (MTD)—that is, the permissible level of exposure—as expressed in roentgens.41 Studies were performed and data meticulously analyzed. Based largely on studies of people who had worked with radiation for many years without ill effects, a conclusion was finally drawn. In 1931, an MTD for x-rays was announced by the US Advisory Committee on X-Ray and Radium Protection: on any given day, workers were not to be exposed to more than 0.2 roentgen (i.e., one fifth of a roentgen unit).42 This dose limit, however, was based on a relatively small number of workers with rather uncertain exposure levels. It was a “best guess” estimate, but it would have to do until better data arrived.

  Unfortunately, the MTD based on skin erythema arrived too late to save Marie Curie’s group. Her laboratory had chosen another biological effect by which to limit their exposure levels, and their biological endpoint allowed much higher whole body doses. The effect they monitored was anemia, the suppression of circulating blood cells. Since 1921, everyone in her laboratory had been having regular blood work done to monitor their exposure levels. When workers’ blood counts dropped too low, they were pulled off the job for a while and sent out into the country to rest and recuperate until their blood counts returned to normal. Their blood counts typically did return to normal, and this reinforced the erroneous belief that radiation effects on health were always reversible.43

  The use of anemia to monitor worker radiation exposures had a tragic effect of double jeopardy. First, it allowed whole body doses much higher than would have been permissible based on the skin erythema MTD. In fact, the Curies and their staff were resigned to their constant skin dermatitis as the price that they had to pay to be in the forefront of science. Pierre even boasted: “I am happy with all my injuries. My wife is as pleased as I. You see, these are the little accidents of the laboratory. They shouldn’t frighten people who live their lives [purifying radioactivity].”44 Second, the Curies assumed that since their blood counts fully recovered with sufficient rest from work, so would all other health effects. They further supposed that there would be no long-term consequences from living a life with rollercoaster blood counts. In this they were sadly mistaken.

  The anemia threat came to a head for Marie Curie in 1925 when a terrible radiation accident killed two of her former students—Marcel Demenitroux and Maurice Demalander. The two engineers were constructing an irradiation machine for medical use in a small factory outside of Paris. They sustained high radiation doses from the radioactive thorium they were using in the machine. The two lingered in an anemic state for months and ultimately died within four days of each other. Remarkably, it wasn’t until their last days of life that the pair realized their coincidental illnesses were due to their high radiation exposures. The deaths shook Marie Curie, but she was in denial. Although she accepted that the radiation had killed the engineers, she attributed their deaths to the fact that they were living in quarters next to the laboratory because of their heavy workload and thus “didn’t have a chance to get out and take the air.”45 In fact, Curie found a whole host of explanations for illnesses among her workers that were, in retrospect, obviously related to radiation.46

  With time Curie developed her own illnesses, which she tried to keep secret. Ultimately, she admitted to herself that they were due to radiation. Increasingly frail and virtually blind from cataracts, she tried to continue working, but fell seriously ill in the summer of 1934. She traveled to an infirmary in the Savoy Alps to recuperate, but it was too late. Her doctor examined her blood work and diagnosed “anemia in its most extreme form.” Her fever diminished some, but her fate was already sealed. Soon, she was dead. Her doctor later pronounced that her bone marrow never rallied to restore her blood counts “probably because it had been injured by a long accumulation of radiations.”47

  Based on the experience of the radium dial workers, it had been generally assumed that Marie Curie, too, had been poisoned from her work with radium—the radioisotope that had won her a Nobel Prize and started the whole commercial radioactivity craze. Then, in 1995, her coffin was exhumed and transferred to the Pantheon, the national mausoleum of France. To be interred in the Pantheon is France’s highest burial honor, and the event was to be attended by many French dignitaries.48 There was some concern about the safety of the process, however, because Curie’s skeleton was suspected to contain radium. So the disinterment was carried out by the French Office of Protection Against Ionizing Radiation (OPRI).

  When her gravesite was opened, Curie was found to have been buried in a multilayered nested coffin. The inner coffin was made of wood, the next of lead, and the outer one again of wood. To estimate the radioactivity within, the coffins were punctured and an internal air sample taken to measure its radon content. Since radium decays to produce its progeny, radon, a radon measurement allows estimation of the parent radium level that produced it. Surprisingly, the radon levels of the coffin air suggested that the radium content of Curie’s body was not high enough to have been the cause of her death. Apparently, Marie Curie had been right about her radium exposure. She never ingested or breathed radium anywhere near the levels that the radium girls had. Instead, the facts suggested that Curie’s radiation ailments were likely the result of gamma and x-ray exposures coming from radiation sources outside of her body, rather than from any internal exposures due to ingestion of radioactivity, just as she herself had suspected.49
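
  The logic of that estimate can be sketched as a small calculation (an illustration of the reasoning, not the OPRI procedure): in a sealed volume, radon-222 reaches secular equilibrium with its radium-226 parent within a few weeks, because radon’s half-life is only about 3.8 days, so the measured radon activity approximates the radium activity; the radium mass then follows from its specific activity of roughly one curie (3.7 x 10^10 becquerels) per gram. The radon activity used in the example is an invented, illustrative number.

    RA226_SPECIFIC_ACTIVITY_BQ_PER_G = 3.7e10   # about 1 curie per gram of Ra-226

    def radium_mass_micrograms(equilibrium_radon_activity_bq):
        """Estimate the parent Ra-226 mass from the equilibrium Rn-222 activity."""
        radium_activity_bq = equilibrium_radon_activity_bq   # secular equilibrium
        grams = radium_activity_bq / RA226_SPECIFIC_ACTIVITY_BQ_PER_G
        return grams * 1e6

    # Example: 370 Bq of radon would imply only about 0.01 micrograms of radium,
    # far below the 10 microgram level discussed in the next section.
    print(radium_mass_micrograms(370))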

  WOMEN AND CHILDREN FIRST

  Despite the fact that a daily dose limit for x-rays based on skin reddening had already been set, by 1931 there was still no dose limit set for radium ingestion. Then, in June 1931, a preliminary report on the health of radium dial painters employed after January 1, 1927—the date that lip pointing had been banned—was presented by a team of US Public Health Service scientists at an American Medical Association (AMA) conference.50 The scientists presented all the exposure and health data that were collected from these radium dial painters who had never lip pointed. They made estimates of the amount of radium radioactivity in each worker’s body and then looked for a biological change that could be used as a relevant endpoint for health effects, similar to how skin reddening had been used as a basis for setting x-ray dose limits. Despite the cessation of lip pointing, the scientists found that workers continued to accumulate radium in their bodies, albeit at a much slower rate. This was presumably due to inhalation and ingestion of the radium dust with which workplace environments were still contaminated.51

  X-ray images of their jaw bones showed detectable changes in workers with as little as one microgram of radium in their systems; but no apparent adverse health effects were associated with these changes, and it was not clear whether dental procedures may have contributed to them. There were also slightly lower than average red blood cell and hemoglobin levels in the workers, but the differences were not statistically significant.

  No radiation injuries could be detected in any radium dial painter with fewer than 10 micrograms of internalized radium. Borrowing the concept of MTD from the x-ray protection community, some scientists proposed to define a maximum permissible body burden for internalized radium as 10 micrograms.52 Others thought it should be lower than 10, and some thought it should be zero. Consequently, the committee published its final report in 1933 without recommending a maximum body burden. Apparently, there was too much controversy within the committee on whether 10 micrograms or something lower should be the appropriate limit.

  Remarkably, it took the threat of another world war to jostle the radiation protection community enough to finally arrive at an MTD standard for radium ingestion. In 1941, on the eve of the United States’ entry into World War II, military preparedness required the production of large numbers of fluorescent dials for various pieces of military equipment. Frustrated with the lack of progress in setting a safety limit for radium, the US Navy insisted on having an MTD that they could use as a protection standard for radium dial production.53

  Lauriston Taylor (1902–2004), a scientist who would later become the preeminent advocate for the establishment of scientifically based radiation protection standards, assembled a committee of experts under the auspices of the National Bureau of Standards (now called the National Institute of Standards and Technology, or NIST) and gave it the charge of establishing an MTD for radium.

  Taylor’s interest in radiation protection was personal. While calibrating x-ray exposure meters in a laboratory at the National Bureau of Standards in 1928, he was involved in a radiation accident of his own doing. He was calibrating the meters using a high intensity x-ray beam. While swapping instruments in and out of the beam, he forgot to replace a protective lead shield. He didn’t notice that the shield was missing until the beam had been on for several minutes, and it wasn’t clear whether or not he had been exposed to a lethal radiation dose. It was later estimated that he received about half of a lethal dose, and he ended up having no long-term health consequences from his exposure.54 Nevertheless, Taylor was changed by the experience. He became a pioneer in the field of radiation protection, later founding the National Council on Radiation Protection and Measurements (NCRP), which exists to this day in Bethesda, Maryland, just a few miles away from NIST.55

  One of the radium committee experts, Robley D. Evans (1908–1996)—a scientist from the Massachusetts Institute of Technology (MIT) who was well experienced with measuring radium body burdens—suggested using a “wife or daughter” standard to arrive at an acceptable number.56 He asked his all-male committee to review all available data on the effects of ingested radium on dial painter workers and recommend an MTD they would feel comfortable letting their wife or daughter accumulate in her body. Starting with the 10 microgram limit that the earlier Public Health Service team had tossed around, the committee worked downward until 0.1 micrograms (i.e., 1/100th of the 10 microgram starting value) was unanimously agreed upon to be safe for people to ingest or breathe. In this way, an MTD standard for radium was born.
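
  As a rough check on what that number means physically, here is a minimal sketch (not the committee’s own arithmetic) converting an internalized radium mass to activity, using the fact that one gram of radium-226 has an activity of about one curie (3.7 x 10^10 becquerels): the 1941 MTD of 0.1 microgram thus corresponds to roughly 0.1 microcurie, or about 3,700 disintegrations per second, inside the body.

    BQ_PER_CI = 3.7e10
    CI_PER_GRAM_RA226 = 1.0   # approximate specific activity of Ra-226

    def body_burden_bq(radium_micrograms):
        """Activity (in becquerels) of a given internalized mass of Ra-226."""
        grams = radium_micrograms * 1e-6
        return grams * CI_PER_GRAM_RA226 * BQ_PER_CI

    print(body_burden_bq(10.0))   # ~370,000 Bq: the earlier 10 microgram proposal
    print(body_burden_bq(0.1))    # ~3,700 Bq: the 1941 MTD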

  Thus, as World War II approached, the radiation protection field was in full stride, and the quantitative means to provide safety standards for radiation exposures had been established. Limits for x-ray exposure in the workplace were defined, and an MTD for radium was in place.57 Progress! Or so it seemed.

  MOTHER OF INVENTION: RADIATION PROTECTION STANDARDS FOR THE MANHATTAN PROJECT

  Although the chosen approach for radiation protection was ostensibly logical and practical, it was not without its critics. As mentioned, the radiation MTD value was based on a relatively small number of workers with rather uncertain exposure levels. More worrisome, it relied on two major assumptions that had not been confirmed.

  The first assumption was that there actually existed a maximum “tolerable” dose (MTD) for radiation. In other words, were there really radiation dose levels that the human body could tolerate with no risk of adverse health effects? The MTD was a concept that had been borrowed from chemical studies. For chemical toxins, it had long been established that there are dose levels below which the body can biologically inactivate a foreign chemical, by either metabolizing it or excreting it, and no significant health hazard should thus exist. At some higher dose level, however, the organ systems become overwhelmed and can no longer adequately handle the burden of dealing with the chemical. Doses that exceed the body’s capacity to neutralize a chemical’s adverse effects are toxic, making the otherwise benign chemical a poison. This is a fundamental principle of toxicology that was first expounded by Paracelsus in the sixteenth century. (The very same Paracelsus who first described the miners’ lung disease, bergsucht.) Paracelsus’s most famous quote is: “The dose makes the poison.” By this he meant that all chemicals become poisons at some level of dose. The trick for toxicologists is to determine the dose below which the body can safely tolerate the chemical. Unlike for chemicals, however, no proof of tolerance existed for radiation.

  The second major assumption was that all radiation doses are equal in terms of biological effects. This, too, had not been verified, and there were good reasons to believe that it was not true in all cases. Evidence was already accumulating that identical doses from different types of radiation could result in different levels of biological effects. That is, the biological effects were dependent upon the type of radiation. And it was also beginning to be appreciated that factors such as dose rate, time between exposures, and other conditions of irradiation could influence biological effects. Apparently, all doses were not created equal.

  The bottom line was that very little research on the biological effects of radiation had been done to this point. Although radiation was by then in widespread use in both medicine and commerce, virtually nothing was known about the mechanisms through which it produced its effects on health. Many scientists found this state of knowledge completely unsatisfactory. Among them were the leading nuclear physicists of the day, most of whom were former students of the first radiation scientists and who now represented the next generation of atomic geniuses. They knew about the health consequences from radiation exposure suffered by their old professors, and were concerned about their own health.

  The discovery of fission just prior to World War II had sent Allied and Axis nations alike on a race to create a nuclear bomb (i.e., atomic bomb). Production of a nuclear bomb would rely upon amassing an amount of uranium sufficient to reach criticality and unleash an uncontrolled chain reaction of nuclear fission that would be capable of instantaneously releasing a massive amount of energy. In response to a secret letter from Albert Einstein warning of the prospect of Axis forces obtaining this technology,58 President Franklin D. Roosevelt initiated an enormous government project in 1939 to investigate the possibility of producing such a bomb. This secret project became known by its code name, the Manhattan Project.
