In the Kingdom of the Sick: A Social History of Chronic Illness in America


by Laurie Edwards


  Christianity was just one of many faiths undergoing adaptation as a result of the changing world. It receives more emphasis here because some of the themes popular in the Christian response to suffering and disease are still evident today. Even the word stigma itself, one so heavily associated with current experiences of disease, has its roots in early Christian tradition. A literal meaning conjures up images of the physical markings of crucifixion, but as psychologist Gregory M. Herek notes, more complex definitions include literal or metaphorical marks implying that an individual is “criminal, villainous, or otherwise deserving of social ostracism, infamy, shame, and condemnation.”22

  Prior to the Black Death in the fourteenth century, it had been eight hundred years since Europe had last been besieged by major epidemics. The collapse of the Roman Empire meant less travel and commerce with Asia and, therefore, less contact with new diseases and infections. Now, with fourteenth-century towns like Venice and Genoa emerging as centers of trade and travel with more distant lands, the opportunity for disease and epidemics to infiltrate a new population was ripe.23 Increased commerce meant increased urbanization, and poor sanitation and overcrowded conditions meant the population was particularly susceptible to communicable diseases. This relationship between the way people live and work and the diseases that afflict them is a constant in the social history of disease.

  Bubonic plague, the cause of the Black Death, was thought to spread to humans through fleas carried by rats, though modern experts believe some human-to-human transmission was possible, given how quickly it wreaked havoc. Killing an estimated twenty million people, the Black Death remains Europe’s most catastrophic epidemic, having wiped out a quarter of the population.24 The impact of such devastation on the European psyche is telling. Europe had become “a crucible of pestilences, spawning the obsessions haunting late medieval imaginations: death, decay, and the Devil … the Grim Reaper and the Horsemen of the Apocalypse.”25 Unfortunately, responses to the Black Death are predictable through the lens of history. To the many who believed the plague was the work of divine retribution, acts of self-flagellation, prayer and fasting, and the religious persecution of Jews and others outside the faith were seen as appropriate defenses. Roy Porter recounts the horrific fate of thousands of Jews locked in a wooden building and burned alive, one of many instances of retaliation and violence during the Black Death.26 Physicians, powerless to effect any substantive treatment for individual patients, could do little to quell the public health debacle unfolding around them.

  French philosopher Michel Foucault’s Panopticism, published in English translation in 1977, deals graphically with the response to plague, describing the total lockdown enforced: the census, the front doors locked and checked by specially appointed officials, the total submission of medical and policy decisions to the magistrates. Order trumps chaos; power dominates disease. Given the rapid transmission and onset of the infection, and the lack of concrete physiological understanding of it, the extreme situation Foucault depicts in seventeenth-century France is understandable, if unappealing. Twenty-first-century movies like I Am Legend or Contagion tap into similar fears over uncontrollable outbreaks and the fragility of human life in the face of pathogens we cannot fight.

  Medieval attitudes toward disease and the body perceived women as the “faulty version” of the male, weaker because “menstruation and tearfulness displayed a watery, oozing physicality … Women were leaky vessels … and menstruation was polluting.”27 As patient narrative, research, and history will illustrate, gender remains an incredibly important variable in the chronic illness experience. This is partly because more females than males develop chronic and autoimmune conditions. However, throughout history, deeply ingrained ideas about women as unreliable narrators of their pain and symptoms, as weaker than men, and as histrionic or otherwise “emotional” have had a profound impact on their ability to receive accurate diagnoses and appropriate care.

  On the heels of the devastation wrought by the plagues of the Middle Ages, the Renaissance and Enlightenment were periods of progress and advancement. The invention of the printing press and the resulting printed health material made knowledge about the human body and disease (however incomplete) widely available for the first time. The gains in health literacy that printing made possible over time marked a huge shift in the understanding and treatment of diseases.

  By the eighteenth century, physicians still could not isolate the cause of infectious disease, so Hippocratic ideas about individual responsibility for illness continued to dominate medical thinking. American physician Benjamin Rush emphasized the importance of getting the patient’s history directly from the source, and focused on all the daily habits and behaviors that might play a role in the patient’s illness. His interest in the association between chronic disease and lifestyle is significant, as is his division between acute and chronic disease.

  “In chronic diseases, enquire their complaints far back and the habits of life … Pay attention to the phraseology of your patients, for the same ideas are frequently conveyed in different words,” Rush counseled his peers.28 With acute illness, the precise daily habits of the week preceding the onset of symptoms were particularly important. Rush’s emphasis on patient history as a primary diagnostic tool took place in the context of improved standards of living and transportation across Europe and in the United States, which meant a now-predictable rise in diseases associated with indulgence and inactivity. Relying so heavily on patient history and lifestyle was logical, particularly since there was little else physicians could point to in order to assign cause (or blame) for disease. Other popular theories of the time focused on environment and external factors like squalid living conditions and dank areas, though those too carried associations of wealth, status, and worth. Still, as a precursor to more current attitudes toward patients with chronic disease, this link between illness, lifestyle, and behavior is a key concept to carry forward.

  The greatest dichotomy of this time period, however, was that while physicians gained new skills and attained a more elevated status, patients themselves saw little benefit from these developments. Even the early use of the microscope shows an interesting lack of focus on the patient, and a divergence from the notion of medical research as inherently therapeutic: while physicians used microscopes to study tissue, it wasn’t until the nineteenth century’s breakthroughs in bacteriology that microscopes were used in the process of treating patients.

  Disease in the Nineteenth and Early to Mid-Twentieth Centuries

  Simply put, the nineteenth century was the century of the germ. Until physicians could see disease under the microscope, the same kind of guesswork that had characterized disease and its treatment since classical times persisted. For example, well into the nineteenth century physicians believed that illness came from miasmas—the gases that seeped out from subway systems, garbage dumps, and open graves.29 The changes wrought by the Industrial Revolution and the emergence of capitalism affected virtually every part of daily life. More people moved to cities and worked in factories, and greater availability of employment, along with children who could contribute economically to their families, fueled population growth. From unsafe working conditions to slums where infectious disease found places to thrive, a now-familiar historical pattern emerged: the technology that yielded improved transportation and innovations in production also paved the way for a new wave of communicable disease and social anxiety.

  A fundamental shift in the understanding of disease—and in the way we perceived patients with communicable and other diseases—began with Louis Pasteur’s identification of bacteria and the role of germs in causing infection. Before that, leeches, laxatives, and brandy were among the most common cures of the day.30 By 1881, Pasteur had perfected the vaccination method, though it wouldn’t be until 1954 that a polio vaccine suitable and effective for humans was introduced.31 Nineteenth-century attitudes toward vaccines prevented universal vaccination from happening. As we will see when we explore current perspectives on vaccines and autism, the combination of fear that the government was encroaching on civil liberties and concern over the safety of the procedures that characterized the opposition to vaccines looms heavily in our twenty-first-century consciousness. The difference between society’s perspectives then and now is that in the intervening years vaccines have largely eliminated many of the most harmful public health risks, such as polio and smallpox.

  Vaccination is an approach to disease prevention so profound that it is in large part responsible for the emergence of chronic illness as a domestic public health and social issue in the twentieth century. Because fewer people died or were left crippled and incapacitated by infectious disease, more people lived long enough to acquire and suffer from chronic conditions. For example, from 1930 to 1980, self-reported illnesses rose by 150 percent, a clear indication that a population that lived longer wasn’t necessarily feeling better—and an idea that figures prominently in the social history of chronic disease.32

  Pasteur’s work on germ theory ushered in the burgeoning field of microbiology. Using this theory, Pasteur’s contemporary Robert Koch was able to identify the bacteria that caused both cholera and tuberculosis (TB).33 These infections were scourges, particularly in heavily populated urban areas, and brought with them many unfavorable associations and connotations. Perhaps one of the most famous representations of TB appears in Susan Sontag’s extended comparison of it to cancer. While cancer was once associated with a repressed personality and middle-class anxiety, TB was the stuff of excess emotion and poverty. In Illness as Metaphor, Sontag observed that “TB is a disease of time; it speeds up life, highlights it, spiritualizes it … TB is often imagined as a disease of poverty and deprivation—of thin garments, thin bodies, unheated rooms, poor hygiene, inadequate food … There was a notion that TB was a wet disease, a disease of humid and dank cities.”34 This process, in which identifying the origin of a disease changes the perception of the patients living with it—or fails to change it—is one we still grapple with two centuries later.

  In W. Somerset Maugham’s revealing early-twentieth-century short story “Sanatorium,” assumptions about the “typical” TB patient are powerfully laid bare. In describing one of the patients sent to recover from TB in a sanatorium, the author writes, “He was a stocky, broad-shouldered, wiry little fellow, and the last person you would ever have thought would be attacked by T.B … He was a perfectly ordinary man, somewhere between thirty and forty, married, with two children. He lived in a decent suburb. He went up to the City every morning and read the morning paper; he came down from the City every evening and read the evening paper. He had no interests except his business and his family.”35 All the things that make this patient a surprising candidate—he is gainfully employed, stable, married; in short, a respectable man with respectable middle-class tastes and aspirations—are what stand out here. He did not deserve his unlikely affliction.

  The public health response to disease outbreak in America also reflected the nation’s emerging evangelical bent. Since disease was thought to be due to poor hygiene and unsanitary conditions, clean living was not just a health issue but a moral one as well. It fell to religious philanthropists to preach against the sins associated with unclean living, from drinking and immoral behavior to the alleged vices of atheism and greed. Such actions further demarcated the healthy—middle- and upper-class religious activists—from the ill, those languishing in slums whose slovenly living conditions and life choices made them culpable in their sickness. Being able to source the origin of infectious disease to its microbial roots was the first step in breaking down such misconceptions.

  Other nineteenth-century developments that influenced the experience of chronic illness today include the advent of anesthesia, the beginning movement toward patient advocacy, and the professionalization of nursing. Until the 1840s, physicians had no effective, safe way to lessen the pain of surgery. The introduction of nitrous oxide, chloroform, and ether produced immense relief from the pain of surgical intervention. It also reflected a shift in physicians’ attitudes toward patients and a higher priority on alleviating suffering.

  Another advancement in the consideration of the patient can be traced to the nursing profession. Prominent figures like Florence Nightingale and Clara Barton exemplified the holistic approach to patient care that characterizes nursing, and represented a marked departure from the tendency of other medical professionals to focus on singular aspects of a patient’s condition (i.e., the cause or the treatment). Galvanized by the suffering of soldiers, Nightingale was stalwart in her work to improve living and sanitary conditions for her patients. The patient as an individual, entitled to respect and compassion, was a concept made flesh by Nightingale and the cadre of professional nurses she mentored. Likewise, activists like Dorothea Dix and Alice Hamilton worked to make public the deplorable living conditions and inhumane treatment of the mentally ill and the urban poor.36 This indicated a new interest in health-care advocacy, a concept that would wholly redefine the lives of many different types of patients more than a hundred years later, most especially those with chronic diseases.

  The world was still in the grip of deadly epidemics, though, as witnessed by the staggering transcontinental death toll of the 1918 influenza pandemic. More recent research suggests that the strain of the influenza virus that sprang up during the 1918–19 flu season killed between thirty and fifty million people globally, including an estimated 675,000 Americans. World War One had killed fewer people than the flu pandemic.37

  Successes in identifying infectious disease and the post–World War Two development of antibiotic therapy led to the assumption that though infections might still cause temporary discomfort, they were no longer a serious threat to either survival or quality of life.38 Was this a sign of naïveté? Arrogance? Optimism? Or perhaps, a combination of all three? With the benefit of hindsight, the weakness of this position is easy to see: for one, antibiotics only treat certain strains of bacteria, and are not effective in treating the many viruses that still pose a threat to public health. In addition, as we see all too frequently today with infections like methicillin-resistant Staphylococcus aureus (MRSA) and flesh-eating Streptococcus, bacteria evolve into strains resistant to the medications developed to treat them. As a patient with a compromised immune system who is prone to infections, I know firsthand the danger of antibiotic resistance. As a preschooler, I spent several weeks in an isolation room in a hospital, tethered to an IV pole to receive Vancomycin, the drug used to treat staph infections like the one I had spreading from my ears to my brain. Knowing that some staph infections are now resistant to Vancomycin, a powerful “end-of-the-line” treatment for these life-threatening infections, scares me. Similarly, with only a few antibiotic options left that reliably treat my lung infections, resistance is not just a buzz-worthy topic for me; it is a real concern.

  For better and worse, twentieth-century experiences with diseases like polio forever altered the way we view medical science’s ability to treat disease. At last, humanity could respond to the infectious epidemics that had wreaked havoc for centuries and do more than merely identify them—we could actually prevent them. Outside the spheres of public health and research, we don’t hear or talk much about polio anymore; its absence from our lexicon is a luxury modern medicine affords us. But for the generation forced to dwell in iron lungs and the legions permanently crippled by polio, its specter was menacing. Many of the illnesses we grapple with today are a product of the way we live and work, just as living and working conditions in the past contributed to the rise of polio. Roy Porter deftly characterized the complex relationship between human progress and disease when he wrote, “Thus to many, from classical poets up to the prophets of modernity, disease has seemed the dark side of development, its Jekyll-and-Hyde double; progress brings pestilences, society sickness.”39

  Though ancient in origin, the emergence of polio as a major medical threat in the 1900s can be traced directly to the processes of urbanization. Spread through infected fecal matter, the dominant strains of the polio virus were introduced early on to infants who dwelled in crowded homes with rudimentary plumbing, sanitation, and hygiene. Once more modern forms of sanitation and waste removal and treatment were developed in the 1900s, early exposure to the virus, and the immunity it conferred, became less common.40 As immunity decreased, the incidence of the more serious manifestation of the disease, paralytic polio, which involved the nervous system, increased. By the 1950s, polio flared most severely during the warm summer months and primarily affected children. Parents fled urban areas and communities banned the use of public swimming pools.41 The year 1952 brought with it the worst polio epidemic in American history; 58,000 cases were reported, including 3,145 deaths.42

  That same year, 1952, Jonas Salk tested the first polio vaccine. He used a killed virus injected into patients to help build up natural immunity, and in 1954 more than one million children were given test vaccinations.43 Since polio was a disease that primarily affected children, fighting it was a cause particularly championed by the American public. Children are understandably at the top of the illness hierarchy. By the late 1950s, a live virus was used to produce an oral vaccine, which was more popular since it meant patients didn’t need any shots. The World Health Organization (WHO) made fully eradicating polio a worldwide effort in 1985.44
