In the Kingdom of the Sick: A Social History of Chronic Illness in America

by Laurie Edwards

Likewise, in discussing the charity walk/jog-a-thons that emerged as popular fund-raising events for disease groups, Samantha King describes the mid-1980s as the era of the “fitness boom,” when millions of once-sedentary middle- and upper-class Americans took interest in physical fitness. In the 1980s, she writes, “the fit body became at once a status symbol and an emblem of an individual’s purchasing power, moral health, self-control, and discipline.”5 This idea is but a recycled version of something we’ve seen for centuries, from the fetid poor whose communicable diseases were considered their own fault, to the hysterical women whose psychosomatic symptoms were ascribed to weak character and the fragility of their gender. For people who, through lifestyle choices, contracted the mysterious condition that would become known as AIDS, and for those whose malaise and chronic fatigue could not be banished by simply powering through it, this emphasis on the body as a barometer of discipline and character was especially problematic. For people living with cancer, particularly women with breast cancer, the image of the strong, almost indomitable body influenced the concepts of survivorship and empowerment, too: we didn’t see cancer patients when they were gaunt and haggard from their chemotherapy. We usually saw them when they were upright, mobile, and participating in charity events.

  This aesthetic ideal took on another dimension when enhancement technologies grew in popularity in the 1980s and 1990s: cosmetic interventions that had once involved removing moles or warts morphed into more invasive plastic surgery and lifestyle medications. Combine this with the development of gene therapy to treat genetic illnesses in the late 1980s, and bioethicists who could see farther down the road saw a looming shadow: eugenics.6 In the early twentieth century, it had been people with mental or physical disabilities who suffered most from the theories behind eugenics. Now, genetic disease, inherited traits, perhaps even gender could potentially become targets. The distinctions between what needed to be treated and what a person might simply wish to have treated, between treatment and enhancement, became increasingly important, whether in terms of ethical concerns or of insurance coverage and medical necessity. We see the influence of that distinction all around us today, if commercials for a medication that treats the “condition” of having short eyelashes are any indication.

  All of this took place against a particular 1980s economic and social backdrop. Even as companies began outsourcing jobs overseas, more working-class Americans found themselves without jobs and reliant on government assistance, and the nuclear family began to disintegrate, individuals were called to take responsibility for maintaining their own well-being and quality of life.7 There has been a minor cult of self-improvement in America since Benjamin Franklin’s self-help homilies, and during the twentieth century, this philosophy of self-improvement changed because it began to be measured against the outward success of others, too.8 People want to stand out for professional and personal accomplishments, and most are loath to be noticed for what is perceived to be wrong with them. Self-improvement and the comparison of self to others offered little solace to patients with chronic illnesses, who could not wish away their debilitation by sheer force of will or exercise, who often relied on government assistance, and who needed help managing medical expenses. The Benjamin Franklin brand of self-improvement also implies a long-standing sense of self-reliance that people with chronic illness can’t always subscribe to, no matter their desire.

  “We subscribe to a mythos in this country of self-sufficiency. This is thoroughly a myth, in my view, but it leads to a general idea that people need to take care of themselves (even though the people arguing this point of view often benefit greatly from the labors and deprivations of others),” says Duncan Cross.

  “As much as self-sufficiency may seem like a virtue, however, it’s illusory and also antithetical to a more important virtue: solidarity,” he says. When he hears people talking about self-sufficiency, what he hears is a refusal to empathize with patients who have debilitating chronic illness. Solidarity among patient activists of the 1980s emerged in response to the stigmatization of people with chronic illness implicit in the idealization of vigor during the fitness boom, in the resentment of those who relied on assistance during times of economic need, and in the pervasive attitude that physical illness is related to moral character.

  “There is a sizeable minority in this country which believes either explicitly or subconsciously that people deserve what they get. If you’re sick, it must be because you did something awful, or are an awful person … Obviously, relatively few people who have chronic illnesses did anything to create those diseases, and even those who did don’t deserve the suffering that results,” Cross says. In many ways, this is another version of the “us versus them” mentality Janet Geddis has pointed out and of the internal hierarchy of illness Emerson Miller first described. We start with the healthy versus the sick, but the divisions keep on going. Some people believe that if you engage in X behavior, you deserve Y consequences. For the same reasons that the biomedical model of illness popular at this time fell noticeably short, this myopic view of suffering and blame falls short, too.

  With all this in mind, the 1980s and ’90s were a crucial time for patient advocacy and empowerment and, in some respects, a continuation of the evolution of basic human rights and dignity from previous decades. AIDS, breast cancer, and CFS took on social manifestations as well as physical ones: we see HIV/AIDS patients stigmatized for having a disease that is visible for the wrong reasons (lifestyle choices); we see breast cancer patients embraced by society and consumerism as the more sympathetic victims of a known disease; and we see CFS patients (mainly middle- and upper-middle-class women) viewed with skepticism and doubt because they are sick with an illness we cannot “see.”

  People, Not Patient(s): The Early HIV/AIDS Movement

  The World Health Organization (WHO) has classified the early AIDS epidemic into three main categories: the silent period between 1970 and 1981, the initial discovery period between 1981 and 1985, and the mobilization period between 1985 and 1988.9 Doctors first noticed an unusual type of pneumonia and other immune irregularities in a handful of gay men in California; by the time the first paper dealing with AIDS was published in Morbidity and Mortality Weekly Report in June 1981, 250,000 Americans had been infected with the virus.10 Technically, AIDS wasn’t named as such until 1982, and the first heterosexual cases were confirmed in 1983. On both coasts, these early years were already busy ones for the People With AIDS (PWA) self-empowerment movement. These patient activists realized quickly that they were fighting for their very survival and could not afford to be passive.

  A fundamental characteristic of earlier health movements, like the women’s health movement, was the idea that change was grassroots in nature, not a top-down effort propagated by those in power. The early HIV/AIDS movement is a striking example of this change-from-below: a patient population took on the very medical and political institutions that would have preferred to overlook its plight. In 1985, Ryan White, a young boy who contracted AIDS through a blood transfusion to treat his hemophilia, was denied entry to school because of his disease status. When asked whether he would send his own children to school with a student who had AIDS, then-president Ronald Reagan said during a press conference, “It is true that some medical sources had said that this cannot be communicated in any way other than the ones we already know and which would not involve a child being in the school. And yet medicine has not come forth unequivocally and said, ‘This we know for a fact, that it is safe.’ And until they do, I think we just have to do the best we can with this problem. I can understand both sides of it.”11 Reagan would ultimately reverse federal policy regarding AIDS by his term’s end, mandating that all people with the virus, whether they were symptomatic or not, were protected against discrimination by any institution or organization that received federal funds. By 1988, the government was taking a more visible role in disseminating information about AIDS: every household in the country received an “Understanding AIDS” brochure, a shortened version of then surgeon general C. Everett Koop’s report on AIDS.12

  Unlike the majority of illnesses we’ve discussed so far, HIV/AIDS is ultimately fatal. Compounding this, the extreme social stigma surrounding it meant patients faced what Dr. Joe Wright calls a social death, too. “When you have a diagnosis that is fatal, people start thinking of you as if you are already dead or functionally dead, and people pity you and you don’t have a voice in politics because you are already dead,” he says. This social death was devastating to the AIDS patients of the early 1980s, who were already marginalized by the stigmas surrounding homosexuality and the “gay plague.” Psychologist Gregory Herek, who has researched and written about stigma, illness, and HIV/AIDS, notes that being infected with HIV is a defining characteristic that relegates the person to what he calls a socially recognized and negatively perceived category.13 Compared to most chronic conditions, HIV/AIDS is particularly damning in terms of social stigma because it is a condition largely considered to be the result of an individual’s actions; it is incurable and ultimately fatal; and it is perceived to be a risk to others.14 The fact that it first manifested itself within a much-maligned population made things much worse.

  Out of that marginalization came the most concerted and successful grassroots advocacy movement in the history of modern disease. These early HIV/AIDS activists were in the fight for their lives, and their efforts led to some monumental changes in the way the health system responded to patients with HIV/AIDS: reductions in the price of medications to treat the disease, a significant increase in funded research, and more expedient drug trials.15 They also took a comprehensive look at the needs of their fellow people with AIDS and fought against inequalities in health care, as well as against discrimination and injustices within the health insurance industry. From the daily realities of illness—medications and access to appropriate health care—to the development of less discriminatory federal policies, to more abstract gains, such as replacing fear among much of the public with compassion and understanding, the amount of change AIDS activists brought about in less than one decade is nothing short of incredible.

  The neglect of the needs of people with AIDS, coupled with negative attention when the disease received any attention at all, lasted through the mid-1980s. Tellingly, it was only when the threat to heterosexuals became apparent that AIDS became more than an afterthought, the “gay disease.” Ulrike Boehmer, scholar and author of The Personal and the Political: Women’s Activism in Response to the Breast Cancer and AIDS Epidemics, first heard about AIDS in 1982, when the prevailing sentiment was that traditional institutions were led by heterosexuals who did not care whether gay patients lived or died. Boehmer, however, didn’t see AIDS as first and foremost a health issue. “It was not the disease component of AIDS that held my attention; from the beginning, I perceived AIDS as a gay and lesbian rights issue, and that was why it affected me long before I ever knew someone who had been infected,” she writes.16

  Its decidedly political slant separates the disease from many others. In many ways, however, the message of early HIV/AIDS activists was a familiar one. It is no coincidence that the seminal Denver Principles of 1983, the first codified statement of rights for people living with AIDS, echoed the spirit of the disability movement, the women’s health movement, and the Patient’s Bill of Rights first formalized about a decade earlier. Two leaders of the PWA movement, Michael Callen and Dan Turner, reflected on their work in the early 1980s and its roots, writing that “Part of the widespread acceptance of the notion of self-empowerment must be attributed to lessons learned from the feminist and civil rights struggles.”17 These early activists renounced victimhood and demanded to be called “People With AIDS,” a people-centered semantic distinction many patient groups still emphasize today. Their statements involved far more than the right to treatment or to information regarding their diagnoses. They had to lobby against eviction from their homes, against job discrimination based on their medical status, and against the isolation and pariah-like status their disease conferred upon them.

  The Denver Principles concluded with a request for basic human respect and the right to both live and die with dignity.18 They served as a catalyst for the formation of the AIDS Coalition to Unleash Power (ACT UP) and for the patient empowerment movement for people with AIDS. Though ACT UP is still active today, AIDS activism shifted with the advent of protease inhibitors in the 1990s, key components of the drug “cocktails” responsible for slowing the progression of HIV into AIDS.

  Dr. Wright notes that influential members of ACT UP were having dinner with Dr. Anthony Fauci of the National Institutes of Health rather than remaining mired in debates over the etiology of the disease itself. This commingling was possible in part because they were involved in setting research policy. “They were not arguing about what caused AIDS and were not fighting on multiple fronts. They were ready to accept many of the terms on which the battle [was] being defined. The broader the agenda gets, the harder it is to find the enemy and fight to win,” he says. This distinguishes early AIDS activism and patient empowerment from many of the social movements whose existence paved the way for these early successes. Modern-day AIDS exceptionalism—the argument that it does not serve HIV/AIDS patients well to collaborate with other disease groups for common gains—partially stems from this. Hard-fought funding and research and well-organized political and social support institutions thrive because of their sustained, specific focus.

  “People who don’t come from that position object to singling AIDS out and [think it] should filter into the broader public health system. I think that argument is nonsense because without the urgency and hard work, none of this would have emerged,” Dr. Wright says. “The political situation with AIDS is complicated because there is a lot of infrastructure and funding streams. From a sheer political calculation, it might be that people with HIV have the least to gain from making alliances with others: they are either going to bring others up to their level or get dragged down.”

  Given the circumstances in which AIDS advocacy emerged, AIDS exceptionalism makes a lot of sense. But how will it fare now that HIV/AIDS is seen more and more as a chronic disease?

  While still focused mainly on the singular disease, AIDS advocates have developed a comprehensive scope, addressing issues that matter for quality of life and survival: health care, medication, poverty, housing, gender inequality, racism, education.19 Susan Tannehill, director of client services at the AIDS Action Committee in Boston, says that many of the services her organization offers are poverty-based, such as addressing housing problems and addiction treatment, and that having HIV compounds these problems for their clients. Culturally, white men are still often more empowered than other populations, and issues of gender inequality are still largely at play, too; Tannehill points out that clinical trials are still largely done with men and that HIV medications are dosed based on men’s body weight.

  As the focus has shifted domestically, so too has the prioritization of health issues for some people living with HIV—yet another sign that, from a physical standpoint at the very least, the disease has evolved from a more immediate death sentence into a manageable chronic condition. For example, none of Dr. Wright’s patients with HIV considers it his or her biggest medical problem. “Some of them have problems downstream from HIV, it has not made life easy for them, but when I think about them as their doctor, immune suppression is not the biggest problem … some have effects from meds, but if their biggest problem is hepatitis C or hypertension, they are not quite as urgent about being an AIDS activist or organizing life around that identity,” he says. In fact, when he conducted an informal poll of patients with both diabetes and HIV, they said diabetes, which requires constant vigilance and management, is harder to live with (unless the person is living with full-blown AIDS).

  “If you were to step back, social stigma is very powerful but it is not the only form of suffering there is,” Dr. Wright says. That idea that suffering crosses lines of diagnosis, gender, social status, and prognosis is not only the piece that connects case studies of HIV/AIDS, breast cancer, and CFS, but it is what connects patients across the disease spectrum. More than that, it is the equalizing variable between the healthy and the sick—if only we recognize it as such.

  How Breast Cancer Advocacy Changed the Stakes for Chronic Illness

  Breast cancer is a particularly interesting example of looking at disease through the biomedical model, where the emphasis is on cures and the biological origins of disease rather than on prevention and the societal contributions to illness. This preference extends to cancer research and funding, where the focus is on various surgical and chemical therapies and on early detection. This is particularly evident when it comes to mammograms, which are often misconstrued as preventive. A mammogram does not prevent cancer; rather, it allows for earlier detection and intervention.20 In 2012, researchers expected more than two hundred thousand new diagnoses of invasive breast cancer alone. Clearly, earlier detection and curative therapies are important and urgent. However, true prevention is complex and multifactorial, and patients already living with cancer—and the side effects of its treatment—have diverse needs that some advocates fear may get overlooked in the quest for cures and research dollars.21

  Debate over methods of mastectomy and other treatments had existed since the late 1880s; by the 1920s, breast cancer surgery was the most common type performed in the world.22 In 1974, legendary breast cancer patient and activist Rose Kushner pushed to separate her surgical biopsy from her treatment decision in order to have the time and opportunity to decide whether a mastectomy was appropriate. This bucked the long-established practice of putting women under anesthesia not knowing whether they would wake up having lost a breast.23 Before informed consent became codified policy for breast cancer patients, Kushner seized the power of consent over her own body.

 
