Blood Matters

by Masha Gessen


  But the key word here is associated. Most of the knowledge that drives the decisions of mutants like me concerns associations, not causes. What caused the cancer in the other 419 women in the study, the overwhelming majority? And why would consanguineous marriages be important? That would suggest that a recessive gene—not a dominant one like BRCA1 or BRCA2—was implicated. Could it be that there were other causes, other mutations? Could it be that even the known BRCA mutations required a co-agent, possibly another mutation or a combination of mutations? Could a causal relationship be proven at all?

  The study was conceivably a huge step forward for some Pakistani women, who may now be able to take advantage of genetic testing, but the field is so young that every step forward serves best to illuminate the vastness of the unknown.

  Many of the advances in genetics have become possible as a result of the sheer computing power available to scientists today. Our outposts at the genetic frontier are giant boxes of data—hangar upon hangar filled with machinery that works around the clock to slice and recombine. Most of the studies look like the Pakistani one: A wealth of epidemiological information on one side is matched up with sheaves of genetic code on the other, with little regard for the missing logical links.

  By the logic of this science, a human being would best be sliced up into microscopic samples, each one to be checked for flaws and irregularities. In essence, the state-of-the-art approach to genetic-cancer prevention is based on the same carpet-bombing philosophy: A mutant like me is generally advised to undergo two mammograms a year, two MRIs (actually two procedures each time, owing to the number of breasts on the human body), get felt up by a breast oncologist four times a year, and get as many ultrasounds as are necessary to clarify the results of any of these procedures. Each of my own mammograms occasioned follow-ups and enlargements, all of which were largely inconclusive. My first MRI showed a mass in my left breast. I had an ultrasound-guided biopsy. I waited for the results for two weeks, in accordance with the usual rules of such things: calm at first but reaching a near-desperate level of helpless anxiety late on a Friday afternoon, when I still had not heard from the oncologist. Then she called: The mass was benign. I sat on a bench, hugging my cell phone, looking up at the warm late-April sun, taking deep breaths.

  Almost exactly five years earlier I had sat on another bench, under a different sun. I was in Budapest, having just arrived there from Belgrade after spending six weeks in Yugoslavia during the NATO bombing campaign. I remember looking up at the blue sky, taking in the warmth, breathing like I imagine a diver must breathe coming up from a great depth. I was remembering what safety was like. It was disorienting and exhilarating.

  Now, in Cambridge, Massachusetts, I felt none of that. I had been given a reprieve. But it was strictly temporary: I was now a professional patient. I would always be ill until proven healthy, and then I would have to prove it all over again in another month or two.

  In the much-discussed and much-amended lists of breast-cancer risk factors there is a fascinating indicator: Previous breast biopsies increase the risk that a woman will be diagnosed with breast cancer. Is this because biopsies, by invading the tissue, increase the risk of cancer? Is it because any atypical mass, even if it is benign, means something has gone wrong? Or is it because women who are at risk are more likely to have biopsies? Any of these? All? No one knows. In my case, it did not matter: It was clear that with every passing year and every test, my risk would get greater. For my doctor, a sweet woman around my age, with two small kids and an Ashkenazi Jewish background—a woman very much like me—it was a foregone conclusion. When I suggested she stop treating me like a cancer patient because I had no cancer yet, she responded: “You have no detectable cancer.”

  And then there was the question of the ovaries. Eleven years before my mother died of breast cancer, her beloved aunt had died of ovarian cancer. It was apparently linked to the same mutation—my mutation, which is associated with a vastly increased risk of ovarian cancer.

  But to say “vastly increased” is to say almost nothing. Ovarian cancer is a rare disease. I had met gynecologists who had never seen a woman with ovarian cancer.

  One of my newfound mutant friends referred to her “down doctor” and her “up doctor,” the physicians who helped her manage the risk of each cancer. I, too, came to feel like I had something like a bipolar disorder. In some important ways these two kinds of cancer are the opposite of each other. Breast cancer in young women is difficult to treat, while ovarian cancer is considered highly treatable. At the same time, while death rates from breast cancer have been dropping, death rates from ovarian cancer have stayed mortifyingly high: 71 percent will die within five years of diagnosis. This is because ovarian cancer is almost never found early enough to be treated effectively—while early detection of breast cancer is becoming increasingly common. Doctors, I found, tend to be terrified of ovarian cancer: Those who have seen it know that it makes them feel helpless.

  Ovarian cancer is like breast cancer’s poor neglected cousin. Breast cancer is everywhere. Actresses get it, singers get it, famous writers get it, and in August 2005 mobster daughter and television personality Victoria Gotti was accused of pretending to get it. In America it has been so glamorized and lionized that it sometimes seems like a rite of passage. Breast cancer is a modern disease—primarily because it is imagined as a disease that can be overcome. Ovarian cancer is a cancer like Susan Sontag described cancer: intractable, unimaginable, unspeakable.

  At a lovely dinner in New York City one summer evening people told me about a heartbreaking book by a woman who, they said, wrote her own death from breast cancer. One of my companions, an editor at the publishing house that had reissued the book, sent me a copy. It turned out to be The Furies, by Janet Hobhouse, who died in 1991, at the age of forty-three, of ovarian cancer. Both of my companions had substituted the cancer that has come to feel familiar enough to be less than completely frightening.

  In the book, Hobhouse describes discovering her diagnosis and being told of her treatment options: “I was told about surgery and chemotherapy and it all sounded like a course of beauty treatments and everyone was quite jolly about it. And yet I had seen what I had seen on my way up to the doctor’s rooms and I knew I didn’t want to be in a cancer hospital. It was a place that would take you out of your garden-party clothes, hide your lipstick and turn you into a gray, rumpled bedding. Once they had you, they took your colors away, put you in a world like early TV, black-and-white, reduced, fuzzy imagery on a tiny screen. I didn’t know what all the merriment had been about in that office; it felt like a trick, even my part in it, like Pleasure Island in Pinocchio before everyone grows ears and gets shipped off for the slaughter.”

  This was very much what I felt like. I had been energetically welcomed to a club that used the confident language of science and progress to usher me down a corridor that led, I knew, to a lonely, unglamorous, and even rather untechnological death from cancer. I did, I was told, have the option of getting out, or at least of sticking one foot permanently out the door, by getting all the potentially offending parts cut off before they went bad. Then my breasts and my ovaries would be sliced up and examined in every microscopic detail, and only then, if they were found clean, would I be allowed to rejoin the world of the living, to fear plane crashes or stray bullets or nothing at all.

  In the two or three months after my test results I lived through the MRIs and the biopsy, tried to get warm under a sun that did not comfort me, and spent hours upon hours staring blankly at the computer screen, unable to read the studies and articles that I knew should inform my decision. The data was not sinking in. I was not in denial, and I was not trying to escape from the responsibility I had laid upon myself. Fragmented thoughts of the mutation, cancer, and my surgical options spun incessantly around in my brain, but I was spinning my wheels. The decision was not coming; in fact, my mind seemed incapable of generating a single thought whenever I tuned it to the mutation frequency. Finally one night I had the idea of writing a series of articles about trying to make my decision. I pitched it to the online magazine Slate, where the editors liked the idea, and, positioned gloriously outside myself, if only for a few weeks, I set about trying to understand how people can make decisions in the age of medical genetics.

  Chapter 5

  A Decision at Any Cost

  I WAS GETTING my first lessons in using the medical system. Some of the experience would be familiar to me—and to most Americans—from, say, dealing with public school systems or with cell phone service providers. It was marked by that inimitable sense of smashing your head against a glass wall that arbitrarily separates you from a clear and obviously beneficial goal. What makes the experience of trying to procure medical help special, of course, is that you are trying to manage not a piece of technical equipment or even your child’s essential education—areas where missteps can generally be remedied—but your own singular and mostly irreplaceable body. Over the next couple of years I would live through several moments that overwhelmed my imagination with their absurdity. There was the time my insurance company refused to cover one of my breast MRIs but paid for the other—because only one of them revealed a lump. There was the time I called to get a necessary medical document, wormed my way through the labyrinth of extensions and keywords to talk to a real live person, who, upon hearing my request, transferred me, quite purposefully, back to the voice mail system. But in December 2003, after my first iffy mammogram, I was just getting started. If I wanted to get the genetic test, my doctor explained, I could circumvent the long line by signing up to see not the famous head of the testing program but one of her fellows. If I tested positive, my doctor continued, I could jump the line and go directly to the head herself.

  Judy Garber, director of the Cancer Risk and Prevention Program at the Dana-Farber Cancer Institute in Boston, was a kindly woman in her midforties. She wanted to know whether I was “finished” with my childbearing. I was thirty-seven. I had a six-year-old and a two-year-old who was still nursing. For most of the preceding two years my partner and I had been out of sync on the subject of another baby: When I agitated for one, Svenya argued in favor of waiting, and vice versa. But women in my family had had children well into their forties—in fact, my mother had had a miscarriage not long before her diagnosis of breast cancer—so I felt we could safely stay on this seesaw for a while.

  But now Dr. Garber and her associate Katherine Schneider, a former president of the National Society of Genetic Counselors, were demanding an answer, in their soft-spoken way. I balked. Svenya sat next to me on the couch, stone-faced, her hands stuffed into the pockets of her coat. It so happened that the night before, we had cooked and hosted a dinner for sixty people. We were both exhausted and hungover. Svenya had spilled coffee on herself in the car on the way over. Both of us felt like frivolous intruders in this world of grave and clear-cut decisions.

  This session was called posttest counseling. But it did not feel like counseling as I had imagined or experienced it: a slightly fuzzy conversation flowing warmly toward a conclusion that begins to feel obvious and, therefore, comfortable. Rather, it felt like an exam I was quite possibly failing.

  I later understood why. For the past three-quarters of a century the key word in genetic counseling has been nondirective. The philosophy of dispensing information without guidance dates back to the first scientists who began providing pregnant women with an assessment of possible genetic risk to their future babies. Decades before DNA was discovered, they drew on basic knowledge of the few diseases that were known to follow Mendelian laws (Huntington’s chorea and phenylketonuria were among the first such diseases; both are debilitating neurologic disorders, but in the case of the former, symptoms do not usually set in until adulthood, while symptoms of the latter begin in infancy). Eager to distance themselves from the eugenics movement, which had dominated American genetics in the first quarter of the twentieth century, these early counselors withdrew from making value judgments about a woman’s decision to terminate or continue a pregnancy. Surely it helped that the first genetic counselors were academics—Ph.D.s more often than M.D.s—who could establish the sort of distance from the possible mutant in their care that a family doctor could never have managed. In addition, they had no definitive tests at their disposal, so their suppositions about potential risk to an unborn baby were just that: inferences based on a woman’s and/or her husband’s family history. Finally, they were counseling pregnant women in an era when abortion was performed at the discretion of medical practitioners, and while this discretion seems to have been wide enough to accommodate the fledgling field of medical genetics, neither the counselor nor the counselee was actually in a position to make the abortion decision.

  As time went on, things shifted, but the result, weirdly, remained the same: Abortion was legalized, but genetic counselors, who by this point were no longer men with Ph.D.s in science but women with master’s degrees in social work (the first specialized program for genetic counseling was created at Sarah Lawrence College in 1969), chose to underscore their neutrality with regard to abortion—now because it had become paramount to acknowledge the woman’s autonomy in making the decision.

  The term adopted by genetic counselors—nondirective counseling—was actually coined by the psychologist Carl Rogers in the 1940s. Rogers himself later changed the term to client-centered therapy, and these days his technique is known simply as Rogerian therapy. The crux of the approach is the belief that a therapist best serves as facilitator rather than guide, helping the client to unleash his own potential for positive change largely by reflecting his reality back to him. Rogers’s key belief was that the client, not the counselor, should set the eventual goal of the therapy. The underlying assumption, however, remained: A person went into therapy seeking change. In the field of genetic counseling, on the other hand, change was often impossible.

  The first predictive genetic test for adults became available in the mid-1980s. This was the test for Huntington’s disease (which used to be known as Huntington’s chorea). The test was somewhat unreliable—at that point it identified markers rather than the gene itself—and terrifying for the potential test subjects, since there was (and is) no treatment for the debilitating and ultimately fatal condition. The field of genetic testing did not really take off until the mid-1990s, when predictive testing for familial cancer syndromes—most notably for the BRCA mutations, but also for mutations that lead to colorectal and other cancers—became available. The rhetoric of genetic counseling remained largely unchanged, despite some counselors’ doubts: Nondirective remained the key word. And when it came to familial breast cancer, it was reinforced by the circumstance of largely liberal educated urban women being counseled by other liberal educated urban women, whose lingua franca was the rhetoric of a woman’s right to choose what to do with her body, even though they were now talking about breasts and ovaries, not pregnancy.

  But unlike prenatal genetic counselors, the genetic counselors at high-risk-cancer centers have a goal. The goal is cancer prevention: not to extend a woman’s life or improve its quality but, quite singularly, to prevent her death from cancer. And it is impossible for a medical professional to conceive of different approaches to cancer prevention as being equally valuable—the way a Rogerian counselor might imagine different paths to self-improvement. Genetic counselors know perfectly well how to prevent cancer. In their universe, the idea of pretending not to know the “right” answer becomes as elusive and misguided as the ideal of objectivity in journalism. In fact, it leads to the same results: Just as a journalist pretending to be objective merely hides his bias behind what passes for neutral language (but is really just stilted), all the while trying to steer the reader to the conclusion he believes to be true, so the genetic counselor creates the illusion of neutrality by avoiding engaged conversation but still tries to get the counselee to do what is best for cancer prevention. So it was that the two nice women cocked their heads to the side and pushed me to decide whether I was having any more babies.

  I stalled. The women moved inexorably toward their conclusion. Once I made up my mind, they said, and either had a baby or did not, I should have my ovaries removed. I balked. In the weeks between having my blood drawn and hearing the results, I had researched early surgical menopause. Its unpleasant effects, I had found out, included increased risk of heart disease, high blood pressure, osteoporosis, cognitive problems, and depression—as well as inelastic skin and weight gain, which in this context seemed downright frivolous to mention.

  I mentioned the big ones. Judy Garber said softly, “The payoff is keeping you here.” This was certainly not nondirective. It was also quite possibly not true. I had looked at many studies, most of which indicated that an oophorectomy (surgical removal of the ovaries) was indeed effective in preventing ovarian cancer and lowering the risk of breast cancer. But I had also found a single study that provided some statistical analysis and concluded that a prophylactic oophorectomy would extend the average life expectancy of a mutant like me by between 0.03 and 1.7 years. The authors of the study, who included one of the women sitting in front of me, concluded that the increase was significant. I understood that the numbers looked so small because they took into account the roughly 50 percent of women who would not have gone on to develop ovarian cancer in any case. I also understood that looking at statistics when attempting to make a decision about my lone life was not particularly useful—but still it looked like an absurdly small gain in exchange for drastically lowering my quality of life and giving up the chance of more babies. The payoff, it seemed, was not so much keeping me here as increasing the chances that I would die of something other than cancer—whenever that happened. I politely suggested I could just shoot myself tomorrow: That would prevent my death from cancer with a 100 percent probability. The joke remained suspended in the thin air between us and the counselors, and with it, our disengagement from one another was complete.

 
