Body Horror

by Anne Elizabeth Moore


  BBC America’s Orphan Black provides the most recent, and currently most watched, depiction of autoimmune disease in popular culture. An untold number of clones—some kind of corpo-government invention—are programmed with an endometriosis-like disorder to prevent pregnancy that, unfortunately, appears to spread and turn malevolent in the body of each clone. The search for the cure is a journey of self-discovery for each of the identical victims, of course, as well as a cloak-and-dagger intrigue replete with military, scientific, and private business interests all competing to conceal, nab, or supply knowledge about the clones’ diseases, genesis, and creators. That disease was programmed into the clones’ design seems a fitting acknowledgement of real-world conspiracy theories about autoimmunity; few medical, scientific, or holistic explanations make as much sense. (Some suggest, in fact, that autoimmune-adjacent Lyme disease is a government invention that leaked from a military water source, while the infamously contentious Agent Orange produces effects that are autoimmune in nature.) But that the diseases on Orphan Black are mysterious, suddenly triggered, serve the functional purpose of controlling a secret government invention, and appear (at least so far in the series) to be chronic does little to dispel the wider myth of autoimmunity as unfathomable, incurable, and even impossible.

  In sum, popular representations of autoimmunity are almost wholly consigned to sci-fi and comedy television—even though these diseases have been acknowledged by the medical establishment for decades now. These are diseases of the future, the metaphors run. They are incomprehensible and ridiculous. And that’s where the metaphors stop.

  The impossibility of knowing autoimmunity to the same degree that we know cancer, it turns out, is a metaphor with a history. German physician and scientist Paul Ehrlich set the theory in motion in the early 1900s with his research into the immune system.

  Ehrlich’s scientific distinctions are many: he developed tissue-staining methods that allowed labs to distinguish between types of blood cells; he developed the first effective medicine to treat syphilis; he even initiated and named the concept of chemotherapy. In 1908, his work on immunology earned him a Nobel Prize. His immunological studies, however, led him to some fascinating conclusions. One, still surprisingly pernicious today, was that autoimmune disease could not possibly exist.

  Ehrlich’s 1898 attempts to immunize animals with the blood of other animals from their own species, and then with their own blood, failed to produce autoantibodies; that is, the blood did not appear to be rejected even when removed from and reintroduced into the body. “This led Ehrlich to postulate the existence of what he termed horror autotoxicus,” Arthur Silverstein notes in the journal Nature Immunology, “the unwillingness of the organism to endanger itself by the formation of toxic autoantibodies.”3

  Ehrlich’s precise judgment of the concept of autoimmunity, noted by Silverstein, was that it was “dysteleologic in the highest degree”4—fully purposeless, without function. (I recognize the validity of this argument, although the makers of Opdivo would certainly disagree.) When called on in later years to respond to emerging evidence that, however purposeless, autoimmunity did in fact exist, Ehrlich dug in his heels. Horror autotoxicus (literally, the horror of self-toxicity), he explained, did not prevent antibodies from forming against the self—the evidence he was given to examine—it merely kept them “from exerting any destructive action,” according to Silverstein, underscoring Ehrlich’s own vague explanation, “by certain contrivances.” Thus, Ehrlich dispelled the possibility of autoimmunity, at least in his mind, for good. Others took note, too.

  “Ehrlich’s absolute dictum that autoimmune disease cannot occur would resound throughout the decades and prevent full acceptance of a growing reality,” Silverstein contends. As recently as 1954, despite scientific reports pointing to the emergence of at least six autoimmune disorders, horror autotoxicus was still considered a law, as absolute as the laws of thermodynamics or those set out by Newton to describe motion. “The ruling immunochemical paradigm” is how Silverstein more democratically refers to Ehrlich’s now-disproven theory, noting that the possibility of autoimmune disease was still an open question in the 1960s and just starting to be addressed as a medical reality in the 1970s.

  How could an inaccurate (and frankly, wholly unprovable) theory hold such sway for so long? In short, the immune system was thought to contain within it a range of emotional responses—aggression, clearly, as well as acceptance—and one of them was “horror.” The sheer terror the body would naturally feel at the possibility of self-attack was presented as the medical reason autoimmunity was impossible. The logic of this must have seemed, at the time, unassailable. There is no biological reason for a human body to self-destruct—it is, truly, dysteleologic in the highest degree. I can barely imagine it, and my inability to imagine it should by all rights be mirrored in my body’s inability to perform it. Except that my right wrist and left foot are swollen and achy today—bodily evidence that Ehrlich was wrong.

  The idea of the immune system turning on its host is terrifying. It is one thing if the guards fall asleep on the job, but once your knights start invading your castle, what’s a king to do? Call a meeting to remind them of their duties? Throw the knights an appreciation party? The king would be lanced immediately. It is the combination of the terror of the idea, I think, and the easy accessibility of a medical theory to justify it, that explains why mounting evidence of the existence of the autoimmune response throughout the twentieth century was ignored or considered aberrant to the “truth” of horror autotoxicus.

  Autoimmunity went unacknowledged for generations not because it was too horrible to consider, but because it was assumed to be so thoroughly horrible that the body couldn’t possibly allow it. We believed our biggest fears were impossible, and there’s something quite charming in that. As much as I’m aware that this is why no drugs have been developed that can keep my hand from hurting as I type, something deeply trusting lies at the base of this falsehood. It’s only too bad how much damage the falsehood continues to cause.

  The inexplicable and rampant growth of cells might be considered fascinating, if you have never seen the effects of it in yourself or others. Certainly, the cultural response is fascinating. We talk about beating cancer, run a Race for the Cure, buy pink to fund cancer research. “Fuck cancer!” we say, and we mean it. These are narratives that we are familiar with and that we invest in, financially and emotionally. Color-coded ribbons and sympathetic head-shaving rituals are among the ways we indicate to one another: here is a thing that must be endured, and here is the manner in which I am connected to it. The metaphors may not always be apt, or terribly effective, but they act as scaffolding for further exchange. Even Sontag eventually conceded that such metaphors acted as “the spawning ground of most kinds of understanding.”

  The community may rally for the eradication of cancer, but the prescription pills for its treatment are quite toxic; if you touch them, you must wash your hands immediately afterwards. Because ingesting a drug without touching it can look ridiculous, I sometimes explain to people when I toss back a few pills directly from the bottle, “These are chemo meds.” I say this to avoid looking like a crotchety pill-popping sitcom character knock-off, but my statement has no calming effect. Immediately, whoever I am with will look concerned, or sad, or alarmed. Until I add, “Oh, I don’t have cancer,” and then they look relieved and everything proceeds as if nothing unusual occurred. As if there was nothing wrong, as if there was nothing else to know.

  Autoimmune disorders are also fascinating diseases; they are less about the quantity of cells than about their behavior. They do not create inexplicable growths—they incite cells to attack, and to attack any available cells they can find. This is what makes autoimmunity the perfect response to cancer: an autoimmune system will attack until there is nothing left, and potentially well beyond that. Autoimmunity offers unrelenting but often invisible violence: the sick don’t usually appear ill, diseases may lie dormant for years before emerging through symptoms, and tests are only administered once patients complain—or rather, once their complaints are taken seriously by presiding medical staff, which can sometimes take years.

  Little is known, thanks in large part to Paul Ehrlich, about why these diseases start, how they function, or what triggers them. What is known is that they cause disability, dysfunction, and death, by which I do not mean that they are all fatal—in fact, the number one cause of death of those who’ve received a diagnosis of autoimmune disease is suicide.

  The most significant source of frustration for the afflicted is how few effective treatments have been developed to respond to autoimmune conditions, which is why most of the drugs prescribed have other primary uses. One I took for a while prevents malaria; another, which I do not take, is just gold, injected into the muscles. (I don’t know how this came to be a treatment for autoimmunity, but because I like ridiculous things and was willing to try anything, I requested it. My doctor unfortunately refused to administer it to me, saying it sounded “stupid, like it came out of a comic book.”) No one knows why most of the frontline autoimmune drugs work, in fact, because no one fully understands why the immune system goes haywire in the first place. The chemo meds, for example, are simply thought to throw the body into such extreme distress that it stops attacking itself. If you need these drugs, you quickly get the sense that you should feel lucky anyone bothered to find any treatments that help at all.

  The National Institutes of Health (NIH) funded autoimmune disease research at a measly $850 million in 2016, a drop from 2012’s $867 million; over that same period, diagnoses of individual autoimmune diseases rose between 2.5 percent and 6 percent per year, depending on the condition. Today, around fifty million Americans suffer from autoimmunity, according to the American Autoimmune Related Diseases Association, so an annual increase of 2.5 percent would mean 1.25 million new cases; an increase of 6 percent would indicate three million more.

  Far more funding, of course, goes to the study of effective cancer treatments. In 2016, the NIH dedicated $6.3 billion to cancer research, representing a healthy increase over 2012’s $5.6 billion. However, new cancer diagnoses didn’t increase much during that time—there were 1.6 million new cases in 2012, compared to 1.7 million new cases in 2016, according to the American Cancer Society. An increase of less than 2 percent per year.

  Note, please, that there’s not even enough funding to effectively track the number of new diagnoses of autoimmune disease on an annual basis. Yet even the most conservative estimates suggest that one in twelve people—one in nine women—will develop autoimmunity over the course of their lives. Only one in fourteen, according to the National Center for Health Statistics, will develop cancer, research for which was funded last year at nearly 7.5 times the level of autoimmune disease research.

  In this way, understudied diseases that afflict some fifty million Americans are researched, although primarily for their potential to be used by the makers of such drugs as Opdivo to treat the cancers of some twenty million Americans.5 Sontag’s initial concerns clearly still hold true. The metaphors we use to describe illness limit our imaginations—not to mention the political force and funding pools required to develop effective responses to rapidly spreading illnesses. But the mythology of cancer she explored was rich and elaborate—even if ultimately limiting—compared to that of autoimmune disease, about which little to nothing is known.

  Cancer is known, while autoimmunity remains unknowable. Yet the unknowable has use, it seems, when put to the service of the known.

  Part of this essay was included in a 2015 performance at the University of Illinois at Chicago’s Gallery 400 called “The Queer Crip Narrative.”

  I don’t want to go into it all over again right now, so let me just start by explaining that I almost died, like, eighteen times last year, so I pretty quickly got used to thinking about death in a functional way. Not as something through which I will personally be able to function—that would be ridiculous. Rather, death as an aspect of existence, as a likelihood, even as a future, um, event. One we may never “like” on Facebook, but that we will all attend anyway. More interestingly, I grappled last year with death so frequently, and in such a short period of time, that I started mulling over ways that it might hold value for me, now, while I am alive. Death as imminent, and not wholly unfriendly. Not to say that I am eager to personally embrace it as a state of being, mind you.

  What I am saying is that death—being inevitable, apparently—became at some point for me an intellectual banality, a scheduling concern, a daily consideration. Then, shortly after I became slightly bored by the concept, I got . . . intrigued. High school biology classes teach that the cessation of life is the first stage of a whole other process, one that starts with decay, and then becomes nourishment, which is life-giving. See where I’m going with this? None of this Live every day as if it were your last business. More like: Live every day as if, at the end of it, something new and exciting was going to happen. I started preparing for death, then, materially as well as emotionally. Not mine precisely, but in general.

  In other words, I started a worm bin.

  My move to a Bengali neighborhood in Detroit in May 2016 after more than two decades split between Chicago and various hotel rooms, couches, and flats around the world presented an opportunity to re-establish the foundation of my life without inconveniencing anyone else through divorce, marriage, or birth. I saw the house I was given as a reward for having won The Big Coin Toss, as an acknowledgment that my survival had not been guaranteed, nor would it prove to be permanent. So with little fanfare, I packed up my worms, drove five hours east, and dug my new foundation in the unsolid principles of useful decay. It seemed an appropriate metaphor for that particular moment in my life, after several serious health crises, as well as for the city that has come to symbolize urban ruination around the world. Decay not exclusively as death, then, but as both the end of life and the future life that ruination fosters. That’s the part politicians ignore while warning constituents away from becoming the next Detroit.

  I put seeds in the ground before I’d unpacked a single box and found that I had become a gardener. Not merely one who drops a couple beans in some front plots of dirt to see what will happen, although I have the utmost respect for folks who can keep their hobbies in check. I planted seedlings, invested in literature, and joined associations. I bought books—first one, soon a small library. I eschewed my work—writing, I remind myself sometimes—for projects like stump eradication and research into companion planting. Of course from these efforts naturally emerged the time-consuming lunch experiments, the driving inquiries behind which quickly evolved from “how can I best prepare this vegetable that I recognize and could purchase in the store?” to “this sort of looks like food and did come out of my garden, so I’ll just pop it in my mouth and see what happens.” What’s the worst-case scenario here, it could kill me? runs a query through the back of my mind. Get in line.

  Most significantly, I crafted a compost bin from a couple concrete slabs in my back yard, and then another slightly larger one next to it. I daily fill one or the other with food scraps and cover those with a thin layer of soil. My chronic illnesses come with an array of food restrictions, so I’ve banned what I can’t eat from both my kitchen and my compost bin. Conversely, I’ve forged a pact with my co-conspirators in this life-cycle project—all microbes, insects, and worms—and have eliminated most of what cannot be composted from my diet as well. One should never feed anything to friends, single-celled or otherwise, that one wouldn’t consume oneself.

  I recently added a third bin, about two feet wide by ten feet long, for turning yard waste into usable soil. In sum, then, I manage three distinct plots of land, each devoted to meeting requirements for draining all vestiges of life from particular forms of organic matter for, ah, microbial re-use. There is more physical space in my immediate environment right now devoted to enabling processes of decay than to any of my other obsessions except “books.”

 
  My composting habits are not even limited to the yard. Indoors, I keep the worm bin, a two-tiered contraption that, in theory, allows the European Red Wigglers I ordered from the internet to feed on fruit scraps in the top bin, safely nestled in some wet-newspaper bedding, until their home fills with waste and I encourage them to move house. Then I place bedding and food scraps in the bottom bin and the worms—again, in theory—make their way through the holes that I have drilled in the bottom of each container to their new abode. The notion is that the two-tiered bin allows me to make easy use of their leavings.

  In reality, however—and this is important to the process, I think—the “leavings” we are talking about are worm poop, and the worms like living in it. So when I require some “vermicompost,” I scoop it out and place it directly under a seedling. After snuggling that worm waste in next to some food I’m hoping will eventually happen, I pluck the worms from it, one by one, with my bare hands, and return them to their proper locale.

  The neighbor girls have screeched at this, their parents exclaiming out loud in Bengali at the sight of me digging worms out of some poop that I have just placed next to some vegetables, possibly because they know I will eventually try to get them to eat those vegetables with their mouths. I do not blame them. It is disgusting. There are ways to make the process less disgusting, of course, but I forgo them. They are distractions, I feel, from the process of ensuring that the beings I have enlisted in my agenda of useful decay are well cared for and enjoying their work. I can’t say for sure that I know when my worm friends are happy, of course, but they seem to like banana peels and being left alone in their bin. They are particularly important to the composting process: their waste is especially nurturing for young plants. I treat the worms with care, therefore, and give them only the foods they seem to like, which I gauge by noting how many of them crowd around certain scraps in a wriggling mass, the sight of which cannot be described as anything less than “stomach-churning.” Like the snake-pit scene in Raiders of the Lost Ark. It’s playing out in my kitchen right now.

 
