Body Horror
That symptoms just as mysterious as Marian’s would become quite common in the decades that followed The Edible Woman’s publication was certainly unforeseen. But because mysterious feminine digestive troubles and invisible ailments have started to gain medical recognition in recent years, they have lost some of their comedic bite. In fact, “half-serious,” “dementia,” and “lying” are only a few of the pejorative descriptors the Times lends a scenario in which a woman simply can’t eat whatever is put in front of her. It is only through the steadfastness of Atwood’s pen, it seems, that we refrain from considering Marian a wholly unreliable narrator. “[P]eople suck, in general, when it comes to accepting and respecting IBS as a real issue,” Rachel writes in a blog post entitled “Wedding Series: In Laws.” I’m apt to let her speak for the millions of other sufferers of understudied diseases. Today, people with similar symptoms are often passed from doctor to doctor for several years—the average is seven—before receiving a diagnosis. Many watch as their symptoms are written off as psychological—the medical-industrial complex’s version of the Times’ “lying.”
The problem, however, might not lie in millions of women’s heads. A 2001 study at the University of Maryland found that, although women in general have lower thresholds for pain than men, and tend to experience it for longer, their symptoms are often treated less aggressively.6 In emergency rooms, for example, women wait for pain medication an average of sixteen minutes longer than men, and are 13 to 25 percent less likely to receive an opioid medication. A separate study in the esteemed New England Journal of Medicine showed that, among cancer patients, women were significantly less likely than men to receive adequate treatment for their pain.
Researchers remain baffled by these findings, although Hoffmann and Tarzian told lifestyle news outlet Mother Nature Network in 2015 that they’d come across a handful of recurrent explanations in interviews with medical staff. These include presumptions that men can manage pain better and don’t complain about it until it is “real”; that some women’s ability to birth children implies an ability to manage any amount of pain; and that women tend to exaggerate and complain rather than describe sensation accurately.7
The world of medicine, in other words, tends not to believe women when they say they are in pain, which, in the realm of invisible illnesses like Marian’s, is often the only symptom on offer. Fold in a host of known food-related disorders—anorexia, bulimia, etc.—and we can detect a general cultural tendency to label women’s problems with food as little more than psychosomatic: mere psychological issues undeserving of deeper study, never thought to be exacerbated by external or environmental forces, or indeed to be outwardly provable in any way.
Marian never describes the full range of her bodily failures, either to another character or to her audience. It’s a trick of Atwood’s, to force the reader to believe a female character’s version of events in order to follow a storyline. It also saves the character from the embarrassment that millions of other women have experienced, of being told—by acquaintances, loved ones, and doctors—that their illnesses are all in their heads.
You’ve heard of celiac disease, a diagnosis given to two to three times as many women as men, in which the body’s immune system attacks itself when triggered by the ingestion of gluten. Perhaps you’ve even reposted on social media one of the many articles claiming to debunk the wheat protein’s link to physical discomfort: a shareable, friendly way to discredit women’s pain. (I know I have.)
However poorly the precise mechanics of the relationship between certain foods and autoimmune disease may be understood, foodstuffs beyond gluten have been linked to certain autoimmune responses. Usually, autoimmunity is considered a medical mystery: for unknown reasons, doctors say, the body’s immune system turns on itself in the same way it would attack a parasite, virus, or other foreign invader. The resulting inflammation causes pain as well as physical impairment. Yet some health practitioners—naturopaths, in particular—root autoimmune disease in food sensitivities.
Several food-elimination protocols are therefore suggested for the autoimmune: some name-brand, like the Paleo Diet, and others more tailored to individual responses, like the low-FODMAP diet (to cut down on short-chain carbohydrates), elimination programs (to identify problem foods), low-histamine diets (to quell allergic responses), and rotation diets (for those who can’t tie their symptoms to any specific food, beyond the act of ingestion itself). Such diets, however crazy they may sound to those who have not tried them, do tend to work for a large number of people.
Despite extensive anecdotal evidence, however, scientists have been slow to look into a relationship between consumption and autoimmunity. The reasons for this are likely myriad. While many in the post-Sanders campaign era would tend to blame Big Pharma’s exclusive focus on profits for the holdup, it may equally be due to the gender of the majority of sufferers. Although medicating illness is profitable, there are surely enormous profits to be made in staving off illness, too, if the price point for continued health is set high enough. It seems more likely that the tendency of the sciences to overlook autoimmunity is rooted in the low numbers of women in STEM (science, technology, engineering, and mathematics). Women chemists, for example, make up only 35.2 percent of the field, and women chemical engineers only 22.7 percent, according to analysis by the National Girls Collaborative Project. Nor are there enough women funders to ensure such studies take place: only 4.6 percent of 2016 Fortune 500 CEOs were women, according to Catalyst, an organization that tracks women in business leadership. Women, earning on average 77 percent of what male counterparts do in the same jobs (and making $15,900 per year less than men in STEM jobs, according to a 2013 report by the US Census Bureau), do not currently have the economic clout as a class to fund or demand such studies. Anyway, those who perceive a clear need may be too ill to mount a campaign, or already wrapped up in the sustaining care work of blogging.8
Only recently, therefore, has a connection been proven between food and these poorly studied ailments. A June 2015 report in the peer-reviewed journal Autoimmunity Reviews found that common food additives contribute to intestinal leakage, which creates the conditions for autoimmunity. These additives are named in the report: “Glucose, salt, emulsifiers, organic solvents, gluten, microbial transglutaminase, and nanoparticles are extensively and increasingly used by the food industry, claim the manufacturers, to improve the qualities of food,” note authors Aaron Lerner, a professor at the Technion–Israel Institute of Technology, and Torsten Matthias, of the AESKU.KIPP Institute in Germany. The additives, they continue, also:
increase intestinal permeability by breaching the integrity of tight junction paracellular transfer. . . . It is hypothesized that commonly used industrial food additives abrogate human epithelial barrier function, thus increasing intestinal permeability through the opened tight junction, resulting in entry of foreign immunogenic antigens and activation of the autoimmune cascade.9
In plain English, consuming these additives leads to intestinal breakdown—commonly called “leaky gut syndrome”—which allows for the autoimmune response to occur.
The report cites the rise in autoimmune disorders—many diseases have three times the diagnoses that they did three decades ago—occurring primarily in nations with high rates of processed food consumption. Also noted are the specific disease categories with rapidly rising numbers of diagnoses (neurological, gastrointestinal, endocrine, and rheumatic), as well as the geolocations of the diagnosed, which seem to indicate that environmental, and not genetic, factors are the primary reason for the uptick in these ailments. The report describes a corresponding rise in the use of food additives, intended to increase “the world’s capacity to provide food through increased productivity and diversity, decreased seasonal dependency and seasonal prices.” (In Brazil, for example, the years 1987 to 2003 saw a 46 percent increase in the intake of processed food in the average household. Virtually unheard of in the country three decades ago, rheumatoid arthritis today affects around 1 percent of the Brazilian population, and the incidence of psoriasis stands at 2.5 percent. Far more worrying is the country’s Zika outbreak: cases of the virus grew twentyfold between 2014 and 2015, and the virus is linked to Guillain–Barré syndrome, an autoimmune disease that causes paralysis.)
To recap: the seven additives listed above are now being found in more foods. Those foods are eaten in more households around the world. And in those households, bodies with previously healthy immune systems are becoming dysfunctional. Causality, the authors warn, “has not been proven”—the report shows only that these particular food additives contribute to leaky gut syndrome, from which the autoimmune response follows. “Precise mechanisms responsible for the development of nutrient-induced autoimmune disorders are unknown,” the authors contend.
Still, recommendations based on these findings are in order. “[I]ndividuals with non-modifiable risk factors (i.e. familial autoimmunity or carrying shared autoimmune genes) should consider decreased exposure to some food additives in order to avoid increasing their risk,” the report states. The authors also strongly recommend strengthening FDA nutritional labeling standards—policies that have been designated “barriers to trade” by lobbyists and are being modified or dropped entirely10—and further studying the impact of food additives on immune systems.
What this means is that individuals may be “going autoimmune” due to personal consumption habits. Yet autoimmune diseases in general—their worldwide spread, their increasing diagnoses, and their worsening symptoms—are likely triggered, at least in part, by the far-reaching machinery of globalized food production.
A food industry reliant on additives to ease its own spread throughout the globe has become central to a socioeconomic system based on private ownership of the means of production. McDonald’s stands as a shining example. In the decades since The Edible Woman first appeared, the chain has been documented using food preparation techniques, from farm to table, that are questionable at best and extremely dangerous at worst; has opened outposts in regions formerly hostile to Western presence; and—Marx would be impressed by the company’s allegiance to his definition of capitalism—has exploited its workforce to such a degree that the fight to raise the minimum wage to $15 per hour is commonly staged at the burger chain. (The company had just opened its one thousandth restaurant and expanded to all fifty states when The Edible Woman came out; today there are over 36,000 restaurants in 119 countries around the world, according to the company’s own website.)
Michael Pollan describes a visit to McDonald’s in The Omnivore’s Dilemma, where he muses on the poultry-like meal his son orders. “[T]he most alarming ingredient in a Chicken McNugget,” he explains:
is tertiary butylhydroquinone, or TBHQ, an antioxidant derived from petroleum that is either sprayed directly on the nugget or the inside of the box it comes in to “help preserve freshness.” According to A Consumer’s Dictionary of Food Additives, TBHQ is a form of butane (i.e. lighter fluid) the FDA allows processors to use sparingly in our food: It can comprise no more than 0.02 percent of the oil in a nugget. Which is probably just as well, considering that ingesting a single gram of TBHQ can cause “nausea, vomiting, ringing in the ears, delirium, a sense of suffocation, and collapse.” Ingesting five grams of TBHQ can kill.11
The mainstream food movement in which Pollan plays a significant role has given us a contemporary understanding that the standard consumption habits of the Western world can be quite damaging to consumer health.
Of course, there is also a disease called consumption that in the nineteenth century killed as many as one in four Brits. Susan Sontag traced its earliest usage to 1398 in Illness as Metaphor. “Whan the blode is made thynne,” writes John of Trevisa, “so folowyth consumpcyon and wasting.”12
“Consumpcyon” was the common name given to tuberculosis, a disease that at one point was as mysterious and misunderstood as autoimmune diseases are today. Mysterious, but ever-present: the ill were described as languorous, hollow-chested, romantic, and pale. They appeared to be in the process of being consumed.
It’s a markedly different metaphor from the one that can be applied to the autoimmune, who have less a disease in the traditional sense than a dysfunction. However much autoimmunity may be triggered by food consumption—and despite the fact that many can control the negative effects of these disorders, to at least some degree, through restrictive diets—the autoimmune are not being devoured by any malevolent outside force. The bodies of the autoimmune attack themselves. In some cases, yes, eating away at themselves over time—but the immediate symptoms of inflammation occur at the site of self-generated attack. In fact, what sets autoimmune disorders apart from every other public health crisis in history is that the whole language of “fighting” disease does not apply: the body must instead be soothed into remission, must learn to lay down its weapons entirely.
This is the logic behind restrictive diets: that certain foods trigger the attack more than others. It isn’t an explanation that makes much sense to medical professionals, however. Doctors who don’t laugh outright when asked about connections between autoimmunity and food are few; the more tolerant among them often leave one with the sense of being humored. (“There are people who feel better when they don’t eat certain things” is a common medical response, if an unhelpful one.) Only a handful of doctors in the US believe food to be a primary trigger in autoimmune disorders that aren’t situated in the intestines, like rheumatoid arthritis, lupus, and psoriasis. Even Lerner and Matthias’s groundbreaking studies on leaky gut syndrome have been slow to filter through the medical world.
In some ways, the reluctance of the medical profession to acknowledge that disorders suffered by one fifth of the US population are partially triggered by modern food production is understandable. Recent changes to the global supply and processing chain were made, as Lerner and Matthias acknowledge, to increase nutritional access and limit hunger. Admirable goals, to be sure, and certainly one could argue that slight suffering among the few is a reasonable price to pay in pursuit of increased health for the many. Yet such a justification can only hold for so long before changes become necessary to restore public health. The incidence of type 1 diabetes rose 23 percent between 2001 and 2009, according to the American Diabetes Association, and the stakes are significant: autoimmune diseases as a class are thought to shorten a patient’s lifespan by eight years.
Some have also found their symptoms worsening and their triggers increasing in number. Celiacs who have trouble managing their symptoms, for example, are now urged to avoid dairy as well as gluten, which some suggest is an indication that dairy proteins have begun to mimic the wheat protein. Others, already diagnosed, are simply accruing more diseases at a seemingly unstoppable clip, regardless of how well their original symptoms have been controlled through standard food-restriction protocols, a possible indication that more foods trigger the autoimmune response than is currently suspected. On several concurrent fronts, the negative effects of globalized food production appear to be accelerating at an alarming rate.
Marian solves her problem with a symbolic gesture: a small cake replica of herself that is offered for Peter’s consumption. He doesn’t want it, so she eats some—a miracle!—then offers the rest to another man. It’s a joke about consumption. Who has the right, and the obligation, to consume? The joke is on Marian, for she must be consumed, however symbolically. The unremitting inevitability of consumption is, of course, capitalism. Yet the joke is also on women like her, real-life women like Rachel. For Marian’s symptoms went away without any medical intervention. Perhaps it was all in her head after all? The reader is left to wonder.
Not all readers, of course: millions of women around the world know that their bodies are failing not through any mental or emotional flaw, but because the system under which they live is causing damage. They feel it as clearly as Marian did. Today, bodies regularly grow intolerant of production lines, global distribution, and decisions made with only profits in mind. Like Marian, women throughout the industrialized world are no longer capable of consuming what is on offer. Their bodies, too, are rejecting capitalism.
An excerpt from this essay was published as “The Planned Cow” in Women’s Review of Books.
I confess that I relied on the same party trick for nearly two decades: presenting as a well-educated, upper-middle-class white woman of able body, I’d engage in polite but witty banter with fellow revelers until the topic of The Future would arise, and what each of us desired from it. Then came my time to shine! It made no difference in what terms The Future was being discussed—political, economic, domestic—because in every scenario women are consigned to limited roles, and my trick hinged on this fact. When it was my turn to speak (I liked to build tension by pausing to apply a fresh coat of lipstick), I would reject the options presented me. “I want to continue doing exactly what I am doing,” I would say. I was single, writing, and traveling extensively. “I am happy.”
My declaration would first be met with silence (satisfied twenty- and thirtysomethings are apparently rare enough to stun). Then a well-intentioned and kindly voiced line of inquiry would emerge: “What about kids?”
“Kids?” I would say, gazing at a far corner of the room, as if pondering the existence of the younger generation for the first time. As if I hadn’t been challenged on my disinterest in motherhood the night before, the week prior, the month previous, and for several consecutive years and now decades before that in a relentless social rejoinder to my autonomy. “No,” I’d say thoughtfully, feigning consideration. “Not interested.” And I wasn’t.