For it turns out that the standardization of time was enormously controversial. Bartky describes explosions and people shooting out the massive town clocks that the railways had installed—monuments to a hated temporal uniformity—in protest of the stripping away of individual determination over when things were going to happen.
What people were protesting the loss of, and how they got around even on trains before the introduction of standard time, was talking to other people. Business owners were angered by standard time because a centrally installed clock meant no one had to step into their stores to inquire what the local time was. Unmarried young people were sad because eligible hotties passing through had no excuses to start conversation. And train porters—perhaps the most frustrated of all—saw full minutes shaved off of needed rest stops at several different points in their already long, overworked days.
Before standard time, in other words, when you went somewhere new, you had to get to know people to figure out how things worked there. The locals liked it. It seemed to work for everyone, in fact—except the astronomers, the railways, and the rich, who collectively found it easier to track planetary changes and fire people for being a few minutes late to work or for taking too-long lunch breaks.
The plaque itself2 commemorates Sunday, November 18, 1883, the “Day of Two Noons.” On that date, the plaque explains, astronomers at the Allegheny Observatory at the University of Pittsburgh transmitted a signal at noon on the 90th meridian, and railroad clocks were reset to it. The plaque was presented to the Continental Bank by the Midwest Railway Historical Society on November 18, 1971. Note that the plaque simply restates what I have already told you: the astronomers gave the signal, the railways capitulated to it, and the banks celebrated it.
A few years ago I accrued several debilitating diseases in a process some call “falling out of time.” I now function on crip time, which, to crips, means that we operate on a different schedule. We require more time to perform certain tasks than is usually allotted under the regimented, efficient system of standard time. The phrase is also used disparagingly. If you are invited to an accessible event, perhaps with ASL translators or requiring complicated maneuvers to allow wheelchairs entry, the able-bodied man sitting next to you may joke about crip time, by which he will mean that the event is starting later than he would like it to.
I should mention that I was born on a reservation in South Dakota, where we had a similar concept named Indian time, but described it otherwise. Indian time is the awareness that time—standard time in particular—is a construct of capitalism, and the doings of animals like people are not beholden to patterns of efficiency or imperialism. Nothing need proceed until the various spirits beckon them to convene, which is why I once spent four days waiting for a guy to teach CPR to my camp counselors, so I’m not saying that it doesn’t take some getting used to. Indian time is differentiated, in South Dakota at least, from slow time and fast time, because the latter refer to Standard Time Zones, which are sort of arbitrarily adhered to in certain regions of the state based mostly on whim. If one is late for a meeting, for example, one does not apologize by saying one is on Indian time; one says one has accidentally set one’s watch to slow time. For if one is genuinely on Indian time, there is no reason to refer to other formulations of time, because they do not matter.
This same concept is called something else in Latin America, and in the Styrian region of Austria, and outside of Tbilisi in the Republic of Georgia. From what I can gather they don’t bother calling it much of anything in the provinces of Cambodia, because folks will just get to stuff when they’re ready, if it really needs doing, and actually, why would you care what needs doing and what doesn’t. Why don’t you have a nap? It’s hot out.
This is the basic principle at work in crip time. When you get sick, it becomes clear real fast when something doesn’t really need doing after all. I’m finding, more and more, that what I don’t need to do is calculate from any street corner how to get to my next destination. In fact, I no longer wish to be destination-driven at all. So, nearly twenty-three years after I arrived, I’m leaving the grid a second time. While it’s still technically true that I’m not getting any younger, I no longer care that I am getting older. In fact, a certain number of disease diagnoses in, I’ve learned to relish it.
To my friends in Chicago, stuck on this grid for a while, or a lifetime, I leave you this thought: standard time, as natural as it now seems, has only defined certain humans for 135 years. It was implemented by scientists, bankers, and railway owners to ease their workloads and tax yours, in a decision that kept people from interacting with each other, from getting to know each other’s needs and interests. Standard time need not last forever. Before its implementation, there were local times, and before that, crip time, or Indian time, maybe slightly different notions with separate sets of values distinguishing how we, as individuals, might prioritize our days. Nonstandard times still exist, everywhere, and serve to remind us that we matter as individuals, that our sense of well-being and personal abilities may not be best served by zones, alarms, deadlines, or grids.
I’ll leave you with this thought, sent by a friend in Uganda, where certain celebrations, like weddings, parties, and graduations, start late on purpose. The things that really matter deserve your patience, and starting on time might signal to the audience that they do not truly deserve the experience of the event.
“There is so much value,” Asia writes from her home in Kampala, “in subverting standard time in big and small ways.”
An excerpt from this essay was delivered at “First Time” at the Miss Spoken Reading Series at the Gallery Cabaret in Chicago.
Already, bacteria are the stuff of nightmares: creatures invisible to the naked eye, able to hand off distinctive features without the generational lag of biological evolution—oh, did you need a slightly more protuberant arm, or sharper teeth? Perhaps thin, pointed spikes that emerge from your core? Take mine, please—and are estimated in sum to number somewhere between two and a half million trillion trillion (on the low side) and five million trillion trillion (although that, admittedly, seems like a lot). Some bacteria are shaped like hot dogs, desiccated after too many days in the sun, and others resemble extremely hairy tampons with reactive, probing tails. There are spherical versions, too, with barbed talons or smaller, welty growths, and crumply ovoids covered in abscesses, seemingly pre-wounded. Add that they’ve been living on earth for somewhere between 3.8 and 4.1 billion years, and can be found inside your own body right now, and your pulse may quicken. But superbugs—bacteria that use all the aforementioned abilities, experiences, and tendencies to ward off antibacterial drugs—inspire a particular terror. Every substance designed to eradicate them only makes them stronger.
So you’re probably already quaking in fear over the pending superbug apocalypse, that not-so-far-off day when disease-causing bacteria develop genes to ward off all available and potential antibiotics, and humans become mere breeding ground for invisible foes. But if you aren’t already a practicing or even latent germaphobe, a quick peek at the news might turn you, real fast.
Reports of the superbug apocalypse have been hard to avoid: “The emergence of ‘superbugs,’ and the devastating threat of antibiotic resistance, is no longer a prediction. Last month, for the first time in the United States, a strain of E. coli resistant to colistin, an antibiotic of last resort, was found infecting a Pennsylvania woman,” read a late June 16, 2016, report from Boston.1 “Superbug Is a Wake-up Call,” echoed a headline in Pittsburgh early the next morning.2 Around the same time, Mid-Missouri Public Radio posted this item: “‘Superbug’ Found in Illinois Meatpacking Facility.”3 The same day, the health-focused Alternative Daily ran “Move Over Zika: A Superbug Hits Brazilian Beaches”—in case you thought the enemy was confined to domestic shores.4 In fact, the enemy is not confined at all. “One in three seniors is discharged from hospital with a superbug on their hands!” cries Newsmax.5
These are just a handful of the dozens of articles on drug-resistant bacteria published within a single ten-hour stretch, culled here to exemplify the overarching public narrative (in case you somehow missed it): SUPERBUGS ARE COMING FOR YOU!
It’s far more than a media-constructed narrative, however, for the World Health Organization (WHO) issued a strongly worded warning about the perils of growing antibiotic resistance a few years back: “[T]his serious threat is no longer a prediction for the future,” it begins. “It is happening right now in every region of the world and has the potential to affect anyone, of any age, in any country.”6 The concern at hand isn’t merely that emerging and particularly dangerous bacterial infections can spread, either. It is that all previous infections, treatable and considered eradicated by modern medicine, will re-emerge. A WHO spokesperson elaborates: “The world is headed for a post-antibiotic era, in which common infections and minor injuries which have been treatable for decades can once again kill.”
The threat is hard to square with our daily, lived experience of the workings of the human body, a gloriously designed and fiercely self-protective object, far more complex than any single-celled organism. When an invader—say, a knife or a virus—enters a healthy body, that body will launch a defense that continues long after the initial slice or sneeze. Our wondrous protector, the immune system, labors subconsciously and immediately. Once antigens (the knife, the virus) have invaded a body, the immune system springs into action—first identifying, then attacking, and ultimately eradicating the invader. The process hinges on a built-in intelligence network, an antibody that molds to each new incoming threat. If a once-bested antigen returns to launch another attack, the binary code set off by an antibody match triggers a swift response. That response can be final—this is why most people only ever get measles once, and why flu shots work. The second time an infection or virus appears, in the form of an antigen your antibodies recognize, you may not even feel it. It is an exceedingly clever system, so brilliant, in fact, that medical science cannot fully account for its workings. And so effective that many living under the careful watch of such a powerful and intelligent protector have never bothered to give the mechanism of their guardianship any thought.
For an increasing number of people, however, learning how these protective networks function has become a central concern of their lives, if only because their own immune systems have turned against them. Autoimmune disease diagnoses have risen so dramatically that the epidemic is best understood anecdotally: fifteen years ago, before the American Autoimmune Related Diseases Association (AARDA) was formed, approximately sixty-seven diseases were classified as autoimmune in nature.7 Today, this number has grown to more than one hundred, and there are forty other disorders in the process of gaining recognition. This means that the list of autoimmune diseases gains a new entry, on average, every five months. The magnitude of the problem is therefore growing rapidly, as is the incidence of individual ailments: long-established autoimmune disorders, including multiple sclerosis, celiac disease, and lupus, are rising as well. To give just two examples, diagnoses of rheumatoid arthritis in women increased by 2.5 percent every year from 1995 to 2007, with the disorder today afflicting between 1.3 million and 2.1 million people in the United States alone.8 Meanwhile, rates of type 1 diabetes over the past four decades have increased 6 percent per year in children under four, and 4 percent in children aged ten to fourteen.9 Overall, the National Institutes of Health now estimates that 23.5 million Americans suffer from some form of autoimmune disease, although this figure is based on studies that include only a quarter of the currently recognized disorders. AARDA places the prevalence somewhere above fifty million—roughly 16 percent of the US population.
Simply put, autoimmune disease is a malfunction of the immune system. While a healthy immune system responds only to harmful, invasive antigens, an autoimmune system can respond to anything as if it were an antigen. Because the immune system is diffused throughout your entire body, this can take place anywhere. Really. Your own blood, the cells of the liver, the food you had for lunch—each, depending on the nature of the disorder and the systems affected, can be falsely identified as the enemy and attacked to the point of annihilation. Thus, the body might destroy its own lungs, its bones, its very heart. To inhabit such a body means living with severe pain, of course, but also debilitation and incapacity. And what’s to protect you, then—the immune system?
If you’re not alarmed yet, you should be. This is the real-life flipside to the superbug scaremongering that dominates news cycles with every new outbreak. Unlike superbugs, autoimmune disorders appear to have their foundation in genetics, tend to afflict more women than men, and aren’t typically diagnosed until a patient is late into, or just past, child-bearing age (most are diagnosed at forty or older). Thus, the autoimmune are passing along the potential for autoimmunity before they’re even aware of the malfunction, so diagnoses are expected to increase exponentially in coming years. And while there are records of patients experiencing remission for unknown reasons, there are no known cures, and few effective, safe treatments. Disorders that are autoimmune in nature typically get progressively worse over time, and for some patients they multiply. Yet not a single article on the rising threat of autoimmunity was published on domestic news sites during the ten-hour period of frenzied superbug reports tracked above.
Autoimmune disorders are genuinely terrifying, and the lack of public knowledge of them, despite their increasing frequency, only contributes to the shock awaiting the newly diagnosed. Those afflicted may be advised, by smart mental-health experts, to avoid Internet message boards devoted to their disorders and not to name their diseases at parties. The former will surely induce an unhealthy level of panic, and the latter may elicit tales from unthinking acquaintances about aunts/mothers/sisters with similar conditions, who died in some spectacularly gruesome manner, perhaps at their own hand. That relevant audiences may be cautioned away from, or scared off of, one of the only public sources of information on autoimmune disease only contributes to the grand reserve of mystery surrounding autoimmunity. The lack of reliable information also does nothing to stem the nightmares, which may incorporate more mundane horrors as well as the outright barbarity awaiting patients in the doctor’s office. With standard treatment often come drastic lifestyle changes to reduce stress: a strict eight-hour sleep schedule, daily yoga, a severely restricted diet (goodbye, gluten, soy, dairy, corn, sugar, and nightshades; farewell, alcohol). Steroids are common, with their ever-evolving potential for bodily horrors. On the more extreme side of the spectrum, doctors may prescribe weekly injectables known to cause debilitating side effects, sudden death, and even other autoimmune disorders; if patients eligible for biologics are lucky, their doctor will exercise caution and stick to a conservative drug regimen more commonly found on the oncology ward. Instead of being administered temporarily, as they are in cancer treatment, these lower-dose chemotherapy drugs may be prescribed to the autoimmune for life. Even this may not describe the most extreme terror recited as common among the autoimmune: many with shifting symptoms or the most debilitating of effects are likely to hear, at some point or another, that doctors have no idea what is going on with their bodies. Therefore, top medical professionals may say, nothing can be done at all.
To underscore the point: nearly fifty million Americans are afflicted with diseases involving pain, inconvenience, frustration, lack of empathy, rage, misdiagnosis, fear, debility, and (far too often) death. The good news is that these diseases may offer the best hope for survival, come the superbug apocalypse.
To quell fears of succumbing to bacteria carrying the drug-resistance gene MCR-1 or to a particularly virulent strain of salmonella, we’ll need to look more closely at the human immune system, normally tasked with fending off such pesky critters. Rather, we’ll need to look at what is known about the immune system, and what has been assumed, and how much of this knowledge is now being discovered to be wrong.
The human immune system is so poorly understood that it remains unclear whether autoimmunity results from a breakdown of the intelligence team or of the muscle pulled in to do the dirty work on its behalf. To get slightly more technical, the key component of the immune system is the leukocyte, or white blood cell, the avowed enemy of antigens both real and misidentified. Two types of leukocytes make up the immune system, and they hide out in all the nooks and crannies of the body: phagocytes, which primarily attack bacteria, and lymphocytes, which develop into either B lymphocytes (the immune system’s command unit) or T lymphocytes (The Heavy), commonly known as “killer” T cells. (The metaphors used to describe the immune system are predominantly military, and thus position the population most afflicted with autoimmunity—75 percent to 90 percent female, depending on diagnosis; most middle-aged or above—as adolescent boys thrilling to such a display of force, power, and concision acting as our protector against the cold, cruel world.)