The Longest Race


by Ed Ayres


  In a way, I was even a bit envious. Wouldn’t it be great if, after this race, I didn’t have to go right back to my office on Monday?

  Hobbes, of course, lived too soon to appreciate those hunter-gatherer studies—as did, for that matter, the teachers who instructed my generation in school. So, our view may have been badly distorted by what our teachers did know—that farmers in medieval times lived a life that was, unquestionably, nasty, brutish, and short. But at least they’d had hovels. It was easy to assume that if prehistoric people had to live in caves instead of at least having thatch-roofed huts to call home, and had to go search for something to eat in a wilderness full of dangers instead of at least being able to grab a scrawny chicken from the yard, life for them must have been even worse. In recent decades, however, anthropologists confirmed that most humans never lived in caves to begin with, and that the remains of those who did tell a very different story.

  What the Hobbesians could not have known was that the life of a Paleolithic persistence runner-gatherer was evidently far less deprived and unhealthy than that of a medieval serf. “It’s almost inconceivable that Bushmen, who eat 75 or so wild plants, could die of starvation the way hundreds of thousands of Irish farmers and their families did during the potato famine of the 1840s,” wrote Jared Diamond.3 By medieval times, and continuing into the time of the Irish famine and then our time, the domestication of plants and animals had greatly simplified human food sources and drastically reduced the genetic diversity and nutrient content that nurtures health and immunity to disease. And as a result, the average heights of humans had been severely stunted with the rise of civilization. Skeletal remains confirmed that by the Middle Ages, average heights had drastically shrunk from Paleolithic times, by half a foot—to five feet, three inches for men and five feet for women. Civilization had enabled people to mass-produce food for the fast-growing population, but the reliance on agriculture to produce it had separated most people from their wild environs, and the farmed food was now far less nutritious than the wild food had been. It was the first step toward what we in the twenty-first century would call “empty-calorie” food—the kind of monoculture product that would later, as people also became less physically active, cause rates of obesity, diabetes, and heart disease to soar. Civilization had taken a face-plant, at least in Europe. That “worst mistake in human history,” said Diamond, had been “thinking human history over the past million years has been a long tale of progress.”4 Some kinds of progress have allowed for a gradual recovery of normal physical height in recent centuries, but continuing erosions of biological diversity—and our consequent vulnerabilities to degenerative disease and ecological collapse—have only worsened.

  Glancing at the cliffs across the canal, I couldn’t see the cave opening. I’d found photos taken by hikers and spelunkers, and I knew that while some of the openings would be on the cliff side, visible from the towpath, others—including Killiansburg—were apparently farther up over the rim, hidden by woods. I thought of that more famous cave, called Lascaux, which was also well hidden when first discovered by twentieth-century farmers in France, but when explored revealed astonishing evidence of what may have been the very rich life of prehistoric people—not rich in the powers and products of technology, but in spirit, wonder, and appreciation of the world in which they hunted and gathered.

  The paintings on the walls and ceilings of Lascaux are mainly of large animals—the very animals that persistence hunters would have pursued and killed. There are horses, deer, wild cattle, wild cats, and a lone black bear. The rooms and ceilings on which they’d been painted had been given evocative names by their discoverers: the Hall of the Bulls, Chamber of the Feline, Panel of the Chinese Horses, Ceiling of the Red Cows, and Frieze of the Small Stags.

  Beyond the artistic exuberance of the paintings (one room had been called the “Sistine Chapel of Prehistory”), two things particularly struck me about this cave. One was that the animals depicted were not particularly indigenous to France as we now know it, but seemed to evoke far-flung parts of the world. Either the people who made those paintings had traveled prodigious distances by foot, or climate change and, later, the coming of civilization, had wiped out much of the wildlife that once thrived in this region. Or, most likely, both: the painters had traveled from afar, and the region’s wildlife had since been largely decimated.

  The other thing that impressed me about Lascaux was that humans were not the featured attraction. The artists’ main interest—the object of what had obviously been great awe and reverence—was the big game. That was a telling phrase, I thought: the big game, which is not just about large animals, but about the nature of our species’ life on earth. Millennia later, when civilization arose, artistic expression would become brazenly human-centered, and Michelangelo’s Sistine Chapel would depict our Creator in human form—and humans as the favored beneficiaries of that creation. Maybe it was inevitable, after the building of walled cities and systemic separation of people from the wild world that had long been our home, that we would begin to forget—to unlearn—how dependent we are, for every bite and breath we take, on the thousands of other life forms that are now struggling to survive in the places we have not yet plowed or bulldozed.

  In twentieth-century America, maybe in an instinctive as well as educated reaction to the Industrial Revolution and its escalating effects, a wide range of radical thinkers began identifying human-centered views of the world as having become, ironically, the greatest threat to the human future. “A focus on anthropocentrism as a major root cause of the ecological crisis had been made by Thoreau, Muir, and more recent writers such as Robinson Jeffers, D. H. Lawrence, Aldo Leopold, Aldous Huxley, and Loren Eiseley,” the philosopher George Sessions recently noted.5 Their work launched the modern environmental movement, a guiding premise of which was that we humans cannot dominate the earth for long, and that we will need to rediscover its nature—and our own—or risk losing it.

  Modern runners, perhaps especially in our migration from the roads to the trails over the past twenty years, were perhaps beginning to do some of that relearning, even if we didn’t think of it in those terms. As Paul Shepard wrote, “Men are born human. What they must learn is to be an animal. If they learn otherwise it may kill them and life on the planet. It is very difficult to learn to be an animal. First man must unlearn his conception of an animal as a brute.”6

  The paintings at Lascaux seemed to leave signs of a society in which neither the animals nor the humans were seen as brutes in the sense Hobbes meant. Among the scores of paintings, none depicted a hunter triumphantly bringing down his prey or lording it over a fallen beast. None offered the Paleolithic equivalent of a moose head on a hunting-lodge wall. The animals commemorated by those paintings were not trophies, but beautifully alive, and I felt sure they had not just been lugged home and devoured, but revered and thanked for giving their lives so that the hunters and their families might live. There had been no psychological separation between the acts of killing an animal, appreciating it, and eating it.

  By the time of industrial agriculture, the separation brought by domestication—not only of plants and animals, but of us, the humans who’d accomplished it—was more or less complete. To a twentieth or twenty-first-century American, a hamburger is not the remains of a steer. The animal isn’t what you think of when you stand in line at Burger King. Bacon is not a pig, and “wings” are not pieces cut off of a bird. Our disconnection from the sources of our food is part of a larger disconnection that now endangers our whole future. The poet-farmer Wendell Berry, in his 1977 book The Unsettling of America, wrote:

  The modern urban-industrial society is based on a series of radical disconnections between body and soul, husband and wife, marriage and community, community and earth. At each of these points of disconnection, the collaboration of corporation, government, and experts sets up a profit-making enterprise that results in the further dismemberment and impoverishment of the Creation.7

  Up ahead I spotted mile post 73 and struggled to do the calculation: Post 70 had been about mile twenty-eight of the JFK course, just past Antietam, so this should be about thirty-one miles into the race. Oddly, although running always stimulated my imagination, it somehow made even the simplest arithmetic more difficult. Left brain/right brain? I have no idea.

  As I lifted my arm to check my watch, a young Marine pulled even on the left, matching strides with me. He glanced at my elbow and leg, and said, “You’re bleeding, sir.” I laughed, and then I thought, Good God, I’m not just losing water through my pores, but through my knee! I was also a little too woozy (why the stupid laugh?) and knew I must be getting dehydrated. I’d need to do some serious drinking at Snyder’s Landing.

  At the same time, I was feeling another lift. For one thing, thirty-one miles is fifty kilometers, so I was now moving into true ultra territory. For another, evidently not all of the Marines had been ahead of me after all. And right now it was high noon, time for a showdown! My better angels cautioned me, though—I’d need to be patient for another couple of hours at least. And I didn’t carry a gun, like all the cowboys in High Noon did; I just carried a water bottle. The only animals I’d ever killed were a copperhead I almost stepped on once, about two feet from the door of my cabin, a rattlesnake that got too close to Elizabeth when she was six, and one groundhog. The snakes I had killed with a shovel, with great reluctance, and the groundhog with a rock, in a sort of accident.

  I felt bad about the groundhog. It was the summer of my nineteenth year, and I was out for a long run in the Green Mountains of Vermont on a trail I’d never been on before. It would soon be dark, and I wondered if I might be lost—and felt a trickle of adrenaline in my gut. Just then, crossing the trail about ninety feet ahead of me, I saw a groundhog. Without thinking, I picked up a stone about the size of a baseball and threw it at the animal, of course assuming I’d miss. I’d been a pitcher in Little League one summer and hadn’t been very good at finding the strike zone. To my great surprise, the stone hit the groundhog on the head and it fell over, dead. It was as if running had put my whole body, including my shoulder and arm, more in synch. Also without thinking, I picked up the animal and carried it back to the camp where I was staying. The next morning I skinned it, imagining that I was an Indian and was obligated to make something useful out of the pelt. But no one had ever taught me how to cure a pelt, and after a few days I had to bury it.

  Now, four decades later, I often wondered what it was like to be a persistence hunter—which, genetically, I still was. The University of Utah researcher David Carrier had provoked the curiosity of his mentor, Dennis Bramble, about that with his landmark “Energetic Paradox” article in 1984, observing that humans actually ran with lower energy efficiency than the animals they chased—yet prevailed in pursuits over long distances. How could that be? In subsequent years, Bramble and his colleague Daniel Lieberman, a professor of human evolutionary biology at Harvard, undertook a series of investigations to determine what anatomical and physiological features distinguish a vertebrate that primarily runs from one that primarily walks. Several million years ago, the earlier hominids had been walkers.

  Bramble and Lieberman identified twenty-six specific characteristics that separated the runners from the walkers, and found—much to the surprise and fascination of scientists everywhere—that in every one of the twenty-six categories, modern humans belong to the running group. At least one of those features, bare skin, was uniquely human. And as the investigation progressed, it became clear that in humans particularly, some of the other features that enhanced our running abilities—including the shape of our heads and structure of our necks—were intimately related to the function of our bare skin.

  I didn’t know it at the time of the 2001 JFK race, but Bramble and Lieberman were on their way to a momentous breakthrough in our understanding of human evolution—and indeed of human nature itself. The anatomical studies were compelling, but evolutionary science had long been wedded to the idea that humans evolved as walkers, not runners. The famous discovery of the early hominid “Lucy” had fascinated the scientific world with the revelation that this most ancient of women had walked on two legs. And she was anatomically adapted to walking, not running. The new anatomical findings needed to be reconciled with the hominid fossil record, and the breakthrough would come with an observation that—once it was articulated—gave the scientists an almost epiphanic clarification. Lucy had been a member of an earlier species of hominid, Australopithecus afarensis, which walked the earth for over two million years, long before Homo sapiens emerged. Lucy’s species had a considerably smaller brain than modern humans have and even over those two million years did not begin to show the rapid development of modern humans’ ability to envision, anticipate, and persist—an ability that arguably originated with the complex endeavor of the persistence hunt.

  “The feature that differentiates hominids from other primates is not large brain size, but the set of characteristics associated with erect bipedal posture and a striding gait,” David Carrier wrote.8 The epiphany was that the development of modern humans’ bigger brain and the evolving strategy not only for surviving but for becoming conscious explorers, adapters, and manipulators of our environment coincided with the appearance of traits that enabled a transition from bipedal walking to long-distance bipedal running, and especially to running in a hot climate.

  “There were 2.5 to 3 million years of bipedal walking [by Australopithecines] without even looking like a human,” Bramble would write. “So is walking going to be what suddenly transforms the hominid body? . . . No, walking won’t do that, but running will.”9 Adaptation to endurance running, for example, altered the human torso to allow the shoulders to turn independently of the head, which allowed the head to remain fairly still even as the arms swing, thereby enhancing stability and balance—an important development for an animal running on two legs instead of four. Joan Benoit recalled in her 1987 book Running Tide that on the day of the very first race she ever ran, in a track meet at the age of eight, she watched an earlier event that made a lasting impression on her. It was an 880-yard run (half-mile) for teenage boys, and she was fascinated. “I was trying to figure out what made the lead runner look so good,” she wrote. And then she saw the answer: “His head . . . hardly moved—he used his energy to power his legs.”10 But, of course, even though his head hardly moved, his arms were swinging. She took that observation to heart and won her own race. And nineteen years later, she would win the gold medal in the first-ever Olympic marathon for women.

  Reflecting on the epic journey our species had taken since that prehistoric transition from the relatively unchanging life of Australopithecines to the running and then racing and then all-out sprinting life of Homo sapiens, I thought of the mission statement that Ted Taylor and my brother Bob had written three decades ago about the conundrum of ever-expanding human capability: “Technology, for all its impact on the world, cannot be ‘good’ or ‘bad.’ It can save lives and it can kill; it can intensify the pleasures of life or it can fill the world with misery.” It’s what we do with the technology, including agricultural technology, that makes all the difference. And it struck me that that’s equally true of what we do with our bodies, which were the original models for—and controllers of—our technologies. Over the past century, in the thrall of the Industrial and Post-Industrial revolutions, we had increasingly marginalized and even abandoned our bodies.

  President Kennedy had seen it, even if his picture of what had happened was still incomplete. Possibly the most egregious aspect of the abandonment had been the spreading belief that the good life is a life of ever-increasing convenience and ease, in which hard work is no longer necessary. But hard work (as distinguished from brutish work!) was what brought the early human hunter to the feast and to the celebration of his good fortune, and it was what inspired his art. And we were his direct descendants; we had the same genes and anatomy, the same deep drive to envision and pursue and persist—and work hard, as long as we could breathe. And, of course, we had the same bare skin to cool the heat of our extraordinary metabolism.

  I thought of that familiar phrase, “No sweat!” which so aptly expressed how duped we’d been by the lure of an effortless, technology-augmented existence, and it seemed to me that that expression could be an epitaph for a species that loses its way. End war? No sweat! Lose weight? No sweat!

  Despite the wind off the Potomac, I could feel the sweat running down my face and neck. I really needed that refill, and, thank God, Snyder’s Landing was just minutes away. I eased up a little and was immediately passed by a navy kid and a slim young woman running side by side. I felt a bit of older-man’s envy, but also felt my spirits lift. Both he and she, like the Marine who’d passed a few minutes ago, looked quite liberated from the imprisoning stereotypes I’d grown up with: the hardened warrior with his heavy equipment, boots, and armor; the vulnerable female with her nylons and girdle and falsies and heels who couldn’t possibly do something like run a long distance. Even as recently as the early 1980s, the Olympics had not allowed women to compete in races that would last longer than about five minutes. These two looked light on their feet, relaxed, conversing amiably—just a couple of young people enjoying the day, in a world that might have a future yet.

  9

  Snyder’s Landing

  The Energy-Supply Illusion: Carbo-Loading, Body Heat, and Naked Skin

  At a lot of the aid stations I see in an ultra, I don’t stop. Or I pause just long enough to swallow a paper cupful of water and refill my hand-bottle with electrolyte drink—and then hurry back to the trail. I don’t want to rush but also don’t want to waste any time. At Snyder’s, though, I needed to stop. Support crews had difficulty getting to the aid stations along the towpath via narrow, unmarked rural roads on which it was easy to get lost, and I had asked Sharon not to try. I would cope.
