The Longest Race

by Ed Ayres


  Generally, the aid station tables in an ultra, laden with stuff to eat, don’t appeal to me. In their promotional literature and Web sites, ultras often feature well-stocked aid stations as particular attractions: big spreads of bananas, candy, peanut butter and jelly sandwiches, oranges, Gatorade, potato chips, watermelon, pretzels, Advil, PowerBars, Clif Bars, soup, and, most ubiquitously, electrolyte drink of unknown provenance—the runner’s equivalent of that old summer-camp standby, “mystery meat.”

  These tables always made me slightly queasy. Right now, there was only one thing I felt like eating—a boiled potato. I found one, much to my relief, took a bite with the efficiency of a NASCAR driver getting a tire change, and got back into the race.

  Last evening, for my pre-race meal, I’d had a sandwich. Many of the runners had gone to the big carbo-loading dinner in the hotel’s banquet room, but Sharon and I, and Elizabeth, had decided to stay in our room so I could eat and go to bed early. I’m not a backyard-grill kind of guy, but I’m good with a loaf of whole grain bread. I made us sandwiches of aged cheddar cheese, organic lettuce and tomato, avocado, mustard and mayo, and fruit. To drink, we had unfiltered apple juice and spring water.

  When I was a kid in the 1940s and early ’50s, a sandwich like that would have been hard to find or even make. There were very few “health food” stores, as we called them in those days. The ones you could find sold mainly things like powdered kelp or bottles of cod liver oil or jars of brewer’s yeast. The bread they sold was heavy and dark—I think it was imported from the Black Forest, and a loaf of it was so dense it could have been used for firewood. My mother gave me sandwiches for my school lunch made with what she called “brown bread,” which she baked herself—not as dense as the health-food bread, but still not normal—and I was so embarrassed that I usually threw the sandwich in the schoolyard trash can rather than be seen with it. All the other kids had Wonder Bread!

  I had never dreamed, as an elementary and junior high school kid, that I would soon discover fresh-baked, organic whole grain breads that tasted way better than the cottonlike, bleached white bread I once coveted. (Or that I’d wake up to discover that my mother’s bread was really good, if I didn’t judge its taste by the “yuck!” reactions of other kids!) I never dreamed that by the year 2001 there would be a fast-growing chain of brightly lit, natural-foods supermarkets in most US cities, where you could find beautiful fresh fruits and vegetables that had not been sprayed with pesticides . . . and freshly baked whole grain breads that had no chemical preservatives, and peanut butter that was not processed with hydrogenated fat. The phenomenon of the urban natural-foods co-ops of the 1970s, with their tie-dye T-shirted managers and customers, to say nothing of the Whole Foods Markets and organic-foods sections of supermarkets that would follow a couple of decades later, was not yet on the horizon when I was a kid. But in 1955, my understanding of food underwent a radical transformation.

  My father, a very quiet and stoic man who rarely complained about anything, had been brushed off when he occasionally asked his long-time employers at AT&T if he could be moved to an office where he didn’t have to breathe the secondhand smoke of the other men (all of whom were chain-smokers in an era when a man who didn’t smoke might have his masculinity questioned). He finally succumbed to his chronic asthma and had to take a leave of absence. Somehow, he got connected with a radical doctor, a man who was regarded by the American Medical Association about the same way a woman believed to be a witch was regarded by the Puritan fathers.

  The doctor was blunt. If my father wanted to live much longer, he needed to do three things: Eliminate all highly refined foods from his diet, eliminate hydrogenated fats, and eliminate chemical additives. I don’t recall the name of the doctor, but today I am still amazed at how prescient he turned out to be—and how fortunate (or astute at doctor-choosing) was my father, who followed that radical advice and recovered well enough to live another thirty-five years and continue working at AT&T for another fifteen, without ever getting sick again. (When he died, at age eighty-nine, it wasn’t in a hospital bed, but in his favorite reading chair; according to my mother, he simply put his head back one afternoon and closed his eyes and was gone.)

  As it happened, I decided to adopt that diet too. It was very close to what would later be called a “Paleo” diet. The year was 1955, and Roger Bannister had just set the sporting world agog by running the first four-minute mile. It was also, for me, that pivotal adolescent time when a kid’s interests begin to change. I was thirteen and had discovered that I liked to run, and Bannister’s achievement captivated me. Although I hadn’t had any problem with asthma myself, I’d had a few bouts of bronchitis. And more significantly, as I listened to the maverick doctor talking to my father, something about what he said sounded right. The way Americans ate in the 1950s was an unfortunate change from the way people had eaten for thousands of years, he said. All that Coca-Cola, Wonder Bread, and Crisco was not natural. This was long before the word natural became part of the currency of food marketing and advertising, and the logic of it seemed to me unarguable. If refined sugars and grains could mess up your blood sugar and send you crashing in just hours, as the doctor said, what might they do over months and years? My interest in both avoiding my father’s debilitation and maximizing my potential as a runner galvanized my determination to follow a strict natural-foods diet from then on.

  Somehow I got it into my head that if this diet was good for helping a sick person get well, it might also be good for helping a well person get even more well—or for helping a strong runner get stronger. It was, you might say, my first introduction to the concept of “quality of life,” and without any prompting I joined my father in adopting the new diet. By the morning of the 2001 JFK 50 Mile, I hadn’t consumed white sugar, white bread, or hydrogenated fat in forty-six years. And I hadn’t had a soft drink in my life, since my mother with her brown bread and organic gardening thing had never approved of sodas, and I had just never gotten started. And by 2001, I’d gone about thirty years without meat as well. Maybe the hunter in me, having figured out that I no longer live in a world of vast wilderness, had learned to sublimate the kill. My spear-thrust now would be my final kick to the finish.

  When I went to a natural-foods co-op in the 1970s, I loved the spirit of the kids—it was mostly kids—who were both the managers and customers. The one I went to most often, in Washington, DC, was called Yes! Cities like New York, San Francisco, Boston, Philadelphia, Austin, Seattle, and Portland all had one or more of these places, and the proprietors were usually people who made conscious connections between what they were selling and how they wanted to live. I thought it was no coincidence that the managers of the dominant supermarket chains didn’t seem to make such connections at all—not even now, four decades later. Sometimes I’d walk into one of these chain supermarkets and find myself wondering how it is that those managers can blithely sell cigarettes just a few feet from the fresh vegetables, or products loaded with hydrogenated or “trans” fats right next to products advertising their “fat free” ingredients. An in-store pharmacy will dispense obscenely expensive weight-reduction and heart-care drugs a few feet away from displays of candy, soft drinks, and hundreds of obesity-inducing and arterial plaque-building products. Don’t the executives of these companies see the disconnect? Do they really believe it’s just a matter of “personal choice” whether you get healthy or accidentally kill yourself? As Wendell Berry said, the disconnection was omnipresent.

  But that, in fact, was the essence of the problem our sprint culture brought us. We had created an economy based on thoughtless disconnection. That way, the supermarket manager, who was probably a nice guy, didn’t have to think about the eventual effects of the products he sold—either the effects of how they were produced or of how they’d be consumed. The economy of disconnection, like a classic pyramid scheme, achieved its profits by fragmenting information so that none of the participants could see the scheme as a whole. In a Ponzi scheme, new investors don’t see that their money is being fraudulently used to pay earlier investors, rather than used to grow their wealth. In the sprint culture, the newest apps-happy American teenagers didn’t see that their cell phones, fried chicken, and sports shoes were often being paid for by the lives of other teenagers trapped in sweatshops or caught up in resource wars in Congo or Nigeria or Brazil, and that sooner or later the demons loosed by those wars might come home to roost. What had happened on a regional basis in places like Sudan or Afghanistan or Kosovo, or other internecine conflicts, could continue to metastasize. Antietam could be back, shape-shifted as Orwell predicted, vastly larger as Thomas Malthus and Paul Ehrlich and Lester Brown all warned, and with unthinkable consequences as Ted Taylor and my brother Bob and Mikhail Gorbachev all worried.

  I glanced at my watch and felt another trickle of anxiety. There were a good seventeen miles yet to go, and although I was still on target—barely—to break eight hours, I was pooped. Both my bowels and my stomach were empty. Energy flow is dependent on a continuous process of fuel input and waste output—whether in a single person or a whole civilization. But in both realms, I knew, there was an entrenched misconception—the idea that it’s all about supply. In American industry and policy, energy input was virtually defined as supply. The country’s response to the oil crisis of 1973 had been to do whatever was needed to ramp up supply—whether by drilling for more oil in Alaskan wilderness, lopping off more West Virginia mountaintops for coal, or flexing more military muscle in the oil-rich Middle East.

  The supply-equals-input equation had understandably shaped beliefs about energy inputs in high-energy sports as well. In 1965, a professor of medicine at the University of Florida, J. Robert Cade, concocted a mixture of sugar and electrolytes he called Gatorade, which in the early 1970s fueled a boom in the athletic energy-supply market. In the 1980s, a distance runner named Brian Maxwell started selling a product he called PowerBars, which he later sold to the Nestle Corporation for $385 million. By the 1990s, energy bars and drinks filled whole shelves in supermarkets.

  Among physicists and engineers, however, it wasn’t at all that simple. A nonprofit research group, the American Council for an Energy-Efficient Economy (ACEEE), conducted a breakthrough study showing that three-fourths of America’s increase in industrial output over a thirty-eight-year span had been provided not by increased energy supply but by increased energy efficiency. As a long-time runner, I wasn’t taken entirely by surprise. While the performances of elite runners had improved over the years, they hadn’t improved nearly as much as the consumption of athletic energy drinks and supplements had. When I’d won this race in 6 hours, 4 minutes in 1977, there had been very little in the way of aid stations, and I had consumed only a modicum of homemade energy drink handed to me by Sharon. Yet, last year, in this era of abundant energy supplements, only one of the 703 finishers had run faster on this course than I did twenty-three years earlier.

  The now-too-neglected secret, I knew, was that the body’s output, like industry’s, was more strongly determined by energy efficiency than by supply. Both were needed, but efficiency was by far the bigger factor. Elite runners had to know this, at least subconsciously. Dedicated training has the effect of increasing the distance a particular individual can run per hundred calories consumed. If an untrained man tried to run a marathon simply by relying on supply and consuming an energy bar every half-mile, he’d still probably have to be picked up by the meat wagon before he got halfway through the race. If a friend of his who had trained for months went the whole distance and did well, it would be because he was relying far more on the efficiency with which he used the energy he already had in him when the race started, than on anything he consumed along the way.

  The efficiency secret was little recognized in industry partly because the fossil-fuel companies had a huge vested interest in supply. Greater energy efficiency could reduce demand, which would be bad for fuel sales. But a more hidden reason is that most people seem to have only a sketchy understanding of what efficiency really is. Most of us know roughly what the word means, but little about the science of how efficiency is actually achieved, whether in an electric hot water heater or in an athlete’s body.

  A runner I knew, whom I won’t name because he might be embarrassed by what happened, had assumed what most people assumed: that energy efficiency is a simple ratio between input and output. He had run fifty-milers successfully a few times and decided to enter the famous Western States 100 Mile in California. In his logistical preparations, he thought he’d burn about 150 calories per mile, so he figured—very mistakenly—that he’d need to take in 15,000 calories. Ergo, he’d need to consume around fifty PowerBars. As might have been predicted, by the time he got to forty miles and had choked down about twenty of those bars against the protests of a seriously rebelling stomach (the guy wasn’t listening), he was both very sick and exhausted.
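
  His arithmetic, reconstructed from the figures above, ran roughly like this (the 300-calorie figure per bar is my inference from his fifty-bar plan, not something he specified):

      100 miles × 150 calories per mile = 15,000 calories
      15,000 calories ÷ roughly 300 calories per bar ≈ 50 bars

  What that tidy supply-side sum ignored was the fuel he was already carrying at the starting line: on the order of 2,000 calories of stored glycogen and, even in a lean runner, tens of thousands of calories of fat. The limiting factor was never how much he could stuff in along the way; it was how efficiently his body could use what it already had.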

  The poor guy’s thinking was a perfect reflection of the way Americans have misunderstood energy in the economy at large. A salesman for gas-fired hot water heaters, for example, might tell a customer that his product has 85 percent efficiency because only 15 percent of the heat is lost “up the stack” and therefore 85 percent is going into the water. But a physicist would point out that the gas flame is much, much hotter than the heated water coming out, so a lot of the work that could have been done by that flame has been wasted. In other words, it’s the work output or loss, not energy output or loss, that should be measured. According to the first law of thermodynamics (the law of conservation of energy), energy itself can’t be used up or lost. What matters is the efficiency with which energy is converted from one form to another, whether in producing hot water, electric power, or the contraction in a runner’s legs.
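
  To put rough numbers on the physicist’s objection, using illustrative temperatures of my own choosing rather than the specifications of any actual heater: suppose the heated water comes out at about 330 K (a tap-hot 57°C or so), the room sits at about 290 K, and the gas flame burns at close to 2,000 K. The most work that heat delivered at 330 K can ever do, relative to 290 K surroundings, is capped by the Carnot factor:

      1 − 290/330 ≈ 0.12

  Treating the fuel’s heating value as roughly equal to its work potential (a fair approximation for natural gas), the heater’s efficiency measured in work rather than in raw heat comes to about

      0.85 × 0.12 ≈ 0.10, or roughly 10 percent

  instead of the salesman’s 85. Same heater, same gas bill; all that changed was the yardstick, from heat delivered to work accomplished.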

  In 1972, a government consultant, Jack Bridges, using the same unscientific logic that the hot-water-heater salesmen used, presented a report to the US Congress asserting that the United States as a whole was using energy with an average efficiency of 50 percent, which he assured the politicians was so high that further gains from efficiency could not be relied on, and that the country would therefore need to build hundreds of new nuclear power plants by the end of the twentieth century. It was the same kind of thinking that led that Western States newbie to think he’d need lots and lots of PowerBars. But when my physicist brother Bob did his own calculation of US energy efficiency, based on actual engineering principles (and later confirmed it during his tenure as a professor of engineering and public policy at Carnegie-Mellon University), his US average came to roughly 10 percent. That left room for huge efficiency gains. And as the ACEEE study later confirmed, that was just what occurred over the three decades following the Bridges report—resulting in continued economic growth without any new nuclear plants being constructed. And even then, our industries had barely begun to scratch the surface of what they could do to boost output without additional input. Our economy, I thought, could learn a lot from a long-distance runner.

  I turned my head to look at the sun—a weak winter sun, really. This would be about as warm as it would get today. And that could be a problem. In a long-distance run, you’re always juking with balances, no less than a basketball player driving to the hoop through a fast-moving, defensive phalanx of opponents’ hips, arms, and hands. For a runner, it may be different balances, but—maybe surprisingly—it’s no less complex. As I’d reminded myself over two hours ago, If you want to make God laugh, tell him you’re going to run down the Weverton switchbacks fast without a scratch. The complexities keep bringing new challenges to those of us who run on two legs.

  At this point in the race, my challenge was just the simple motion of turning my head. In my training runs, I’d noticed a curious pattern: After a two- or three-hour run on trails, it was hard to turn my head—my neck was too sore! My routes all started with about a mile on a road before I got to a trailhead, and I had to turn my head to watch for cars when I crossed the road. No problem, of course. But coming back and re-crossing the road, hours later, my neck would be very sore and stiff, even more than my calves or quads—which, in fact, were rarely sore at all. Sometimes, it was actually easier to slow down and jog a little half-circle on the shoulder so I could see down the road without having to turn my head all the way. Why?

  The answer, I eventually learned, was that the ability to turn the head, independent of the shoulders and torso, is one of those key traits that launched humans on the epic journey that led—for good or ill—to the invention of civilization. I don’t mean this metaphorically, although being able to look back in time may indeed be a prerequisite to being able to plan ahead. Here I mean literally. It’s like the old folk song, “The foot-bone connected to the knee-bone, the knee-bone connected to the thigh-bone, the thigh-bone connected to the hip bone. Oh hear the word of the Lord!” Neck-turn is anatomically connected to the ability to run long distances over rough ground; that ability is cognitively connected to the ability to anticipate what’s on the trail ahead; and the ability to anticipate is neurologically connected to the ability to envision, and plan, and invent.

  And just what was that connection between head turning and civilization inventing? After professors Bramble and Lieberman discovered that the modern human’s ability to swing the shoulders and arms while keeping the head fairly still was critical for enabling an animal on just two legs to maintain physical balance, it struck me that the converse must be true as well. On a rough trail, there might be hundreds of bumps or ruts that threw the body slightly this way or that, while the head and neck instinctively countered in order to maintain balance and stay on course. In effect, keeping the head steady in its orientation to the horizon while the torso danced left and right was functionally equivalent to repeatedly turning the head left and right while the torso remained steady. That ability to turn the head and maintain balance was essential to the persistence runner’s ability to chase down other animals, thereby reinforcing the abilities to endure and envision that were the precursors to invention and civilization building. Meanwhile, by the end of a two- or three-hour run on rough ground, the neck muscles had had almost as much of a workout as the legs.

 
