This Is Running for Your Life
The author and publisher have provided this e-book to you for your personal use only. You may not make this e-book publicly available in any way. Copyright infringement is against the law. If you believe the copy of this e-book you are reading infringes on the author’s copyright, please notify the publisher at: us.macmillanusa.com/piracy.
For my parents, and their parents
Contents
Title Page
Copyright Notice
Dedication
The Uses of Nostalgia and Some Thoughts on Ethan Hawke’s Face
The Dream (Girl) Is Over
Have a Beautiful Corpse
One Senior, Please
Beirut Rising
War and Well-Being, 21° 19'N., 157° 52'W.
Pixelation Nation: Photography, Memory, and the Public Image
Do I Know You? And Other Impossible Questions
The San Diego of My Mind
Ways of Escape
Acknowledgments
Copyright
The Uses of Nostalgia and Some Thoughts on Ethan Hawke’s Face
Let’s call it the theory of receptivity. It’s the idea, often cited by young people in their case against the relevance of even marginally older people, that one’s taste—in music or film, literature or fine cuisine—petrifies during life’s peak of happiness or nadir of misery. Or maybe it’s not that simple. Maybe a subtler spike on the charts—upward, downward, anomalous points in between—might qualify, so long as it’s formative. Let’s say that receptivity, anyway, can be tied to the moments when, for whatever reason, a person opens herself to the things we can all agree make life worth living in a new and definitive way, whether curiosity has her chasing down the world’s pleasures, or the world has torn a strip from her, exposing raw surface area to the winds.
During these moments—sleepaway camp right before your bar mitzvah; the year you were captain of the hockey team and the baseball team; the time after you got your license and before you totaled the Volvo—you are closely attuned to your culture, reaching out and in to consume it in vast quantities. When this period ends, your senses seal off what they have absorbed and build a sensibility that becomes, for better or worse, definitive: This is the stuff I like. These films/books/artists tell the story of who I am. There is no better-suited hairstyle. This is as good/bad as it gets for me.
The theory suggests that we only get a couple of these moments in life, a couple of sound tracks, and that timing is paramount. If you came of age in the early eighties, for instance, you may hold a relatively shitty cultural moment to be the last time anything was any good simply because that was the last time you were open to and engaged with what was happening around you, the last time you felt anything really—appallingly—deeply.
I worry about this theory. I worry because it suggests that receptivity is tied closely to youth, and firsts, and also because as with many otherwise highly rejectable theories—Reaganomics and communism come to mind—there is that insolent nub of truth in it.
* * *
My worry started a couple of years ago, when I felt myself separating, effortlessly and against my better judgment, from what I had unwittingly been a part of for two decades: the Next Generation. It was when I noticed myself taking a step back from the yellow line when the R train blew into the station, rather than a step forward, the way I used to, the way the kids flanking me still did. It was when I began giving more than a passing thought to the age of the people around me—then much more, to the point where I find myself calibrating the age of new acquaintances as a matter of course, ranking a given group in a way that is new and troubling to me, not by interesting eye color or willingness to engage intelligently or suspected willingness to engage carnally or crack comic timing (not actually true; I will always rank by crack comic timing) and not even specifically by age but by age in relation to me.
For me it was the subway thing. For others it’s a first gray hair, death in the family, weeklong hangover, or the moment an indie sensation’s breakthrough single comes to sound like a family of flying squirrels recorded it in a cutlery drawer. I’ve never done it before so I can’t say for sure, but it feels like a peculiar time to greet the first intimations of mortality. Marketing demographics have put us all on a strict schedule, one that ties a person’s relevance to the years when he’s considered most receptive. It may be that Nielsen-style demos are the clearest terms we’ve come up with to gauge social standing between the onset of legal adulthood and retirement age. Taste and other less participatory cultural alignments have come to situate individuals in specific eras, dividing generations by band preference or favorite cereal or national disaster, and creating a powerfully unified sense of time’s passage that is otherwise pretty hard to come by. We’re especially primed, in passing out of that long stretch of peak market desirability, to reexamine our relationship to the culture, which means to examine our relationship with time. But modernity’s strange intermeshing of futurism and nostalgia has made time an elusive, sometimes contradictory source of information.
Looking to your peers for a sense of temporal equilibrium can be equally confounding. It seems to me that, between about age twenty-eight and maybe age forty-three, there now exists a gray area, where anyone within that range could be any age within that range. Rather than relaxing into the shared parameters of a condition traditionally known as adulthood, this cohort gets the most freaky and pedantic about how old they are and how old the people within that same cohort might be. Age panic has an earlier onset but more superficial proportions; it’s more often tied up in the byways of vanity or status insecurity than given to the headlong realization that we’re all going to die die die.
Nothing makes the paradox of the way we now experience time more plain than clock-watching the end of youth. It would appear the clock has come to rule every aspect of who we are and what we do, from daily, micromanaged routines to five-year plans to deciding to marry while you’ll still look good in the pictures. And yet we are utterly careless with time, from passing ungodly stretches of it in a state of many-screened distraction to missing that part where you come into a sense of yourself as an adult reconciled with your own mortality and that of the people you love. Entering the prime of adulthood lends a new and dizzying urgency to the polarity of that relationship. Something important is supposed to be happening, but no one can quite say what, or when—only that it had better happen soon, ideally before it happens to that other guy. Unless it’s bad, in which case reverse that last part. Until whatever it is that’s supposed to happen happens, all we have in the way of orientation to a mean are numbers, and so we look to them.
The generation that felt some pride of ownership over the tech revolution is currently passing through this shadow demographic, eyeballing each other grimly as the teenage natives snicker over our digital accents. By thirty we had gone through four different music formats, which necessitated four different buying cycles, which brought about four different opportunities to revisit the question of where we stand on Alice in Chains. It all feels a little rushed, doesn’t it—the crucible of confronting one’s own taste and the terms on which it was formed? Somehow the decision to purchase Siamese Dream on iTunes—the old CD too scuffed for a laptop’s delicate system—seems fraught with the weight of a larger commitment, like renewing marriage vows, or making some more furtive, less romantic acceptance of the inexorability of the bond. Did I even choose Billy Corgan, or did he choose me?
For all the dorked-out debates about sound quality and texture and ear feel, the music is exactly the same; only time can clarify your relationship to it. But when that happens, we’re meant to feel sheepish about aligning ourselves with the past, as though, despite the fairly obvious contiguousness of our bodies and minds and excellent memories for Beastie Boys lyrics, there’s no viable, meaningful way to tie it with the present. And, more curiously, despite the pathological pseudo-nostalgic recycling that defines modern popular culture. But then maybe the theory of receptivity has pivoted its allegiances toward a technological sensibility, where content is content and it matters less whether you listen to Dead or Alive or deadmau5 than how you listen to the latter’s inevitable remix of the former.
In which case the question of taste and cultural alignment recedes, and examining our relationship with time means examining our relationship with technology—which has come to feel cultish, if not cultural—and vice versa. From a certain angle the entire digital shebang has consisted of dreaming up more and more sophisticated ways to contain and control time. Our devices offered themselves as second or even substitute homes, a place where time was both pliable and strictly monitored. We unfastened our watches, which suffered from always and only moving forward, and adopted new and advanced timekeepers that did whatever we wanted with time: tell it, track it, transcend it, save it, spend it, defy it, kill it, record it, recall it, replay it, reorder it, relive it. Very often they do all of those things at once, in the name of simply staying on top of it.
Keeping impossibly current has become the key selling point of smartphone connectivity. In recent ads for one network provider, two men are delivered a seemingly fresh piece of news by a third party as they gaze into their devices. “That’s so [comically small amount of] seconds ago,” the pair smirk each time, to the third party’s great humiliation. To really get the jump on time requires dedication, and if that dedication looks like total enslavement to the uninitiated, let them enjoy the view from the pastures.
Moving at a pace that is not just fast—we’ve been fast—but erratic and discontinuous makes defining a self against some shared sense of time untenable. And yet we persist in seeking a binding cultural memory, a common frame of reference—perhaps out of habit, perhaps because it still feels like the best hope for a stable, unified identity. But modern cultural memory is afflicted by a kind of dementia, its fragments ever floating around us—clearly ordered one moment, hopelessly scrambled the next. It’s easy to feel as though upholding a meaningful continuum between the present and the past might require leaving the culture you mean to restore.
One result of this would seem to be our habit of leaning on fractional age distinctions and investing meaning in invented microgenerations, if only to get clear of the chaos.
People manifest this kind of thing differently. “Your friend mentioned her age a lot,” a friend of mine said after a dinner party recently. “It was weird.” He had just turned thirty-one and hated admitting it. My other friend had just turned thirty-four and couldn’t stop talking about it. “Maybe she’s trying to remind herself,” I said. The numbers only sound more unlikely as they mount. At a different party a few weeks later, I was sent on an El Caminian emotional journey when someone I barely knew asked me, in front of a roomful of people I didn’t know at all, how old I was. Whereas it would have been an ordinarily rude and possibly condescending line of inquiry even, say, five years ago, something about the question and the context had a political, almost passive-aggressive tang. Just what was it he wanted to know, or to tell me? He was thirty-four—older than I was, but only a little, as was everyone else in the room. I know this because most of them were television actresses whose heads swiveled in my direction as the question was being asked, not unkindly but with big, expectant eyes. I want to say—and suspect it could be true—that an actual hush fell upon us.
The dude was history. It was the actresses I looked up the next day. But then I find myself on IMDb fairly regularly lately. I am suddenly in possession of the ages of a number of actors and actresses whom I have been watching for years, in some cases decades. We’ve been the same age the whole time, as it turns out; it just never occurred to me to check, or to care, until now. Movie stars existed in a distant, ageless realm; even those cast to represent a specific stage of life weren’t confined by time in quite the same way. I remember watching E.T. and finding Drew Barrymore’s charming little-girl-ness exotic, as though she were showing me something I wouldn’t otherwise know. But we were almost the same age. I know that now.
One of the great, time-released pleasures of moviegoing is watching the actors of your generation grow older. Maybe pleasures isn’t precisely the right word—but maybe it is. With time comes the impulse to seek out evidence of accrued wisdom, pain, or contentment—the mark of experience—in their faces. This one had a baby; that one just lost her dad. Along with the R-train moment, for me it was watching Ethan Hawke in Before Sunset that left no doubt: this thing was really happening. Life had begun to show itself as more than a series of days, or movies, all in a row, which I might or might not attend.
In Sunset, set and shot nine years after Before Sunrise’s slacker riff on one enchanted, European evening, the characters played by Hawke and Julie Delpy reunite for a late-afternoon walk through Paris, delicately stepping around the last decade’s worth of disappointment and longing. Perhaps the one striking formal difference between the two films is that Sunset takes place in real time, where Sunrise uses elliptical fades to tell the story of an entire night spent wandering around Vienna. Time had become more present, and the present moment more urgent.
Loping through the Latin Quarter in 2003, Hawke appears gaunt and slightly stooped and basically body-slammed by time. But it was his face—with its rough skin, scored forehead and sunken cheeks, and, especially, the deep, exclamatory furrow wedged between his eyes—that transfixed me. Some said he’d come through a divorce, and it had taken its toll; that’s what life does to people. I’d heard about such things but never seen it rendered so plainly, and on the face of someone only a few years older. It was shocking, even a little horrifying. And yet so marvelous to see, so unexpectedly righteous and true. Testify, Ethan Hawke’s Face, I thought. Tell it for real.
If they last long enough and have earned a large enough share of our hearts, movie stars are often cued to acknowledge time’s work on-screen. Traditionally, either a mirror or a younger character reflects the bad news, and we pause to consider it with them. At sixty-one, Katharine Hepburn gave herself a rueful once-over in The Lion in Winter. Forty-eight-year-old Marlon Brando was taunted by his teenage lover in Last Tango in Paris: “You must have been very handsome, years ago.” In The Towering Inferno, Fred Astaire (then seventy-five) got the former treatment and Paul Newman (then forty-nine) the latter.
Pauline Kael was galled by this kind of thing. “It’s self-exploitation and it’s horrible,” she wrote about Hepburn’s pantomimed requiem for her beauty. But then Kael didn’t foresee the coming rarity of actors aging normally on-screen; nor, of course, the futility of an actress fudging her age on IMDb. Neither character acknowledges Hawke’s transformation in Before Sunset, probably because flashbacks to the previous film, and his previous, almost unrecognizably vernal self, make the point more poignantly than a more direct reference could.
I must admit, I was never much of a fan. I remember finding Hawke too on the nose, somehow, too much the thing he was supposed to be—always an actor first instead of a living, changing, insinuating being, someone who demanded watching. Of the many things I failed to imagine back then, watching Before Sunrise, I could not have conceived of a future in which a reprise of his role would feel like an act of generosity. I could not have fathomed feeling so grateful to Ethan Hawke for lending his face to a handmade, jewel-cut meditation on what life does to people—a slow-cooked sequel to a film about those too young and smitten to be concerned about what life might do to them. And what was life doing to me? I worry.
* * *
I worry, specifically, about 1999.
* * *
The year didn’t register too broadly on my personal barometers—fairly crap, if nowhere near as crap as 2000. But it was an extraordinary year in film. It was, I am prepared to argue, one of the greatest years in movie history, and certainly the best since I had been alive. Possibly the best year in the second half of the twentieth century, but there’s a two-drink minimum if you want me to summon the table-rapping righteousness I’d need to go that far. As I make this case ten and more years later, the rebuttal is increasingly written on the impossibly pink and pinker faces of my sparring partners: But you were really young in 1999. That’s the last time you felt anything really—appallingly—deeply. I call the theory of receptivity and now find you slightly sadder than before.
But it’s just not true. I was young, yes, but I was a terrible young person—an embarrassment to my kind, really. And 1999, for me, was a nonstarter; hardly a time, I think, when I found things new and exciting because I was young, or that I now associate with my new and exciting, young world. If anything, I felt old and worn-out and generally skeptical, and it just so happened that the only thing I was good at was spending a lot of time alone, in the dark. I never saw more movies in a single year, it’s true, but it was my great good fortune—and I remember thinking at the time, Can you believe this? Again this week?—that so many important directors of the last generation and the next one seemed to be cramming their best work into the final seconds of the century.
Why the confluence? Hollywood’s obsessive chartings account for little beyond box office, and even some critics get their narrow shoulders up about making lists of a year’s best and worst films. But perhaps patterns should form the beginning of a story, not an end. Perhaps a run as hugely, almost freakishly accomplished as 1999’s holds meaning that we can’t get at any other way. If they’re anything like Ethan Hawke’s face, anyway, the integrity of the patterns formed in our culture can at least remind us of exactly how many miles can be racked up over ten years—what a decade looks and feels like—which is handy information, especially for those who have not yet developed a sense of it for themselves.