
This Is Running for Your Life


by Michelle Orange


  Pixelation Nation

  Photography, Memory, and the Public Image

  History is embedded in every inauguration-night image of President Obama, but for me only one says it all. Three years later, the original of this particular image was hard enough to turn up that I briefly wondered if I had imagined it. Cropped for clarity, it would look much the same as what you’re envisioning now: Barack and Michelle Obama, the first black president of the United States in the arms of his black wife, smiling and slow dancing as they are serenaded by Beyoncé—the world’s foremost pop star, who also happens to be black—on a proscenium that seemed to have lowered from the sky for the occasion. It’s a campaign manager’s dream, the very picture of hope and change. At last!

  It’s the uncropped version, though, that vexed me. Granted, the margin of context in the Obama photo I’m talking about has more in common with, say, a moment-killing pan from Elizabeth Taylor and Montgomery Clift kissing in A Place in the Sun to the nearby grip wiping mayo off his shirtfront than it does with the sinister element hidden behind Vanessa Redgrave in that Blow-Up shot, or Hitchcock’s camera showing us a knife rising behind a soapy, unsuspecting Janet Leigh. And yet, the scene beyond that proscenium seems like a pretty essential clue; without it you get a nicer picture but only half the story.

  But then as trained aesthetic consumers we prefer our defining public images well composed and to the point. For instance, were it not similarly cropped for clarity, the most notorious image of the torture perpetrated at Iraq’s Abu Ghraib prison between 2003 and 2004—of local community leader Ali Shalal Qaissi balancing on a wooden box with his arms outstretched, his fingers wired for electrocution, his head hooded and body draped in black cloth—might have made an even more horrifying impression. Edited out of the shot that inspired its own Banksy stencil and landed on the cover of The Economist below the words “Resign, Rumsfeld” is the schlubby outline of some guy. Standing in profile, maybe three feet in the foreground and off to the right of the hooded, electrified prisoner, some guy is a brush-cut brunet in belted khakis and an olive-green golf shirt. The wedding ring on some guy’s left hand is poised just above his gently thickened middle, and he’s peering down into a digital viewfinder of his own, as though he’s just taken a snap of his four-year-old twins posing with Pluto on the Magic Kingdom promenade and wants to make sure everybody’s eyes are open.

  On the morning after the January 20, 2009, inauguration, I was most struck by an image of the presidential waltz taken from deep in the crowd: Barack and Michelle embrace like lacquered wedding toppers in the middle distance; between our photographer and the first couple stand a phosphorescent crop of cameras, phones, and camera phones, all raised high in a kind of holy gesture of affirmation. The aliens might assume the cameras are part of a blessing ritual, glowing amulets bestowing good luck. That assumption would be close but ultimately too kind. The aliens would probably figure that out when they discovered the same ritual surrounding the fatal beating of a Chicago kid in junior high, or the gang rape of a Vancouver schoolgirl. Or when they got a load of some guy scanning his camera’s screen while a torture victim teeters nearby.

  If cameras were originally used, as Susan Sontag memorably put it, to collect the world, the atomic device known as the digital camera has more of a self-reinforcing quality, sucking a fluid moment in at one end and spritzing its owner with eau de permanence out the other. Whether the images are moving or frozen hardly matters anymore. That only a thumb-toggle divides the two introduces a kind of interchangeability; each one can become the other at your command. Especially when they are held out blindly in big crowds, the screens that have replaced the traditional viewfinder appear to function as a kind of second subjectivity, a third eye to cope with a world that is less often collected with any kind of discretion than amassed in daily reality dumps. So that to raise a camera is mostly to remind yourself: Right now I’m here; I’m here right now.

  I suppose it goes without saying that, even as I shook my head over the inauguration-night photo’s landscape of pale, Promethean torches, their same periwinkle shadows painted my face. Sitting at my laptop, I wondered what difference it made, when technology offers such persuasive surrogates for seeing the world, whether you experienced that night with the help of a three- or a thirteen-inch screen. After all, that was kind of the point of communication and broadcast technologies—bringing us together, eliminating obstacles of access, equalizing an experience or event. But images like that of the new president and first lady make me wonder at the thoroughness of the job. What difference does it make that I wasn’t able to actually witness this historic event when it appears the majority of the people who were there couldn’t quite bring themselves to show up?

  Is that unfair? Very well then it’s unfair. But even if we are to agree that inserting a camera between yourself and your immediate surroundings, or raising that surrogate eye, does not in any way affect the experience of those surroundings—does not swaddle you up in a sense of impartiality, or shift the burden of action—the question remains, What’s the deal with that? What’s the deal, especially at public events inevitably recorded by professional equipment and pinged instantly around the world, with the compulsion to add your funky G3 shooter to the mix? Are trophy pics even possible in a world where all is photographed?

  In the digital age, everything survives in an equal perpetuity, so that to experience the world through images is less and less to be rewarded by pleasure or insight and more and more to be afflicted with a kind of hysterical reality blindness. Some claim digital celibacy—we’ll call them liars—while others end their days with the numbed insensibility of a triage nurse on Flickr’s teeming front lines. Even the most cheerful digital creators and consumers are sometimes overcome by the odds of a race between infinite content and their two little eyeballs.

  But then getting eyes onto every image is no longer the point. Or at least, a post-Soviet case of inflation has caused any given image’s valuation to plummet. Those of us compelled to slog through every one of a friend’s seven hundred wedding photos, or each album of a weekend away, need not worry so much about hurt feelings. A complete lack of audience will hardly inhibit a steady upload stream, any more than it stops us from living our lives. For every personal photo disseminated through some form of media, dozens more are the result of pure reflex and languish until giga-space is needed for more like them. The act of shooting, not necessarily its smeared result, is now in many ways the point of photography, which has become more medium than message. I can only imagine that the bulk of the cell-phone videos shot on inauguration night now rest in unvisited digital tombs.

  * * *

  At the 2011 SXSW music, film, and interactive festival in Austin, Texas, the Q&A session that followed the world premiere of a documentary about a crappy year in the life of talk show host Conan O’Brien was lit in part by the audience’s hoisted iPhones. The energy of a room changes when this staggered, Lite-Brite wall goes up; we move from audience members to viewers, a seemingly minor but when you think about it kind of massive shift. It is as though—as we so often feel with celebrities and indeed as celebrities often feel themselves—there would be little point to an event that was not photographed. Instead of helping to create a moment, we insert a remove from it; instead of feasting our eyes, we make a formal claim on what they see.

  After ninety minutes of listening to Conan O’Brien bellyache about the hardships of high-stakes showbiz, I didn’t feel bad for him until he stepped out onto the stage of the Paramount, and a theaterful of journalists and partisan moviegoers lifted their phones in a kind of inverted salute. Tiny images of Conan and his director filled the theater like backward mirror shards; on a screen hovering next to me, he looked much farther away than the Gumby-legged figure a few rows off. Another MPEG to fatten up the old blog, I guess. Another thing that happened, if we still agree to the barest terms. It is considered more accurate and more interesting to say “another thing that happened to me.”
  All right, then. After all, in at least one sense it did happen to you. You flew to freaking Texas, where for five days in March junior Google techs attempt to spawn in Austin’s Red River District; you braved the melee at Madison Square Garden (“Put your cell phones away and put your cameras down,” pop minstrel Lady Gaga commanded during a recent—and televised—concert there, “’cause this is only going to happen once”); you murdered that plate of buttermilk fried chicken; you boned a B-lister in a Reno hot tub; you were the point man in a brutal and sustained frat hazing; your first flight was to Baghdad, and anyway at Abu Ghraib, as Ali Qaissi told The New York Times in 2006, “All the soldiers had cameras.” Things happen to us as they ever did. It may be just as obvious to note that the way we experience those things, and the way we then frame that experience, and the way that those framed experiences are remembered have changed.

  * * *

  The still camera’s earliest shills endowed it with the power to create memories. If such claims were to be believed—and it seems they were—the more photographs taken, the richer our individual and collective memories. The iPhone promised not only to create but to enhance its owners’ memories: “If you don’t have an iPhone,” went a 2011 ad, “movies aren’t this dramatic, maps aren’t this clear, e-mails aren’t this detailed, and memories aren’t this memorable.” And, well—who wants shitty, unmemorable memories?

  As a marketing strategy, tapping into memory anxiety has only looked smarter as we develop more ways to record and transmit reality. The smartphone camera may be the ultimate cause of and solution to all such anxieties; from here on out, both can only be perfected. The way we relate to images reflects the two kinds of memory: systematic recall and documentation—these things happened in this order—and the strange, slow emulsion that brings the invisible ink of experience into clearer view. The first lends itself to searchable cataloging; the second is completely unpredictable. Neither is entirely reliable, though one would seem more likely to harbor meaning. Yet it is the former type of image—if not memory—that has flourished. If the digital camera, with its promise of perfect recall, both reminds and relieves the shooter of the burden of being present, the resulting images often have more of a social than a subjective or individual purpose. The most common modern image is consciously about display and dissemination, giving a public order to one’s persona and experiences. It’s more about representing a certain reality than remembering it, although looking through carefully curated Facebook albums one often senses the longing of the subject to remember herself the way she would have others do.

  The thing about memory, though, is that it’s like a beautiful woman: you have to pay attention for a shot at its full reward. Part of being overwhelmed by the surplus of whiplashingly lovely women stomping the streets of New York City is the contemplation of the world of attention that must be built around each one. Every indelible face is the center of its own ecosystem of enchantment, or bloody well better be. The whole thing can tucker you out in a glance. Facing the deluge of social-media images can feel the same: by their nature each one—especially the self-portraits—seeks a spot in your dreams, despite being designed to die a quick, mosquito-like death. The digital image has presented memory with a paradox: infinite but transient choice makes it both more and im-possible to remember well.

  Consider DailyBooth.com (slogan: “Your Life in Pictures”), a social network composed solely of photo updates—image statements that crawl across the bottom of the home page in real time. With few exceptions, the photos are classic laptop pics—pictures taken by the camera now built into almost every computer. These cameras have a slight fish-eye effect and bathe the subject in an eerie, deoxygenated blue. DailyBooth’s live feed is essentially a gallery of puckering young girls, each obeying an instinct with ever-expanding possibilities for exploitation. Faces pass as soon as they pop up, glancing bids for attention that seem frivolous in one light, crushing in another. Who will love them all?

  There’s something unnerving about a social network composed almost entirely of self-portraits, of kids pulling in their chins, pointing a cheekbone to the ceiling, and staring into a pinhole while their hand goes click. Within the first dozen or so photos, you have trouble telling the faces apart, so similar are the boudoir backgrounds, the spooky lighting contrasts, the flagrant moues. What at first feels revolutionary—dispatches from the private, teenaged sanctuaries where so much of what has defined the last sixty years of Western culture was incubated—begins to look more like what it is: a vacuum of random, repetitive self-exposure, the mutation of a process that was turbulent enough when it was conducted in relative privacy. Your life in pictures turns out to look a lot like your life at a computer.

  God knows I did hard time in front of the mirror as a girl. Had it been an option, I might have cleaved to the communal glass, where young and old now search for the features of a viable self. But I’m not sure I could have mistaken it for a social activity, and I feel certain that the mirror cannot be a source of memory.

  * * *

  Photography was conceived not to create memories but to record and represent beauty. Specifically nature’s beauty. “In many ways,” John Fowles writes in The Tree, his sharply unsentimental 1979 treatise, “painters did not begin to see nature whole until the camera saw it for them; and already, in this context, had begun to supersede them.” And yet the world’s forests and seas “cannot be framed. And words are as futile, too laborious and used to capture the reality.” We only truly encounter nature’s fearful symmetries, Fowles believed, by way of consciousness, the subjective foregroundings and recessions of response, memory, and imagination. We must submit to her before nature will slip out a shoulder.

  In this way, perception forms a kind of secret passage: the first, submissive step into an unrecordable reality is a first step toward the self. For Fowles, a certain quality of subjectivity defies sharing or re-presentation. It is unknowable the way the mind of an old, tail-whapping lion is unknowable. It is that unknowability which makes its near-penetrations—in art, in life—so terribly moving.

  “It, this namelessness,” he writes of the ancient, stunted oak trees in southern England’s Wistman’s Wood, “is beyond our science and our arts because its secret is being, not saying.

  “Its greatest value to us is that it cannot be reproduced, that this being can be apprehended only by other present being, only by the living senses and consciousness. All experience of it through surrogate and replica, through selected image, gardened word, through other eyes and minds, betrays or banishes its reality. But this is nature’s consolation, its message, and well beyond the Wistman’s Wood of its own strict world. It can be known and entered only by each, and in its now; not by you through me, by any you through any me; only by you through yourself, and me through myself. We still have this to learn: the inalienable otherness of each, human and non-human, which may seem the prison of each, but is at heart, in the deepest of those countless million metaphorical trees for which we cannot see the wood, both the justification and the redemption.”

  Leaving Wistman’s Wood, Fowles was resigned to the erosions of his own impressions: “Already no more than another memory trace, already becoming an artefact, a thing to use. An end to this, dead retting of its living leaves.”

  The Tree might seem like a strange memo from a fiction writer, whose business is finding a way to represent the world and what it’s like to live in it. But much of Fowles’s writing wrestled with the nature, as it were, of that business. In his most celebrated novel, The French Lieutenant’s Woman, a filtering of Victorian tradition through postmodern prisms seems to forgo reality in favor of a self-consciously literary world. George Eliot and the rest of the social realists Fowles was tweaking saw human perception’s lonely paradox as a challenge, not a red herring.

  In conceiving Dorothea Brooke, the heroine of Middlemarch, Eliot developed a style of near-microscopically descriptive realism to illuminate the ways that a woman can remain unknown even to herself. Describing Dorothea’s maiden voyage to Italy, Eliot acknowledges the Fowlesian, fleeting nature of pure perception, but offers the curious yields of memory as a kind of compensation:

  The weight of unintelligible Rome might lie easily on bright nymphs to whom it formed a background for the brilliant picnic of Anglo-foreign society; but Dorothea had no such defence against deep impressions. Ruins and basilicas, palaces and colossi, set in the midst of a sordid present, where all that was living and warm-blooded seemed sunk in the deep degeneracy of a superstition divorced from reverence; the dimmer but yet eager Titanic life gazing and struggling on walls and ceilings; the long vistas of white forms whose marble eyes seemed to hold the monotonous light of an alien world: all this vast wreck of ambitious ideals, sensuous and spiritual, mixed confusedly with the signs of breathing forgetfulness and degradation, at first jarred her as with an electric shock, and then urged themselves on her with that ache belonging to a glut of confused ideas which check the flow of emotions. Forms both pale and glowing took possession of her young sense, and fixed themselves in her memory even when she was not thinking of them, preparing strange associations which remained through her after-years.

  Now consider the modern tourist making a quick swipe of St. Peter’s with her Flip cam and moving on. Or a young man interrupting a conversation with an unknown partygoer to Facebook-friend her on his phone, skimming her likes and dislikes for offense, noting their mutual acquaintances, and appraising her profile photo as she stands mute beside him.

  In her extreme youth, Dorothea Brooke is not a character in touch with what we might call “reality”: the ossified Casaubon’s powdery marriage proposal makes her swoon, and she soon finds herself wedded to an empty idea of intellectual apprenticeship. Yet Dorothea is fundamentally awake to the world, a creature whose “deep impressions” will eventually furnish the richly appointed inner life she was so desperate to inhabit as a girl. Through Dorothea, Eliot suggests an ideal of memory as the bedrock of human understanding—a home for the self—rather than an act of acquisitive personal recording, where new experiences form a novel backdrop for an ongoing picnic of self-celebration. She and Fowles, who was born and lived much of his life in provincial England, shared at least one conviction: true memory is a forest; remembering is just the trees.

 
