CONTENTS
* * *
Foreword by David Lehman
Introduction by Terrance Hayes
Sherman Alexie, “Sonnet, with Pride”
Rae Armantrout, “Control”
John Ashbery, “Breezeway”
Erin Belieu, “With Birds”
Linda Bierds, “On Reflection”
Traci Brimhall, “To Survive the Revolution”
Lucie Brock-Broido, “Bird, Singing”
Jericho Brown, “Host”
Kurt Brown, “Pan del Muerto”
CAConrad, “wondering about our demise while driving to Disneyland with abandon”
Anne Carson, “A Fragment of Ibykos Translated 6 Ways”
Joseph Ceravolo, “Hidden Bird”
Henri Cole, “City Horse”
Michael Earl Craig, “The Helmet”
Philip Dacey, “Juilliard Cento Sonnet”
Olena Kalytiak Davis, “It Is to Have or Nothing”
Kwame Dawes, “News from Harlem”
Joel Dias-Porter, “Elegy Indigo”
Natalie Diaz, “These Hands, if Not Gods”
Mark Doty, “Deep Lane”
Sean Thomas Dougherty, “The Blues Is a Verb”
Rita Dove, “The Spring Cricket Repudiates His Parable of Negritude”
Camille Dungy, “Conspiracy (to breathe together)”
Cornelius Eady, “Overturned”
Vievee Francis, “Fallen”
Ross Gay, “To the Fig Tree on 9th and Christian”
Eugene Gloria, “Liner Notes for Monk”
Ray Gonzalez, “One El Paso, Two El Paso”
Kathleen Graber, “The River Twice”
Rosemary Griggs, “SCRIPT POEM”
Adam Hammer, “As Like”
Bob Hicok, “Blue prints”
Le Hinton, “No Doubt About It (I Gotta Get Another Hat)”
Tony Hoagland, “Write Whiter”
Major Jackson, “OK Cupid”
Amaud Jamaul Johnson, “L.A. Police Chief Daryl Gates Dead at 83”
Douglas Kearney, “The Labor of Stagger Lee: Boar”
Yusef Komunyakaa, “Negritude”
Hailey Leithauser, “In My Last Past Life”
Larry Levis, “Elegy with a Darkening Trapeze inside It”
Gary Copeland Lilley, “Sermon of the Dreadnaught”
Frannie Lindsay, “Elegy for My Mother”
Patricia Lockwood, “Rape Joke”
Nathaniel Mackey, “Oldtime Ending”
Cate Marvin, “An Etiquette for Eyes”
Jamaal May, “Masticated Light”
Shara McCallum, “Parasol”
Marty McConnell, “vivisection (you’re going to break my heart)”
Valzhyna Mort, “Sylt I”
Harryette Mullen, “Selection from Tanka Diary”
Eileen Myles, “Paint Me a Penis”
D. Nurkse, “Release from Stella Maris”
Sharon Olds, “Stanley Kunitz Ode”
Gregory Pardlo, “Wishing Well”
Kiki Petrosino, “Story Problem”
D. A. Powell, “See You Later.”
Roger Reeves, “The Field Museum”
Donald Revell, “To Shakespeare”
Patrick Rosal, “You Cannot Go to the God You Love with Your Two Legs”
Mary Ruefle, “Saga”
Jon Sands, “Decoded”
Steve Scafidi, “Thank You Lord for the Dark Ablaze”
Frederick Seidel, “To Philip Roth, for His Eightieth”
Diane Seuss, “Free Beer”
Sandra Simonds, “I Grade Online Humanities Tests”
Jane Springer, “Forties War Widows, Stolen Grain”
Corey Van Landingham, “During the Autopsy”
Afaa Michael Weaver, “Passing Through Indian Territory”
Eleanor Wilner, “Sowing”
David Wojahn, “My Father’s Soul Departing”
Greg Wrenn, “Detainment”
Robert Wrigley, “Blessed Are”
Jake Adam York, “Calendar Days”
Dean Young, “Emerald Spider Between Rose Thorns”
Rachel Zucker, “Mindful”
Contributors’ Notes and Comments
Magazines Where the Poems Were First Published
Acknowledgments
About Terrance Hayes and David Lehman
David Lehman was born in New York City. Educated at Stuyvesant High School and Columbia University, he spent two years as a Kellett Fellow at Clare College, Cambridge, and worked as Lionel Trilling’s research assistant upon his return from England. He is the author of nine books of poetry, including New and Selected Poems (2013), Yeshiva Boys (2009), When a Woman Loves a Man (2005), The Daily Mirror (2000), and Valentine Place (1996), all from Scribner. He is the editor of The Oxford Book of American Poetry (Oxford, 2006) and Great American Prose Poems: From Poe to the Present (Scribner, 2003), among other collections. A Fine Romance: Jewish Songwriters, American Songs (Nextbook/Schocken), the most recent of his six nonfiction books, won the Deems Taylor Award from the American Society of Composers, Authors, and Publishers (ASCAP) in 2010. Among Lehman’s other books are a study in detective novels (The Perfect Murder), a group portrait of the New York School of poets (The Last Avant-Garde), and an account of the scandal sparked by the revelation that a Yale University eminence had written for a Nazi-controlled newspaper in his native Belgium (Signs of the Times: Deconstruction and the Fall of Paul de Man). He teaches in the graduate writing program of The New School and lives in New York City and in Ithaca, New York.
FOREWORD
* * *
by David Lehman
Maybe I dreamed it. Don Draper sat sipping Canadian Club from a coffee mug on Craig Ferguson’s late-night talk show. “Are you on Twitter?” the host asks. “No,” Draper says. “I don’t”—and here he pauses before pronouncing the distasteful verb—“tweet.” Next question. “Do you read a lot of poetry?” The ad agency’s creative director looks skeptical. Though the hero of Mad Men is seen reading Dante’s Inferno in one season of Matthew Weiner’s show and heard reciting Frank O’Hara in another, the question seems to come from left field. “Poetry isn’t really celebrated any more in our culture,” Don says, to which the other retorts, “It can be—if you can write in units of 140 keystrokes.” Commercial break.
The laugh line reveals a shrewd insight into the subject of “poetry in the digital age,” a panel-discussion perennial. The panelists agree that text messaging and Internet blogs will be seen to have exercised some sort of influence on the practice of poetry, whether on the method of composition or on the style and surface of the writing. And surely we may expect the same of a wildly popular social medium with a formal requirement as stringent as the 140-character limit. (To someone with a streak of mathematical mysticism, the relation of that number to the number of lines in a sonnet is a thing of beauty.) What Twitter offers is ultimate immediacy expressed with ultimate concision. “Whatever else Twitter is, it’s a literary form,” says the novelist Kathryn Schulz, who explains how easy it was for her to get addicted to “a genre in which you try to say an informative thing in an interesting way while abiding by its constraint (those famous 140 characters). For people who love that kind of challenge—and it’s easy to see why writers might be overrepresented among them—Twitter has the same allure as gaming.” True, the hard-to-shake habit caused its share of problems. Schulz reports a huge “distractibility increase” and other disturbing symptoms: “I have felt my mind get divided into tweet-size chunks.” Nevertheless there is a reason that she got hooked on this “wide-ranging, intellectually stimulating, big-hearted, super fun” activity.1 When, in an early episode of the Netflix production of House of Cards, one Washington journalist disparages a rival as a “Twitter twat,” you know the word has arrived, and the language itself has changed to accommodate it. There are new terms (“hashtag”), acronyms (“ikr” in Detroit means “I know right?”), shorthand (“suttin” is “something” in Boston).2 Television producers love it (“Keep those tweets coming!”). So does Wall Street: when Twitter went public in 2013, the IPO came off without a hitch, and the stock climbed with the velocity of an over-caffeinated momentum investor eager to turn a quick profit.
The desire to make a friend of the new technology is understandable, though it obliges us to overlook some major flaws: the Internet is hell on lining, spacing, italics; line breaks and indentation are often obscured in electronic transmission. The integrity of the poetic line can be a serious casualty. Still, it is fruitless to quarrel with the actuality of change, and difficult to resist it profitably—except, perhaps, in private, where we may revel in our physical books and even, if we like, write with pen or pencil on graph paper or type our thoughts with the Smith-Corona manual to which we have a sentimental attachment. One room in the fine “Drawn to Language” exhibit at the University of Southern California’s Fisher Art Museum in September 2013 was devoted to Susan Silton’s site-specific installation of a circle of tables on which sat ten manual typewriters of different makes, models, sizes, and decades. It was moving to behold the machines not only as objects of nostalgia in an attractive arrangement but as metonymies of the experience of writing in the twentieth century—and as invitations to sit down and hunt and peck away to your heart’s content. Seeing the typewriters in that room I felt as I do when the talk touches on the acquisition of an author’s papers by a university library. It’s odd to be a member of the last generation to have “papers” in this archival and material sense. Odd for an era to slip into a museum while you watch.
You may say—I have heard the argument—that the one-minute poem is not far off. Twitter’s 140-keystroke constraint—together with the value placed on being “up to speed”—brings the clock into the game. Poetry, a byte-size kind of poetry, has been, or soon will be, a benefit of attention deficit disorder. (This statement, or prediction, is not necessarily or not always made in disparagement.) Unlike the telephone, the instruments of social media rely on the written, not the spoken word, and it will be interesting to see what happens when the values of hip-hop lyricists and spoken-word poets, for whom the performative aspects of the art are paramount, tangle with the values of concision, bite, and wit consistent with the rules of the Twitter feed. On the other hand, it is conceivable that the sentence I have just composed will be, for all intents and purposes, anachronistic in a couple of years or less. Among my favorite oxymorons is “ancient computer,” applied to my own desktop.3
* * *
In his famous and famously controversial Rede Lecture at Cambridge University in 1959, the English novelist C. P. Snow addressed the widening chasm between the two dominant strains in our culture.4 There were the humanists on the one side. On the other were the scientists and applied scientists, the agents of technological change. And “a gulf of mutual incomprehension” separated them. Though Snow endeavored to appear evenhanded, it became apparent that he favored the sciences—he opted, in his terms, for the fact rather than the myth. The scientists “have the future in their bones”—a future that will nourish the hungry, clothe the masses, reduce the risk of infant mortality, cure ailments, and prolong life. And “the traditional culture responds by wishing the future did not exist.”
The Rede Lecture came in the wake of the scare set off by the Soviet Union’s launch of Sputnik in October 1957. There was widespread fear that we in the West, and particularly we in the United States, were in danger of falling behind the Russians in the race for space, itself a metaphor for the scientific control of the future. For this reason among others, Snow’s lecture was extraordinarily successful. Introducing a phrase into common parlance, “The Two Cultures” reached great numbers of readers and helped shape a climate friendly to science at the expense of the traditional components of a liberal education. Much in that lecture infuriated the folks on the humanist side of the divide.5 Snow wrote as though humanistic values were possible without humanistic studies. In literature he saw not a corrective or a criticism of life but a threat. He interpreted George Orwell’s 1984 as “the strongest possible wish that the future should not exist” rather than as a warning against the authoritarian impulses of the modern state coupled with its sophistication of surveillance. Snow founded his argument on the unexamined assumption that scientists, in thrall to the truth, can be counted on to do the right thing—an assumption that the history of munitions would explode even if we could all agree on what “the right thing” is. For Snow, who had been knighted and would be granted a life peerage, the future was bound to be an improvement on the past, and the change would be entirely attributable to the people in the white coats in the laboratory. Generalizing from the reactionary political tendencies of certain famous modern writers, Snow floated the suggestion that they—and by implication those who read them—managed to “bring Auschwitz that much nearer.” Looking back at the Rede Lecture five years later, Snow saw no reason to modify the view that intellectuals were natural Luddites, prone to “talk about a pre-Industrial Eden” that never was. They ignored the simple truth that the historian J. H. Plumb stated: “No one in his senses would choose to have been born in a previous age unless he could be certain that he would have been born into a prosperous family, that he would have enjoyed extremely good health, and that he could have accepted stoically the death of the majority of his children.” In short, according to Snow, the humanists were content to dwell in a “pretty-pretty past.”
In 1962 F. R. Leavis, then perhaps the most influential literary critic at Cambridge, denounced Snow’s thesis with such vitriol and contempt that he may have done the humanist side more harm than good. “Snow exposes complacently a complete ignorance,” Leavis said in the Richmond Lecture, and “is as intellectually undistinguished as it is possible to be.” Yet, Leavis added, Snow writes in a “tone of which one can say that, while only genius could justify it, one cannot readily think of genius adopting it.”6 Reread today, the Richmond Lecture may be a classic of invective inviting close study. As rhetoric it was devastating. But as a document in a conflict of ideas, the Richmond Lecture left much to be desired. Leavis did not adequately address the charges that Snow leveled at literature and the arts on social and moral grounds.7 The scandal in personalities, the shrillness of tone, eclipsed the subject of the debate, which got fought out in the letters column of the literary press and was all the talk in the senior common rooms and faculty lounges of the English-speaking world.
The controversy ignited by a pair of dueling lectures at Cambridge deserves another look now not only because fifty years have passed and we can better judge what has happened in the intervening period but because more than ever the humanities today stand in need of defense. In universities and liberal arts colleges, these are hard times for the study of ideas. In 2013, front-page articles in The New York Times and The Wall Street Journal screamed about the crisis in higher education, especially in humanist fields: shrinking enrollments at liberal arts colleges; the shutting down of entire college departments; the elimination of courses and requirements once considered vital. The host of “worrisome long-term trends” included “a national decline in the number of graduating high-school seniors, a swarm of technologies driving down costs and profit margins, rising student debt, a soft job market for college graduates and stagnant household incomes.”8 Is that all? No, and it isn’t everything. There has also been a spate of op-ed columns suggesting that students would be wise to save their money, study something that can lead to gainful employment, and forget about majoring in modern dance, art history, philosophy, sociology, theology, or English unless they are independently wealthy.
The cornerstones of the humanities, English and history, have taken a beating. At Yale, English was the most popular major in 1972–73. It did not make the top five in 2012–13. Twenty-one years ago, 216 Yale undergraduates majored in history; less than half that number picked the field last year.9 Harvard—where English majors dwindled from 36 percent of the student body in 1954 to 20 percent in 2012—has issued a report on the precipitous drop. Russell A. Berman of Stanford, in a piece in The Chronicle of Higher Education ominously entitled “Humanist: Heal Thyself,” observed that “the marginalization of the great works of the erstwhile canon has impoverished the humanities,” and that the Harvard report came to this important conclusion. But he noted, too, that it stopped short of calling for a great-books list of required readings. My heart sinks when I read such a piece and arrive at a paragraph in which the topic sentence is, “Clearly majoring in the humanities has long been an anomaly for American undergraduates.”10 Or is such a sentence—constructed as if to sound value-neutral and judgment-free in the proper scientific manner—part of the problem? The ability of an educated populace to read critically, to write clearly, to think coherently, and to retain knowledge—even the ability to grasp the basic rules of grammar and diction—seems to be declining at a pace consonant with the rise of the Internet search engine and the autocorrect function in computer programs.
Not merely the cost but the value of a liberal arts education has come into doubt. The humanists find themselves in a bind. Consider the plight of the English department. “The folly of studying, say, English Lit has become something of an Internet cliché—the stuff of sneering ‘Worst Majors’ listicles that seem always to be sponsored by personal-finance websites,” Thomas Frank writes in Harper’s.11 There is a new philistinism afoot, and the daunting price tag of college or graduate education adds an extra wrinkle to an argument of ferocious intensity. “The study of literature has traditionally been felt to have a unique effectiveness in opening the mind and illuminating it, in purging the mind of prejudices and received ideas, in making the mind free and active,” Lionel Trilling wrote at the time of the Leavis–Snow controversy. “The classic defense of literary study holds that, from the effect which the study of literature has upon the private sentiments of a student, there results, or can be made to result, an improvement in the intelligence, and especially the intelligence as it touches the moral life.”12 It is vastly more difficult today to mount such a defense after three or more decades of sustained assault on canons of judgment, the idea of greatness, the related idea of genius, and the whole vast cavalcade of Western civilization.13 Heather Mac Donald writes more in sorrow than in anger that the once-proud English department at UCLA—which even lately could boast of being home to more undergraduate majors than any other department in the nation—has dismantled its core, doing away with the formerly obligatory four courses in Chaucer, Shakespeare, and Milton. You can now satisfy the requirements of an English major with “alternative rubrics of gender, sexuality, race, and class.” The coup, as Mac Donald terms it, took place in 2011 and is but one event in a pattern of academic changes that would replace a theory of education based on a “constant, sophisticated dialogue between past and present” with a consumer mind-set based on “narcissism, an obsession with victimhood, and a relentless determination to reduce the stunning complexity of the past to the shallow categories of identity and class politics. Sitting atop an entire civilization of aesthetic wonders, the contemporary academic wants only to study oppression, preferably his or her own, defined reductively according to gonads and melanin.”14