Here again, I’d like to imagine that Saunders will be rewarded for his self-deprecation, in the same way I want him to be rewarded for his sheer comedic talent. But I suspect our future reality won’t be dictated by either of those qualities. I suspect it will be controlled by the evolving, circuitous criteria for what is supposed to matter about anything. When trying to project which contemporary books will still be relevant once our current population has crumbled into carbon dust and bone fragments, it’s hopeless to start by thinking about the quality of the works themselves. Quality will matter at the end of the argument, but not at the beginning. At the beginning, the main thing that matters is what that future world will be like. From there, you work in reverse.
[2]“All I can tell you is that in 100 years I seriously doubt that the list of the 100 best writers from our time is going to be as white, as male, as straight, as monocultural as the lists we currently produce about the 100 best writers of our time.” This is an e-mail from Junot Díaz, the Dominican-American novelist who won a Pulitzer Prize in 2008 and a MacArthur Fellowship in 2012. “In all frankness, our present-day evaluative criteria are so unfairly weighted towards whiteness, maleness, middle-class-ness, straightness, monoculturality—so rotted through with white supremacy—as to be utterly useless for really seeing or understanding what’s going on in the field, given how little we really see and value of the art we’re now producing because of our hegemonic scotoma. Who can doubt that the future will improve on that? No question that today, in the margins of what is considered Real Literature, there are unacknowledged Kafkas toiling away who are more likely women, colored, queer and poor.”
Díaz is a bombastic intellectual with a limitless career (his debut novel about an overweight weirdo, The Brief Wondrous Life of Oscar Wao, was named the best book of the twenty-first century by a panel of critics commissioned by the BBC). It’s unsurprising that this is how he views society, and his argument is essentially bulletproof. It’s a worldview that’s continually gaining traction: You can’t have a macro discussion about literary canons without touching on these specific points. When The New York Times released its 2014 “100 Notable Books” list, several readers noticed that there were exactly twenty-five fiction books by men, twenty-five fiction books by women, twenty-five nonfiction books by men, and twenty-five nonfiction books by women. Do I have a problem with this? I have no problem with this. But it does reflect something telling about the modern criteria for quantifying art: Symmetrical representation sits at the center of the process. It’s an aesthetic priority. Granted, we’re dealing with a meaningless abstraction, anyway—the list is called “notable” (as opposed to “best”), it’s politicized by the relationships certain authors have with the list makers, it annually highlights books that instantly prove ephemeral, and the true value of inclusion isn’t clear to anyone. Yet in the increasingly collapsible, eternally insular idiom of publishing, the Times’ “100 Notable” list remains the most visible American standard for collective critical appreciation. This is why the perfect 25:25:25:25 gender split is significant. Does it not seem possible—in fact, probable—that (say) twenty-six of the most notable novels were written by women? Or that perhaps men wrote twenty-seven of the most notable nonfiction works? I suppose it’s mathematically possible that an objective, gender-blind judging panel might look at every book released in 2014 and arrive at the same conclusion as The New York Times. Perfect statistical symmetry is within the realm of possibility. But no impartial person believes that this is what happened. Every rational person knows this symmetry was conscious, and that this specific result either (a) slightly invalidates the tangible value of the list, or (b) slightly elevates the intangible value of the list. (I suppose it’s also possible to hold both of those thoughts simultaneously.) In either case, one thing is absolutely clear: This is the direction in which canonical thinking is drifting. Díaz’s view, which once felt like an alternative perspective, is becoming the entrenched perspective. And when that happens, certain critical conclusions will no longer be possible.
Let’s assume that—in the year 2112—someone is looking back at the turn of the twenty-first century, trying to deduce the era’s most significant writers. Let us also assume Díaz’s opinion about the present culture has metabolized into the standard view; let’s concede that people of the future take for granted that the old evaluative criteria were “unfairly weighted towards whiteness, maleness, middle-class-ness, straightness, [and] monoculturality.” When that evolution transpires, here’s the one critical conclusion that cannot (and will not) happen: “You know, I’ve looked at all the candidates, consciously considering all genders and races and income brackets. I’ve tried to use a methodology that does not privilege the dominant class in any context. But you know what? It turns out that Pynchon, DeLillo, and Franzen were the best. The fact that they were white and male and straight is just coincidental.” If you prioritize cultural multiplicity above all other factors, you can’t make the very peak of the pyramid a reactionary exception, even in the unlikely event that this is what you believe (since such a conclusion would undoubtedly be shaped by social forces you might not recognize). Even more remote is the possibility that the sheer commercial force of a period’s most successful writers—in the case of our period, Stephen King and J. K. Rowling—will be viewed as an argument in their historical favor. If you accept that the commercial market was artificially unlevel, colossal success only damages their case.
This is not a criticism of identity politics (even though I know it will be taken that way), nor is it some attempt at diminishing the work of new writers who don’t culturally resemble the old writers (because all writing is subjective and all writers are subjectively valid). I’m not saying this progression is unfair, or that the new version of unfairness is remotely equivalent to the old version of unfairness. Such processes are never fair, ever, under any circumstances. This is just realpolitik reality: The reason something becomes retrospectively significant in a far-flung future is detached from the reason it was significant at the time of its creation—and that’s almost always due to a recalibration of social ideologies that future generations will accept as normative. With books, these kinds of ideological transfers are difficult to anticipate, especially since there are over two million books published in any given year. But it’s a little easier to conjecture how this might unspool in the smaller, more contained idiom of film.
Take a movie like The Matrix: When The Matrix debuted in 1999, it was a huge box-office success. It was also well received by critics, most of whom focused on one of two qualities—the technological (it mainstreamed the digital technique of three-dimensional “bullet time,” where the on-screen action would freeze while the camera continued to revolve around the participants) or the philosophical (it served as a trippy entry point for the notion that we already live in a simulated world, directly quoting philosopher Jean Baudrillard’s 1981 reality-rejecting book Simulacra and Simulation). If you talk about The Matrix right now, these are still the two things you likely discuss. But what will still be interesting about this film once the technology becomes ancient and the philosophy becomes standard? I suspect it might be this: The Matrix was written and directed by “the Wachowski siblings.” In 1999, this designation meant two brothers; as I write today, it means two sisters. In the years following the release of The Matrix, the older Wachowski (Larry, now Lana) completed her transition from male to female. The younger Wachowski (Andy, now Lilly) publicly announced her transition in the spring of 2016. These events occurred during a period when the social view of transgender issues radically evolved, more rapidly than any other component of modern society. In 1999, it was almost impossible to find any example of a trans person within any realm of popular culture; by 2014, a TV series devoted exclusively to the notion won the Golden Globe for Best Television Series. In the fifteen-year window from 1999 to 2014, no aspect of interpersonal civilization changed more, to the point where Caitlyn (formerly Bruce) Jenner attracted more Twitter followers than the president (and the importance of this shift will amplify as the decades pass—soon, the notion of a transgender US president will not seem remotely implausible).
So think how this might alter the memory of The Matrix: In some protracted reality, film historians will reinvestigate an extremely commercial action movie made by people who (unbeknownst to the audience) would eventually transition from male to female. Suddenly, the symbolism of a universe with two worlds—one false and constructed, the other genuine and hidden—takes on an entirely new meaning. The idea of a character choosing between swallowing a blue pill that allows him to remain a false placeholder and a red pill that forces him to confront who he truly is becomes a much different metaphor. Considered from this speculative vantage point, The Matrix may seem like a breakthrough of a far different kind. It would feel more reflective than entertaining, which is precisely why certain things get remembered while certain others get lost.
This is how the present must be considered whenever we try to think about it as the past: It must be analyzed through the values of a future that’s unwritten. Before we can argue that something we currently appreciate deserves inclusion in the world of tomorrow, we must build that future world within our mind. This is not easy (even with drugs). But it’s not even the hardest part. The hardest part is accepting that we’re building something with parts that don’t yet exist.
[3]Historical wrongness is more profound than simply hitting the wrong target. If we project that the writer who will be most remembered is “Person X,” but it actually turns out to be his more formally inventive peer “Person Y” . . . well, that barely qualifies as wrong. That’s like ordering a Budweiser and getting a Coors. It’s fun to argue over which contemporary juggernaut will eventually become a freestanding monolith, because that dispute is really just a reframing of every preexisting argument over whose commercial work is worthy of attention. It’s a hypothetical grounded in actuality. But there are different possibilities that are harder to parse. There are stranger—yet still plausible—outcomes that require an ability to reject the deceptively sensible. What if the greatest writer of this generation is someone who will die totally unknown? Or—stranger still—what if the greatest writer of this generation is a known figure, but a figure taken seriously by no one alive (including, perhaps, the writer in question)?
[4]Before explaining how and why these things might happen, I must recognize the dissenting opinion, particularly since my opinion is nowhere near normal. I do this by quoting novelist Jonathan Lethem as he casually quotes someone else from memory: “W. Somerset Maugham had a rather dry remark somewhere, which I won’t look up, but instead paraphrase: ‘Literary posterity may often surprise us in its selections, but it almost exclusively selects from among those known in their day, not the unknown.’ And I do think that’s basically true.”
Lethem is a prolific writer of fiction and criticism, as well as the unofficial curator and public advocate for the catalog of Philip K. Dick (a sci-fi writer who embodies the possibility of seeming more consequential in retrospect than he did as an active artist). Somewhat surprisingly, Lethem’s thoughts on my premise skew conservative; he seemed intrigued by the possibility, but unable to ignore the (seemingly) more plausible probability that the future will reliably reflect some version of the present. I’ve focused on Melville, and Díaz referenced Franz Kafka. But Lethem views both of those examples as high-profile exceptions that inadvertently prove the rule.
“Kafka and Melville are both really weird cases, unlikely to be repeated,” Lethem explains. “And it’s worth being clear that Melville wasn’t some self-published marginal crank. He was a bestselling writer, widely reviewed and acknowledged, up to the point where he began to diverge from the reading taste of his time. What’s weird is that all his greatest work came after he fell out of fashion, and also that there was such a strong dip in his reputation that he was barely remembered for a while . . . Kafka was conversant with a sophisticated literary conversation, and had, despite the strongly self-defeating tendencies to neither finish nor publish his writings, the attention of various alert colleagues. If he’d lived longer, he might very likely have become a prominent writer . . . The most canonical figure in literary history who was essentially a self-published kook would arguably be William Blake.”
The arc of Lethem’s larger contention boils down to two points. The first is that no one is really remembered over the long haul, beyond a few totemic figures—Joyce, Shakespeare, Homer—and that these figures serve as placeholders for the muddled generalization of greatness (“Time is a motherfucker and it’s coming for all of us,” Lethem notes). The second is that—even if we accept the possibility that there is a literary canon—we’re really discussing multiple canons and multiple posterities. We are discussing what Lethem calls “rival claims”: in essence, the idea that the only reason we need a canon is so that other people can disagree with it. The work of the writers who get included becomes almost secondary, since they now exist only for the purposes of contradiction.
“Let me try to generate an example of a very slapdash guess about the present situation,” Lethem writes me in an e-mail (and since it’s an especially interesting e-mail, I’m going to leave in his unorthodox parentheses and capitalizations). “The VERY most famous novelists alive (or just-dead) right now might be destined to be thought about for a good long time. Even if little read. You could see Wallace-Franzen-King as having some probable long-term viability, in the sense that when we talk about the French novel from a certain period, everyone’s sure to know that Stendhal-Balzac-and-Victor-Hugo existed (and yes, I do intend the comparison of the reputations of the six people in those two categories, in the order I put them in). But how many people do you know who have read them, apart from a school assignment of Balzac, possibly? [So] here’s where I get back to ‘rival claims’—for everyone who nods their heads solemnly at the idea that French literature of the not-too-medieval-past consists of those guys, there’ll be some wise guy who’ll say: ‘Fuck those boring novelists, the action in Paris at that time was Baudelaire and Verlaine!’ Or someone else who’ll say, ‘Did you know that Anatole France outsold all of those guys, and was pretty amazing, even if we don’t read him anymore?’ (Which might be like saying, ‘Jane Smiley was the key American novelist of the turn of the millennium.’) And someone else who’ll say, ‘I’m much more interested in Guy de Maupassant’ (which might be comparable to advancing a claim for, I dunno, George Saunders or Lorrie Moore). Meanwhile, we live in a time where the numbers of creators of literature has just exploded—and that plenitude is the field, and the context for the tiny, tiny number of things that get celebrated in the present, let alone recalled ten or twenty years later, let alone by the 22nd century. And it is all absolutely without any mercy destined to evaporate into the memory hole—irretrievably.”
Now, I’m not sure if Lethem’s final claim here is daring or anodyne. I certainly understand the mentality behind forwarding the possibility that nothing from this era will be remembered, simply due to volume. There are also those who contend we no longer need to “remember” anything at all, since the Internet has unlimited storage and ebooks never go out of print (and that there’s no longer any point in classifying any one creative person as more consequential than another, since we’ll all have equal and immediate access to both of their catalogs). Both thoughts share a curious combination of optimism, pessimism, and pragmatism. But they both overlook something else: human nature. Society enjoys this process, even if the job is superfluous and the field is too large to manage. Practicality is not part of the strategy. People will always look backward in an attempt to re-remember what they want to be true, just as I currently look ahead in an attempt to anticipate how that reverse engineering will work. Certain things will not evaporate, even if they deserve to.
I try to be rational (or at least my imaginary facsimile of what rationality is supposed to be). I try to look at the available data objectively (fully aware that this is impossible). I try to extrapolate what may be happening now into what will be happening later. And this, of course, is where naïve realism punches me in the throat. There’s simply no way around the limited ceiling of my own mind. It’s flat-out impossible to speculate on the future without (a) consciously focusing on the most obvious aspects of what we already know, and (b) unconsciously excluding all the things we don’t have the intellectual potential to grasp. I can’t describe what will happen in one hundred years if my central thesis insists that the best guess is always the worst guess. I can’t reasonably argue that the most important writer of this era is (for example) a yet-to-be-identified Irish-Asian skoliosexual from Juárez, Mexico, who writes brilliantly about migrant cannibalism from an anti-union perspective. That’s not even an argument, really. It’s just a series of adjectives. It’s a Mad Lib. I can’t list every possible variety of person who might emerge from the ether and eventually become mega-important, based on the premise that the best answer to this question must be whatever answer no one else has ever conceived. That would be insane.
Yet that insanity is (probably) closer to what will transpire. For an assortment of reasons, I suspect that whoever gets arbitrarily selected to represent turn-of-the-twenty-first-century literary greatness is—at the moment—either totally unknown or widely disrespected.