We're Doomed. Now What?

by Roy Scranton


  Everything dies, but what we do while we live lives on, in our sons and daughters, in the worlds we make or destroy. We’re all doomed. That’s simply the condition of being born. But it’s also the condition that makes a new future possible. Now what?

  What Is Thinking Good For?

  Spend a couple of hours online skimming think pieces and hot takes, bouncing from Slate to the New York Times to Twitter to The Chronicle of Higher Education to Reddit, and you’ll soon find yourself either nauseated by the vertigo that comes from drifting awash in endless waves of repetitive, clickbaity, amnesiac drek, or so benumbed and bedazzled by the sheer volume of ersatz cognition on display that you wind up giving in to the flow and welcoming your own stupefaction as a kind of relief.

  Nevertheless, one can, out there among the hired trolls, scum-slingers, professional identitarians, pundits, and charlatans, still reliably find good work: courageous investigative journalism; thoughtful, witty, and erudite reflections on complex cultural phenomena; heartfelt, perceptive essays exploring intricate ethical and social dilemmas, each piece the assiduously hewn product of countless hours of labor. And all of it eminently disposable, fated to be consumed and retweeted and referred to for a few hours then forgotten, like everything else passing through the self-devouring gullet of the ouroborosian media Leviathan we live within, picked up and dropped as we keep searching for newer confirmations of our half-articulated hopes and fears, more recent pictures to prove to us our world makes sense, fresher flags to wave telling other people who we are and that we exist. The internet’s total instrumentalization of thought, in which every shared #mustread is repurposed to accessorize an online persona, has created a constant demand for new content, a great vortex sucking every half-formed attitude and clever tweak into its gaping maw, which is also its anus, spewing and eating and spewing again in an interminable grotesque mockery of public intellectual life, Hegel’s world spirit as human centipede.

  Flee from the trashfire of the agora to the fluorescent-lit labyrinths of the ivory tower and you find yourself in another world, yet a recognizable one, in which the same faddishness and basic disposability holds sway as in the broader culture, but wrought in strange tongues and played out for much smaller audiences. Despite the vestigial guild system regulating entry into the academy, Sturgeon’s law—which proposes that 90% of everything is crap—still holds, and thought is as instrumentalized in the university as it is out in the marketplace: articles and books are mainly valued not for their wisdom, aesthetic qualities, rigor, or the information they hold about the world, but rather as entries on a CV, points toward promotion, testimony to the university’s glory, and provocations for further “knowledge production.”

  On the humanities side, at least, since that is what I am familiar with, specifically literature and English, all too often what one finds are conversations driven by relentless self-absorption in debates about the field and relentless demands for “ground-breaking” research agendas, neologisms, and epistemic revolutions, since these are the easiest ways for new scholars to distinguish themselves and for older scholars to stay excited about their profession. Each new crop of PhD students confronts a brutally competitive job market in which the individual odds for making a good career out of their long and arduous training are slim: thus the intellectual freedom and sense of discovery which may have drawn them to graduate school in the first place are displaced by powerful compulsions to produce consumer objects à la mode, conventional enough to be recognizable, different enough to be interesting, shaped to the needs of the market, forgettable and ultimately forgotten. And yet, despite the neoliberalization of the university, the casualization of academic labor, the ridiculous workload most teachers carry, the catastrophe of the academic job market, the herd-thinking that characterizes academic fads, and the all too real assault on public education, higher education, and the humanities coming from anti-intellectual Republicans and Koch-funded libertarian cadres, good scholarship still somehow gets done. But who has the time to read it?

  Across the spectrum of cultural production, the same phenomenon holds: our attention to any particular object of human thought or passion is fleeting, for we know as soon as we start reading one article that six more wait to replace it, and more behind those, more books, more TV shows, more movies, more reviews, more lists. Our relationship with the objects of human intellectual and artistic production is no longer characterized by sustained attention and reflection, but rather swift and relentless consumption: absorb as much as you can in a skim, post or repost it on social media, move on. You’re wondering right now how long this piece is, thinking about skipping ahead, anxious if and when I’m going to get to a 280-character takeaway you can tweet to your followers. The important thing is not the object or its effect on us, but rather what stance or opinion the object makes possible. In this way, what was once recognized as thought, facilitated in literate culture by the circulation of texts, has become something else: on one side a shifting array of shallow reactions and poses, on the other a virtual hive mind. The individual thinker, like the individual artist, has been subsumed into the relentless and sterile abundance of consumer capitalism, having become in the process either a pop idol whose output is nothing but variations on a brand, or just another worker bee.

  The point here is not to lament the loss of the individual, nor to speak ill of pop idols, though the anthropological implications of the way that the internet and mass society are transforming our conception of the individual and evacuating that interiority which is the sine qua non of the literate self are profound and still not yet fully articulated, but rather to ask what it all means for thought. What does it mean today to think? What is thinking good for?

  It sometimes seems as if the only socially valued forms of intellectual labor are the production of ideology (“think tanks”), the production of attention (“think pieces”), and the production of reproducible consumer objects (i.e., books, not necessarily for reading but for discussing, ideally big books with simple arguments that can be repeated ad nauseam across multiple platforms—think Steven Pinker or Malcolm Gladwell). Producing knowledge about the world is still compensated, if not as well as producing opinions, but when it comes to serious thought the situation seems bleak. Yes, “critical thinking” is still spoken of as a value in the humanities wing of the educational industry, even though it’s under profound attack across the culture and even if critique has by and large devolved into a set of rote gestures, but the arduous liberation of consciousness from dogma and self-imposed ignorance is as unwelcome today as it was in Socrates’s Athens. And even if you do find solid journalism, beautiful writing, profound analysis, or edifying thought, what do you do with it? Read it, tweet it, then move on to the next, and the next, and the next? A stream of language passes into you through a screen then back out through another screen, and can you even say it touched you? Were you even there? Or was it just a momentary shudder of the hive?

  Now that we’re here, though, I see that I’ve begun in medias res, or rather, in medias media, begging the question before I even asked it. We cannot ask what thought means today, that is, until we have some sense of what we mean by “thought.” Is it the same as critique, in a sense Kantian or Marxian or otherwise? Is it the love of wisdom? Learning to die? The sedulous apprehension of universal abstract forms of truth? The elaboration of mental architecture justifying otherwise meaningless lives? A game?

  The answer’s not immediately clear, though we might make some intuitive claims. We can be certain that thought has something to do with consciousness, and something to do with language, though what exactly the relation is in each case is not entirely transparent. Thought has its pre-conscious insight, its structures and images, its relations and systems; half the difficulty in thinking is putting thought into words. Yet thought is not reducible to language, much less to text. The argument Socrates makes against writing in Plato’s Phaedrus—that writing isn’t properly philosophical because a text always says the same thing to whoever questions it—suggests a way of approaching the question that sees thought not in terms of its product, the philosophical argument, but as a process, a dialectic, and considered from this angle, perhaps my initial concern over our relentless inattention to textual objects is misplaced. Thought, after all, does not inhere in ink and paper, much less in pixels on screens, but in the human social world. Yet the Phaedrus offers a paradox, because it is itself a text, and as Derrida demonstrated in his famous essay on Plato’s pharmakon, it is a rather ambiguous and dialectical text—as might be, Derrida argues, every text, since any given piece of writing is not a dead letter transmitting self-evident truth but a medium, and a medium not only connecting the present reader and absent writer, but connecting the thinking writer and language, logos, both in the writer’s time and in the reader’s: thought as social media.

  Thinking is indubitably a social activity, though the archetypal image of the thinker as hermit would suggest otherwise. In the apartness of the eremite image, whether it takes the form of Socrates standing ruminating in his sandals, Siddhartha sitting under his Bodhi tree, Rodin’s bent thinker perched pondering the gates of hell, or Hannah Arendt smoking over her desk, we find something essential represented, which is that while thought is undeniably a social activity, it is not continuous with social life. Thought is something that happens in a strange relation to society, at some distance from day-to-day human rhythms, in a different kind of time, attentive to other cares. The thinker is a public figure, no doubt, since to keep one’s thoughts private, unarticulated, unwritten, and unshared is to abjure the dialectical process of thought itself, the difficult translation of ideas and phenomena into the very fabric of sociality, language.

  But there remains about the thinker something isolate, some quality that sets her apart. This quality is not accidental but essential, for it is that which separates the thinker from the ideologue, the preacher, the “thought-leader,” and the sophist: it is the quality of dedication to thought itself, a refusal to accept thought’s subordination to other social values, even to society as such. It is a willing and deliberate self-estrangement from “what is known,” the unexamined and taken-for-granted premises upon which the collective imaginary structures we live within are founded. It is an effort to ask hard and perhaps unanswerable questions about our most sacred beliefs and our most obvious truths. Which is why the thinker remains alienated from society even as she walks within it, and why so many people find her threatening and annoying.

  What is thought, then? Thought is the willed suspension of thought. It is what the ancients called “pondering,” what Zen Buddhists call shikantaza, or “just sitting,” what Adorno called “negative dialectics,” and what German philosopher Peter Sloterdijk calls the suspension of “stress-semantic chains.” Thought is the opposite of a hot take, which channels an emotional reaction through a pre-existing pattern of rhetoric back into the ebb and flow of social meaning. Thought is a practice which the philosophical text assists through its demand for rigorous attention to language, but which is, as Socrates argued in Plato’s Phaedrus, never reducible to the text. It cannot be replicated, reproduced, remediated, or retweeted because it’s an event in a moment of social relation, nothing more, nothing less: a pause, a suspension, a temporary liberation from the psychosomatic chains that bind us to the collective dream we call reality, an opening in which new possibilities might emerge.

  So what’s thinking good for, then? What’s it good for today, or, frankly, ever? Why should we take the time to ponder complex and difficult arguments about abstract concerns, suspend our emotional and moral reactions to confusing and provocative claims, and cultivate our alienation from the 24/7 cycle of mediated outrage and despair that shapes so much contemporary social life?

  Or let’s put the question another way. Over the past thirty years, hundreds of books and thousands of articles have been written about the urgent, catastrophic threat that climate change poses to the world, exploring the problem in its technical, historical, ethical, and philosophical aspects. These books and articles have been talked about in the mainstream media, even on TV, and nearly everyone who has access to the internet today knows about sea level rise, the greenhouse effect, and the melting Arctic. Yet in spite of all this intellectual work, all this research and rhetoric and effort and thought, we seem unable to act coherently and collectively to address this grave existential threat. Part of the problem is the “we” here, because that “we” includes almost two hundred sovereign nations, each with its own political and economic agendas, various corporate entities whose very existence depends on perpetuating the extractive fossil-fueled capitalist economy that’s killing us, and an elite group of rich and powerful decision makers who believe that they will be protected from the danger by their wealth, regard flagrant waste and conspicuous consumption as status symbols, and are deeply invested in business as usual even if it means global apocalypse.

  And climate change is only the most egregious example. Think about our inadequate gun laws, the appalling regularity of school shootings, and the innumerable think pieces, personal essays, and legislative proposals that have been impotently put forth addressing the problem. Think about the outrageous persistence of systemic and ideological racism, founded in a notion of human difference thoroughly discredited by science more than a century ago, fought against by brave people who risked their lives for basic dignity and justice, and argued against by some of America’s most brilliant thinkers. Think about the ongoing stupidity of our war in Afghanistan, the stubborn persistence of aggressive sexism, our soul-sucking addiction to our phones, the rise of Trumpism, the opioid epidemic, et cetera. If you take all the seemingly intractable ills of modern life and compare them against the vigorous, dedicated, earnest efforts made by countless talented and educated thinkers, writers, journalists, policy wonks, scholars, activists, students, and artists to address them, you might be forgiven if the conclusion you come to is one of puzzled despair: we seem incapable of listening to reason.

  In the late eighteenth century, German philosopher Immanuel Kant articulated an ideal of collective, self-conscious rational self-determination that remains one of the noblest achievements of human thought. In his famous essay “What Is Enlightenment?”, written in the years between the signing of the American Declaration of Independence and the ratification of the US Constitution, Kant argues that free thought leads to better government, writing: “Free thought gradually acts upon the mind of the people and they gradually become more capable of acting in freedom. Eventually, the government is also influenced by this free thought and there it treats man, who is now more than a machine, according to his dignity.” It’s a beautiful concept: the idea that we can all come together as equals and, through the free use of our reason and open discussion, not only decide what is best for us as a group and how to live together, but also achieve the highest fulfillment of our personal and collective existence. This concept undergirds our idea of the public sphere, what some call the marketplace of ideas, the value we ostensibly place on public education and higher education, our notions of citizenship, American civic religion, and the root of what we understand thinking—reason—writing—to be good for. Thinking is good, we tend to assume, because it helps us make life better. Thinking is good, we tend to believe, because it makes us free.

  Yet everywhere we look today, from Twitter to the White House to Raqqa to the melting Arctic, human reason stands defeated. Our seemingly rational decisions turn out to have fatal consequences we never anticipated, free and informed public discourse has given way to propaganda, lies, harassment, and censorship, and many free citizens of open democracies no longer see the value in making judgments based on evidence, but rather seek evidence only to confirm their pre-existing judgments, heedless of whether they are accurate or erroneous. Pizzagate? Pee tape? Russian hackers? Vaccines? Who knows? The only thing you can be sure of is that the other side is lying.
  Kant’s motto for the Enlightenment was Sapere Aude!: “Dare to Know!” The motto for twenty-first-century America seems to be taken from Weird Al Yankovic’s song, “Dare to Be Stupid.” And from a certain point of view, it’s not wholly irrational to choose to be irrational. After all, the Enlightenment agenda of rational control over the human and non-human world is exactly what has led to most of our modern ills, from the hectic, Taylorized grind of our spiritually empty lives, relieved only by binge-watching Netflix, to our dependence on pills to stave off depression, from climate change, mass extinction, and an increasingly toxic and degraded environment to nuclear war. What’s more, as Friedrich Nietzsche realized in the late nineteenth century, radical free thought leads ultimately not to an increase in human dignity, as Kant argued, but to nihilism and apocalyptic violence. Philosophy has struggled from the beginning to make sense of the Enlightenment’s internal contradictions, and those contradictions have only grown. In the words of Theodor Adorno and Max Horkheimer, written during World War II but holding just as true today as they did then, “The wholly enlightened earth is radiant with triumphant calamity.”

  Alas, the Enlightenment’s humanistic ideal was a delusion from the beginning, premised on a deliberate misapprehension of empiricism that built a giant loophole into what otherwise seems to be a wholly deterministic universe. The great problem empiricism poses, articulated variously by Hume, Spinoza, and La Mettrie, is that once you accept the underlying premise that the universe in which we live operates by predictable mechanisms (or “laws”) which can be quantified and modeled, you lose any basis for free will. A truly deterministic universe is mechanistic turtles all the way down, and there’s no justified exception for human consciousness. The best one can do with any kind of intellectual integrity is maintain a faith in generative chaos, or bracket the question of whether free will actually exists at all and assert that we should believe in it regardless, neither of which—generative chaos or a pragmatist will to believe—comes close, you’ll notice, to anything like our actual experience of freedom, which when you pay attention to it looks less like reason and more like rationalization. We act, and only after come up with the reasons why. Kant, following Descartes, dispensed with the problem by a clever bit of sophistry and an abiding faith in a Christian God. Reason offered Kant an escape from determinism because he held that reason comes from God, connects to God, in some way is God. The Enlightenment wasn’t so much the triumph of reason as the subordination of scientific empiricism to Christian metaphysics.

 
