Out of Our Minds


by Felipe Fernandez-Armesto


  For anyone who wanted to go on believing in the tottering or fallen idols of the past – progress, reason, certainty – the 1930s were a bad time. The Western world, where such beliefs had once seemed sensible, lurched through apparently random, unpredictable crises: crash, slump, Dust Bowl, social violence, mounting crime, the menace of recurrent war, and, above all, perhaps, the conflict of irreconcilable ideologies that fought each other into extinction.

  With the Second World War, ideas that were already slithering out of touch became untenable. It dwarfed the destructiveness of every earlier war. Bombs incinerated huge cities. Ideological and racial bloodlust provoked deliberate massacres. The deaths mounted to over thirty million. Industry produced machines for mass killing. Science twisted into racial pseudo-science. Evolution evolved into a justification for winnowing the weak and unwanted. Old ideals morphed murderously. Progress now took the form of racial hygiene; utopia was a paradise from which enemies were gutted and flung; nationalism turned into a pretext for sanctifying hatred and vindicating war; socialism turned, too, like a mangle for squeezing and crushing individuals.

  Nazis, who blamed Jews for the ills of society, set out calculatingly to get rid of them, herding them into death camps, driving them into sealed rooms, and gassing them to death. Pointless cruelty accompanied the Holocaust: millions enslaved, starved, and tortured in so-called scientific experiments. War or fear ignited hatred and numbed compassion. Scientists and physicians in Germany and Japan experimented on human guinea pigs to discover more efficient methods of killing. The atrocities showed that the most civilized societies, the best educated populations, and the most highly disciplined armies were unimmunized against barbarism. No case of genocide quite matched the Nazi campaign against the Jews, but that was not for want of other attempts. Experience of the Nazi death camps was too horrific for art or language to convey, though one gets, perhaps, a faint sense of the evil from photographs that show death-camp guards heaping up brutalized, emaciated corpses in the last weeks of the war: it was a desperate attempt to exterminate the survivors and destroy the evidence before the Allies arrived. The guards dismantled the incinerators, leaving starved, typhus-ridden cadavers to litter the ground or rot in shallow graves. Primo Levi, author of one of the most vivid memoirs, tried to encode memories of mass murder in sketches of individual suffering – of a woman, for instance, ‘like a frog in winter, with no hair, no name, eyes empty, cold womb’. He begged readers to carve his images ‘in your hearts, at home, in the street, going to bed, rising. Repeat them to your children.’12

  Governments and institutions of public education joined the struggle to keep the memory of the Holocaust and other atrocities alive. We know how deficient human memories are (see here), except, perhaps, in being adept at forgetting. The strange psychological quirk known as ‘Holocaust denial’ became widespread in the late-twentieth-century West: a refusal to accept the rationally incontrovertible evidence of the scale of Nazi evil. Many European countries tried to control the deniers by outlawing their utterances. Most people who thought about it drew obvious lessons from obvious facts: civilization could be savage. Progress was, at best, unreliable. Science had no positive effect on morals. The defeat of Nazism hardly seemed to make the world better. Revealed bit by gruesome bit, the even more massive scale of inhumanity in Stalin’s Russia undermined faith in communism, too, as a solution to the world’s problems.

  Meanwhile, science had a redemptive claim to stake: it helped to bring the war against Japan to an end. In August 1945, American planes dropped atom bombs on Hiroshima and Nagasaki, virtually obliterating them, killing over 220,000 people, and poisoning the survivors with radiation. But how much credit did science deserve? The individuals who took part in making and ‘delivering’ the bomb struggled with conscience – including William P. Reynolds, the Catholic pilot commemorated in the chair I occupy at the University of Notre Dame, and J. Robert Oppenheimer, the mastermind of atomic war research, who retreated into mysticism.13 A gap glared between technology’s power to deliver evil and people’s moral capacity to resist it.

  ‌From Existentialism to Postmodernism

  New or ‘alternative’ ideas offered refuge for the disillusioned. Oppenheimer turned to readings in Hindu texts. He set, as we shall see, a trend for the rest of the century in the West. To most seekers of relief from failed doctrines, existentialism was even more attractive. It was an old but newly fashionable philosophy that thinkers in Frankfurt – the ‘Frankfurt School’ in current academic shorthand – had developed in the 1930s and 1940s, as they struggled to find alternatives to Marxism and capitalism. They identified ‘alienation’ as the great problem of society, as economic rivalries and short-sighted materialism sundered communities, and left restless, unrooted individuals. Martin Heidegger, the tutelary genius of the University of Marburg, proposed that we can cope by accepting our existence between birth and death as the only immutable thing about us; life could then be tackled as a project of self-realization, of ‘becoming’. Who we are changes as the project unfolds. Individuals, Heidegger contended, are the shepherds, not the creators or engineers, of their own identity. By 1945, however, Heidegger had become tainted by his support for Nazism, and his sensible observations were largely ignored. It fell to Jean-Paul Sartre to relaunch existentialism as a ‘new creed’ for the postwar era.

  ‘Man’, Sartre said, ‘is only a situation’ or ‘nothing else but what he makes of himself … the being who hurls himself toward a future and who is conscious of imagining himself as being in the future.’ Self-modelling was not just a matter of individual choice: every individual action was ‘an exemplary act’, a statement about the sort of species we want humans to be. Yet, according to Sartre, no such statement can ever be objective. God does not exist; everything is permissible, and ‘as a result man is forlorn, without anything to cling to … If existence really does precede essence, there is no explaining things away by reference to a fixed … human nature. In other words, there is no determinism, man is free, man is freedom.’ No ethic is justifiable except acknowledgement of the rightness of this.14 In the 1950s and 1960s, Sartre’s version of existentialism fed the common assumptions of the educated young Westerners whom the Second World War left in command of the future. Existentialists could barricade themselves in self-contemplation: a kind of security in revulsion from an uglified world. Critics who denounced them for decadence were not far wrong in practice: we who were young then used existentialism to justify every form of self-indulgence as part of a project of ‘becoming’ oneself – sexual promiscuity, revolutionary violence, indifference to manners, drug abuse, and defiance of the law were characteristic existentialist vices. Without existentialism, ways of life adopted or imitated by millions, such as beat culture and 1960s permissiveness, would have been unthinkable. So, perhaps, would the late twentieth century’s libertarian reaction against social planning.15

  Of course, not every thinker cowered in egotism, succumbed to philosophies of disillusionment, or surrendered faith in objectively verifiable certainties. Survivors and disciples of the Frankfurt School’s pre-war Viennese rivals were pre-eminent among the enemies of doubt. They fought a long rearguard action on behalf of what they called ‘logical positivism’, which amounted to reaffirmed faith in empirical knowledge and, therefore, in science. I recall watching Freddie Ayer, the Oxford don who was the public face and voice of logical positivism, denouncing the vacuity of metaphysics on television (which was still, in those days, an intelligent and educational medium). In the United States, John Dewey and his followers tried to revive pragmatism as a practical way of getting on with the world, reformulating it in an attempt to fillet out the corrosive relativism in William James’s version (see here).

  A challenge to positivism came from one of its heretical disciples. ‘Van’ Quine was a Midwesterner who detested nonsense. He inherited some of the pragmatism that made the United States great: he wanted philosophy to work in the real, physical, or, as he said, ‘natural’ world. He penetrated Plato’s cave, as all philosophy students do, and left it without seeing anything except vapid speculations about unverifiable claims. He was typical of the 1930s, when he started as a professional philosopher: he bowed to science as the queen of the academy and wanted philosophy to be scientific, rather in the way many historians and sociologists wanted to practise ‘social sciences’; like other venerators of scientific means to truth, Quine reeled from indeterminacy and recoiled from intuitive thinking. He talked a pared-down vocabulary, from which words he thought toxically vague, such as ‘belief’ and ‘thought’, were cut like cancers, or preserved, like bacilli in dishes or jars, for use as figures of speech. Ideally, one feels, he would have liked to limit communication to sentences expressible in symbolic logical notation. Positivism attracted him, perhaps, because it exalted demonstrable facts and empirical tests. He likened ‘the flicker of a thought’ to ‘the flutter of an eyelid’ and ‘states of belief’ to ‘states of nerves’.16 But the positivists were too indulgent, for his taste, of supposed truths unsusceptible to scientific testing. In two papers he read to fellow-philosophers in 1950, he demolished the basis on which positivists had admitted universal propositions: that, though they could not be proven, they were matters of definition, or usage, or ‘meaning’ – another term he deplored. In the classic example, you can assent to ‘All bachelors are unmarried’ because of what the words mean, whereas you cannot assent to ‘Cliff Richard is a bachelor’ without evidence. Quine condemned the distinction as false. At the core of his argument was his dismissal of ‘meaning’: ‘bachelor’ is a term that stands for ‘unmarried man’ in the sentence in question but is meaningless on its own.

  Why did Quine’s argument matter? It led him to a new way of thinking about how to test the truth of any proposition by relating it to the whole of experience and judging whether it makes sense in or helps us understand the material world. Few readers, however, followed the later stages of his journey. Most inferred one of two mutually contradictory conclusions. Some turned to science to justify such universal statements as can be subjected to sufficient if not conclusive tests, such as the laws of physics or the axioms of mathematics. Others abandoned metaphysics altogether on the grounds that Quine had shown the impossibility of formulating a proposition that is necessarily or inherently true. Either way, science seemed to take over philosophy, like a monopolist cornering the market in truth.17

  Philosophers of language, however, made the projects of positivism and its offshoots seem shallow and unsatisfactory. Ludwig Wittgenstein’s work was emblematic. He was an unruly disciple of Bertrand Russell. He staked his claim to independence in Russell’s seminar at Cambridge University by refusing to admit that there was ‘no hippopotamus under the table’.18 Russell found his intellectual perversity exasperating but admirable. It was a young contrarian’s way of abjuring logical positivism. Wittgenstein went on to evince brilliance unalloyed by knowledge: his method was to think problems out without encumbering his mind by reading the works of the reputable dead.

  In 1953, Wittgenstein’s Philosophical Investigations was published posthumously. The printed pages still have the flavour of lecture notes. But unlike Aristotle and Saussure, Wittgenstein was his own recorder, as if in distrust of his students’ ability to catch his meaning accurately. He left unanswered questions he anticipated from the audience and many dangling prompts and queries to himself. A potentially annihilating virus infected his work. ‘My aim’, Wittgenstein told students, ‘is to teach you to pass from a piece of disguised nonsense to something that is patent nonsense.’ He argued convincingly that we understand language not because it corresponds to reality but because it obeys rules of usage. Wittgenstein imagined a student asking, ‘So you are saying that human agreement decides what is true and what is false?’ And again, ‘Aren’t you at bottom really saying that everything except human behaviour is a fiction?’ These were forms of scepticism William James and Ferdinand de Saussure had anticipated. Wittgenstein tried to distance himself from them: ‘If I do speak of a fiction, it is of a grammatical fiction.’ As we have seen, however, with Poincaré and Gödel, the impact of a writer’s work often exceeds his intention. When Wittgenstein drove a wedge into what he called ‘the model of object and name’, he parted language from meaning.19

  A few years later, Jacques Derrida became Saussure’s most radical interpreter. He was an ingenious thinker whom provincial exile in an unprestigious position turned into a méchant, if not an enragé. In Derrida’s version of Saussure, reading and misreading, interpretation and misinterpretation are indistinguishable twins. The terms of language refer not to any reality that lies beyond them but only to themselves. Because meanings are culturally generated, we get trapped in the cultural assumptions that give meaning to the language we use. In the interests of political correctness, strident programmes of linguistic reform accompanied or followed Derrida’s insight: demands, for instance, to forgo, even in allusion to historical sources, historically abused terms or epithets, such as ‘cripple’ or ‘negro’ or ‘midget’ or ‘mad’; or to impose neologisms, such as ‘differently abled’ or ‘persons of restricted growth’; or the feminist campaign to eliminate terms of common gender (like ‘man’ and ‘he’) on the grounds that they resemble those of masculine gender and imply a prejudice in favour of the male sex.20

  What came to be called postmodernism was more, however, than a ‘linguistic turn’. Language malaise combined with scientific uncertainty to foment distrust in the accessibility – and even the reality – of knowledge. Distressing events and new opportunities provoked revulsion from modernism: the war, genocide, Stalinism, Hiroshima, the tawdry utopias created by the architectural modern movements, the dreariness of the overplanned societies Europeans inhabited in the postwar years. The alienated had to reclaim culture: the breakneck technology of electronically generated entertainment helped them do it.

  In part, against this background, postmodernism looks like a generational effect. The baby boomers could repudiate a failed generation and embrace sensibilities suited to a postcolonial, multicultural, pluralistic era. The contiguities and fragility of life in a crowded world and a global village encouraged or demanded multiple perspectives, as neighbours adopted or sampled each other’s points of view. Hierarchies of value had to be avoided, not because they are false but because they are conflictive. A postmodern sensibility responds to the elusive, the uncertain, the absent, the undefined, the fugitive, the silent, the inexpressible, the meaningless, the unclassifiable, the unquantifiable, the intuitive, the ironic, the inexplicit, the random, the transmutative or transgressive, the incoherent, the ambiguous, the chaotic, the plural, the prismatic: whatever hard-edged modern sensibilities cannot enclose. Postmodernism, according to this approach, arose in accordance with its own predictions about other ‘hegemonic’ ways of thought: it was the socially constructed, culturally engineered formula imposed by our own historical context. In famous lines, Charles Baudelaire defined the modern as ‘the ephemeral, the fugitive, the contingent, the half of art whose other half is the eternal and the immutable’. It is tempting to adapt this phrase and say that the postmodern is the ephemeral, fugitive, and contingent half of the modern, whose other half is the eternal and the immutable.21

  Specific events of the 1960s helped postmodernism crystallize. Students became aware that the prevailing scientific picture of the cosmos was riven by contradictions and that, for example, relativity theory and quantum theory – the most prized intellectual achievements of our century – could not both be correct. The work of Jane Jacobs voiced disillusionment with the modern vision of utopia, embodied in architecture and urban planning.22 Thomas Kuhn and chaos theory completed the scientific counter-revolution of our century. The ordered image of the universe inherited from the past was replaced by the image we live with today: chaotic, contradictory, full of unobservable events, untrackable particles, untraceable causes, and unpredictable effects. The contribution of the Catholic Church – the world’s biggest and most influential communion – is not often acknowledged. But in the Second Vatican Council, the formerly most confident human repository of confidence dropped its guard: the Church licensed liturgical pluralism, showed unprecedented deference to multiplicity of belief, and compromised its structures of authority by elevating bishops closer to the pope and the laity closer to the priesthood.

  The result of this combination of traditions and circumstances was a brief postmodern age, which convulsed and coloured the worlds of academia and the arts and – in as far as civilization belongs to intellectuals and artists – deserved to be inserted into the roll call of periods into which we divide our history. And yet, if there has been a postmodern age, it seems to have been suitably evanescent. In the 1990s and after, the world passed rapidly from postmodernism to ‘postmortemism’. Ihab Hassan, the literary critic whom postmodernists hailed as a guru, recoiled in ennui and denounced his admirers for taking ‘the wrong turn’.23 Jean-François Lyotard, the Derrida disciple and philosophical farceur who was another postmodernist hero, turned with a shrug or a moue, telling us – ironically, no doubt – that it was all a joke. Derrida himself rediscovered the virtues of Marxism and embraced its ‘spectres’. Redefinitions of postmodernism by the astonishing polymath Charles Jencks (whose work as a theorist and practitioner of architecture helped popularize the term in the 1970s) gutted some supposedly defining features: he proposed reconstruction to replace deconstruction, excoriated pastiche, and rehabilitated canonical modernists in art, architecture, and literature. Many postmodernists seem to have yielded something to ‘the return of the real’.24

  ‌The Crisis of Science

  Disenchantment with science deepened. ‘Modern societies’, according to the French geneticist Jacques Monod in 1970, ‘have become as dependent on science as an addict on his drug.’25 Addicts can break their habits. In the late twentieth century, a breaking point approached.

 
