Fundamentalism is modern: recent in origin, even more recent in appeal. Counterintuitive as these assertions may seem, it is not hard to see why fundamentalism arose and throve in the modern world and has never lost its appeal. According to Karen Armstrong, one of the foremost authorities on the subject, it is also scientific, at least in aspiration, because it treats religion as reducible to matters of incontrovertible fact.60 It is charmless and humdrum – religion stripped of enchantment. It represents modernity, imitates science, and reflects fear: fundamentalists express fear of the end of the world, of ‘Great Satans’ and ‘Antichrists’, of chaos, of the unfamiliar, and above all of secularism.
Though fundamentalists belong to different traditions, their shared excesses make them recognizable: militancy, hostility to pluralism, and a determination to confuse politics with religion. Militants among them declare war on society. Yet most fundamentalists are nice, ordinary people who make do with the wicked world and, like most of the rest of us, leave religion at the door of their church or mosque.
Nonetheless, fundamentalism is pernicious. Doubt is a necessary part of any deep faith. ‘Lord, help my unbelief’ is a prayer from Mark’s Gospel that every intellectual Christian should make his or her own. Anyone who denies doubt should hear the cock crow thrice. Reason is a divine gift; to suppress it – as the eighteenth-century Muggletonian Protestants did in the belief that it is a diabolical trap – is a kind of intellectual self-amputation. Fundamentalism, which demands a closed mind and the suspension of critical faculties, therefore seems irreligious to me. Protestant fundamentalism embraces an obvious falsehood: that the Bible is unmediated by human hands and weaknesses. Fundamentalists who read justifications of violence, terrorism, and bloodily enforced moral and intellectual conformity in their Bible or Qur’an wantonly misconstrue their own sacred texts. There are fundamentalist sects whose ethic of obedience, paranoid habits, crushing effects on individual identity, and campaigns of hatred or violence against supposed enemies recall early fascist cells. If and when they get power, they make life miserable for everybody else. Meanwhile, they hunt witches, burn books, and spread terror.61
Fundamentalism undervalues variety. An equal and opposite response to uncertainty is religious pluralism, which had a similar, century-long story in the twentieth century. Swami Vivekananda, the great spokesman of Hinduism and apostle of religious pluralism, uttered the call before he died in 1902, when the collapse of certainty was still unpredictable. He extolled the wisdom of all religions and recommended ‘many ways to one truth’. The method has an obvious advantage over relativism: it encourages diversification of experience – which is how we learn and grow. It overtrumps relativism in its appeal to a multicultural, pluralistic world. Yet for people committed to a particular religion, it represents a fatal concession to secularism: if there is no reason to prefer one religion over the others, why should purely secular philosophies not be equally good ways to follow? On the ride along the multi-faith rainbow, why not add more gradations of colour?62
Where religions dominate, they become triumphalist. In retreat, they perceive the advantages of ecumenism. Where they rule, they may persecute; where they are persecuted, they clamour for toleration. After losing nineteenth-century struggles against secularism (see here), rival Christian communions began to evince a desire for ‘wide ecumenism’, bringing together people of all faiths. The Edinburgh Conference of 1910, which attempted to get Protestant missionary societies to co-operate, launched the summons. The Catholic Church remained aloof even from Christian ecumenism until the 1960s, when shrinking congregations induced a mood of reform. In the late twentieth century, extraordinary ‘holy alliances’ confronted irreligious values, with Catholics, Southern Baptists, and Muslims, for instance, joining forces to oppose relaxation of abortion laws in the United States, or collaborating in attempts to influence the World Health Organization’s policy on birth control. Interfaith organizations worked together to promote human rights and restrain genetic engineering. Ill-differentiated faith opened new political niches for public figures willing to speak for religion or bid for religious voters. US President Ronald Reagan, unaware, it seems, of the self-undermining nature of his recommendation, urged his audience to have a religion but thought it did not matter which. The Prince of Wales proposed himself in the role of ‘Defender of Faith’ in multicultural Britain.
Religious pluralism has an impressive recent past. But can it be sustained? The scandal of religious hatreds and mutual violence, which so disfigured the past of religions, appears conquerable. With every inch of common ground religions find between them, however, the ground of their claims to unique value shrinks.63 To judge from events so far in the twenty-first century, intrafaith hatreds are more powerful than interfaith love-ins. Shiite and Sunni dogmatists massacre each other. Liberal Catholics seem on most social questions to have more in common with secular humanists than with their ultramontane coreligionists or with conservative Protestants. Muslims are the victims of a Buddhist jihad in Myanmar. Christians face extermination or expulsion by Daeshist fanatics in parts of Syria and Iraq. Wars of religion keep spreading ruin and scattering ban, like Elizabeth Barrett Browning’s cack-hoofed god, to the bafflement of the secular gravediggers who thought they had buried the gods of bloodlust long ago.
Religious pluralism has secular counterparts, with similarly long histories. Even before Franz Boas and his students began accumulating evidence for cultural relativism, early signs of a possible pluralist future emerged in a place formerly outside the mainstream. Cuba seemed behind the times for most of the nineteenth century: slavery lingered there. Independence was often proposed and always postponed. But with Yankee help the revolutionaries of 1898 finally cracked the Spanish Empire and resisted US takeover. In newly sovereign Cuba, intellectuals faced the problem of extracting a nation out of diverse traditions, ethnicities, and pigments. Scholarship – first by the white sociologist Fernando Ortiz, then, increasingly, by blacks – treated black cultures on terms of equality with white. Ortiz began to appreciate blacks’ contribution to the making of his country when he interviewed prison internees in an attempt to profile criminals. Coincidentally, as we have seen, in the United States and Europe, white musicians discovered jazz and white artists began to esteem and imitate ‘tribal’ art. Among intellectuals from France’s colonies, gathered in Paris in the 1930s, ‘Négritude’ found brilliant spokesmen in Aimé Césaire and Léon Damas. The conviction grew and spread that blacks were the equals of whites – maybe in some ways their superiors or, at least, predecessors – in all the areas of achievement traditionally prized in the West. The discovery of black genius stimulated independence movements in colonized regions. Civil rights campaigners suffered and strengthened in South Africa and the United States, where blacks were still denied equality under the law, and wherever racial prejudice and residual forms of social discrimination persisted.64
In a world where no single system of values could command universal reverence, universal claims to supremacy crumbled. The retreat of white empires from Africa in the late 1950s and 1960s was the most conspicuous result. Archaeology and palaeoanthropology adjusted to postcolonial priorities, unearthing reasons for rethinking world history. Tradition had placed Eden – the birthplace of humankind – at the eastern extremity of Asia. There was nowhere east of Eden. By placing the earliest identifiably human fossils in China and Java, early-twentieth-century science seemed to confirm this risky assumption. But it was wrong. In 1959, Louis and Mary Leakey found remains of a tool-making, manlike creature 1.75 million years old in Olduvai Gorge in Tanzania. Their find encouraged Robert Ardrey in a daring idea: humankind evolved uniquely in East Africa and spread from there to the rest of the world. More Kenyan and Tanzanian ancestors appeared. The big-brained Homo habilis emerged in the early 1960s. In 1984 a skeleton of a later hominid, Homo erectus, showed that hominids of a million years ago had bodies so like those of modern people that one might hardly blink to share a bench or a bus-ride with a million-year-old revenant. Even more humbling was the excavation Donald Johanson made in Ethiopia in 1974: he called his three-million-year-old bipedal hominid ‘Lucy’ in allusion to ‘Lucy in the Sky with Diamonds’, a song then popular in praise of lysergic acid, which induced cheap hallucinations: that is how mind-bending the discovery seemed at the time. The following year basalt tools two and a half million years old turned up nearby. Footprints of bipedal hominids, dating back 3.7 million years, followed in 1977. Archaeology seemed to vindicate Ardrey’s theory. While Europeans retreated from Africa, Africans edged Eurocentrism out of history.
Most nineteenth-century theorists favoured ‘unitary’ states, with one religion, ethnicity, and identity. In the aftermath of imperialism, however, multiculturalism was essential for peace. Redrawn frontiers, uncontainable migrations, and proliferating religions made uniformity unattainable. With racism superannuated, homogenizing projects became practically unrealizable and morally indefensible. States that still strove for ethnic purity or cultural consistency faced traumatic periods of ‘ethnic cleansing’ – the standard late-twentieth-century euphemism for trails of tears and pitiless massacre. Meanwhile, rival ideologies competed in democracies, where the only way of keeping peace between them was political pluralism – admitting parties with potentially irreconcilable views to the political arena on equal terms.
Large empires have always encompassed different peoples with contrasting ways of life. Usually, however, each has had a dominant culture, alongside which others are tolerated. In the twentieth century, mere toleration would no longer be enough. Enmity feeds dogmatism: you can only insist on the unique veracity of your opinions if an adversary disputes them. If you want to rally adherents for irrational claims, you need a foe to revile and fear. But in a multi-civilizational world, composed of multicultural societies, shaped by massive migrations and intense exchanges of culture, enmity is increasingly unaffordable. We need an idea that will yield peace and generate co-operation. We need pluralism.
In philosophy, pluralism means the doctrine that monism and dualism (see here and here) cannot encompass reality. This claim, well documented in antiquity, has helped to inspire a modern conviction: that a single society or a single state can accommodate, on terms of equality, a plurality of cultures – religions, languages, ethnicities, communal identities, versions of history, value systems. The idea grew up gradually. Real experience instanced it before anyone expressed it: almost every big conquest-state and empire of antiquity, from Sargon’s onward, exemplified it. The best formulation of it is usually attributed to Isaiah Berlin, one of the many nomadic intellectuals whom twentieth-century turbulence scattered across the universities of the world – in his case from his native Latvia to an honoured place in Oxford common rooms and London clubs. ‘There is’, he explained,
a plurality of values which men can and do seek, and … these values differ. There is not an infinity of them: the number of human values, of values that I can pursue while maintaining my human semblance, my human character, is finite – let us say 74, or perhaps 122, or 26, but finite, whatever it may be. And the difference it makes is that if a man pursues one of these values, I, who do not, am able to understand why he pursues it or what it would be like, in his circumstances, for me to be induced to pursue it. Hence the possibility of human understanding.
This way of looking at the world differs from cultural relativism: pluralism does not, for instance, have to accommodate obnoxious behaviour, or false claims, or particular cults or creeds that one might find distasteful: one might exclude Nazism, say, or cannibalism. Pluralism does not proscribe comparisons of value: it allows for peaceful argument about which culture, if any, is best. It claims, in Berlin’s words, ‘that the multiple values are objective, part of the essence of humanity rather than arbitrary creations of men’s subjective fancies’. It helps make multicultural societies conceivable and viable. ‘I can enter into a value system which is not my own’, Berlin believed. ‘For all human beings must have some common values … and also some different values.’65
Ironically, pluralism has to accommodate anti-pluralism, which still abounds. In revulsion from multiculturalism in the early years of the twenty-first century, policies of ‘cultural integration’ attracted votes in Western countries, where globalization and other huge, agglutinative processes made most historic communities defensive about their own cultures. Persuading neighbours of contrasting cultures to coexist peacefully got harder everywhere. Plural states seemed fissile: some split violently, like Serbia, Sudan, and Indonesia. Others experienced peaceful divorces, like the Czech Republic and Slovakia, or renegotiated the terms of cohabitation, like Scotland in the United Kingdom or Catalonia and Euskadi in Spain. Still, the idea of pluralism endured, because it promises the only practical future for a diverse world. It is the only truly uniform interest that all the world’s peoples have in common. Paradoxically, perhaps, pluralism is the one doctrine that can unite us.66
Prospect
The End of Ideas?
Memory, imagination, and communication – the faculties that generated all the ideas covered in the book so far – are changing under the impact of robotics, genetics, and virtual socialization. Will our unprecedented experience provoke or facilitate new ways of thinking and new thoughts? Will it impede or extinguish them?
I am afraid that some readers may have started this book optimistically, expecting the story to be progressive and the ideas all to be good. The story has not borne such expectations out. Some of the findings that have unfolded, chapter by chapter, are morally neutral: that minds matter; that ideas, not environment or economics or demography (though all of these condition what happens in our minds), are the driving force of history; that ideas, like works of art, are products of imagination. Other conclusions subvert progressive illusions: a lot of good ideas are very old and bad ones very new; ideas are effective not because of their merits, but because of circumstances that make them communicable and attractive; truths are less potent than the falsehoods people believe; the ideas that come out of our minds can make us seem out of our minds.
God protect me from the imps of optimism, whose tortures are subtler and more insidious than the predictable miseries of pessimism. Optimism is almost always a traitress. Pessimism indemnifies you against disappointment. Many, perhaps most, ideas are evil or delusive or both. One reason why there are so many ideas to tell of in this book is that every idea successfully applied has unforeseen consequences that are often malign and require more thinking in response. The web creates cyber-ghettoes in which the like-minded shut out or ‘unfriend’ opinions other than their own: if the habit spreads far enough, it will cut users off from dialogue, debate, and disputation – the precious sources of intellectual progress. The biggest optimists are so deeply self-traduced as to defy satire, imagining a future in which humans have genetically engineered themselves into immortality, or download consciousness into inorganic machines to protect our minds from bodily decay, or zoom through wormholes in space to colonize worlds we have, as yet, had no chance to despoil or render uninhabitable.1
Still, some pessimism is excessive. According to the eminent neuroscientist Susan Greenfield, the prospects for the future of the human mind are bleak. ‘Personalization’, she says, converts brain into mind. It depends on memories uneroded by technology and experience unimpeded by virtuality. Without memories to sustain our narratives of our lives and real experiences to shape them, we shall stop thinking in the traditional sense of the word and re-inhabit a ‘reptilian’ phase of evolution.2 Plato’s character Thamus expected similar effects from the new technology of his time (see here). His predictions proved premature. Greenfield is, perhaps, right in theory, but, on current showing, no machine is likely to usurp our humanity.
Artificial intelligence is not intelligent enough or, more exactly, not imaginative or creative enough to make us resign thinking. Tests for artificial intelligence are not rigorous enough: it does not take intelligence to pass the Turing test – impersonating a human interlocutor – or to win a game of chess or general knowledge. You will know that intelligence is artificial only when your sexbot says, ‘No.’ Virtual reality is too shallow and crude to make many of us abandon the real thing. Genetic modification is potentially powerful enough – under a sufficiently malign and despotic elite – to create a lumpen race of slaves or drones with all critical faculties excised. But it is hard to see why anyone should desire such a development, outside the pages of apocalyptic sci-fi, or expect its conditions to be fulfilled. In any case, a cognitive master class would still be around to do the plebs’ thinking for them.
So, for good and ill, we shall go on having new thoughts, crafting new ideas, devising innovative applications. I can envisage, however, an end to the acceleration characteristic of the new thinking of recent times. If my argument is right, and ideas multiply in times of intense cultural interchange, whereas isolation breeds intellectual inertia, then we can expect the rate of new thinking to slacken if exchanges diminish. Paradoxically, one of the effects of globalization will be diminished exchange, because in a perfectly globalized world, cultural interchange will erode difference and make all cultures increasingly like each other. By the late twentieth century, globalization was so intense that it was almost impossible for any community to opt out: even resolutely self-isolated groups in the depths of the Amazon rainforest found it hard to elude contact or withdraw from the influence of the rest of the world once contact was made. The consequences included the emergence of global culture – more or less modelled on the United States and Western Europe, with people everywhere wearing the same clothes, consuming the same goods, practising the same politics, listening to the same music, admiring the same images, playing the same games, crafting and discarding the same relationships, and speaking or trying to speak the same language. Of course, global culture has not displaced diversity. It is like a beekeeper’s mesh, under which a lot of culture pullulates. Every agglutinative episode provokes reactions, with people reaching for the comfort of tradition and trying to conserve or revive threatened or vanished lifeways. But over the long term, globalization does and will encourage convergence. Languages and dialects disappear or become subjects of conservation policies, like endangered species. Traditional dress and arts retreat to margins and museums. Religions expire. Local customs and antiquated values die or survive as tourist attractions.
Out of Our Minds Page 50