Out of Our Minds


by Felipe Fernández-Armesto


  At first, the impact was barely perceptible. Gradually, however, in the nineteenth and twentieth centuries, growing numbers of men came to favour feminism as a justification for reincorporating women into the labour market and so exploiting their productivity more effectively. After two world wars had demonstrated the need for and efficacy of women’s contributions in realms formerly reserved for men, or in which men were privileged, it became fashionable for people of both sexes to extol viragos and acclaim women’s suitability for demanding jobs still thought of as ‘men’s work’. Simone de Beauvoir, Jean-Paul Sartre’s undomesticable paramour, had launched a new message one day in 1946, when, she said, ‘I began to reflect about myself and it struck me with a sort of surprise that the first thing I had to say was, “I am a woman”.’51 In the second half of the century, at least in the West and among communities elsewhere whom Western intellectual fashions influenced, the idea became current that women could discharge the responsibilities of leadership in every field, not because they were like men, but because they were human – or even, in the minds of feminists we might call extremists, because they were female.

  Some feminists claimed to be able to force men to change the rules in women’s favour. More commonly, they addressed women, whom they urged to make the most of changes and opportunities that would have occurred anyway. Unintended effects followed: by competing with men, women gave up some traditional advantages – male deference and much informal power; by joining the workforce, they added another level of exploitation to their roles as homemakers and mothers, with consequent stress and overwork. Some women, who wanted to remain at home and dedicate themselves to their husbands and children, found themselves doubly disadvantaged: exploited by men and pilloried by ‘sisters’. Society still needs to strike the right balance: liberating all women to lead the lives they want, without having to conform to roles devised for them by intellectuals of either sex.

  Gropings toward Democracy

  Popular sovereignty, equality, universal rights, general will: the logical conclusion of the Enlightenment, in these respects, was democracy. Briefly, in 1793, revolutionary France had a democratic constitution, largely drafted by Condorcet. Universal male suffrage (Condorcet wanted to include women but yielded to colleagues’ alarms), frequent elections, and provision for plebiscites were the essential ingredients. But, in democracy, in Diderot’s words, ‘the people may be mad but is always master’.52 Madness and mastery make a daunting combination. Democracy without the rule of law is tyranny. Even before France’s revolutionary constitution came into effect, a coup brought Maximilien de Robespierre to power. In emergency conditions of war and terror, virtue – Robespierre’s word for brute force – supplied the decisive direction that reason could not provide. The constitution was suspended after barely four months. Terror doused the Enlightenment in blood. It took nearly a hundred years for most European elites to overcome the abhorrence the very name of democracy inspired. The fate of revolutionary democracy in France foreshadowed twentieth-century examples: the electoral successes of fascism, Nazism, communism, and charismatic embodiments of personality cults; the abuse of plebiscite and referendum; the miseries of living in ‘people’s democracies’.

  America, however, was relatively isolated from the horrors that extinguished the Enlightenment in Europe. The US Constitution was constructed along principles rather like those Condorcet followed, with a similar degree of indebtedness to the Enlightenment. With little violence, except in the slave states, progressive extensions of the franchise could and did genuinely turn the United States into a democracy. Eventually, most of the rest of the world came to follow this reassuring model, which seemed to show that the common man could take power without dealing it to dictators or dipping it in blood. The democracy we know today – representative government elected by universal or near-universal suffrage under the rule of law – was a US invention. Attempts to trace it from the ancient Greek system of the same name or from the French Revolution are delusively romantic. What made it peculiarly American is much disputed. Radical Protestantism, which throve more in the United States than in the old countries from which radicals fled, may have contributed traditions of communal decision making and the rejection of hierarchy;53 the frontier, where escapees from authority accumulated and communities had to regulate themselves, may have helped.54 Surely decisive was the fact that the Enlightenment, with its respect for popular sovereignty and folk wisdom, survived in the United States when it failed in Europe.

  While democracy matured in America, almost everyone in Europe decried it. Prudent thinkers hesitated to recommend a system Plato and Aristotle had condemned. Rousseau detested it. As soon as representatives are elected, he thought, ‘the people is enslaved … If there were a nation of gods, it would govern itself democratically. A government so perfect is not suited to men.’55 Edmund Burke – the voice of political morality in late-eighteenth-century England – called the system ‘the most shameless in the world’.56 Even Immanuel Kant, who was once an advocate, reneged on democracy in 1795, branding it as the despotism of the majority. The political history of Europe in the nineteenth century is a history of ‘mouldering edifices’ propped up and democracy deferred, in which elites felt the terror of the tumbril and the menace of the mob.

  In the US, by contrast, democracy ‘just growed’: it took European visitors to observe it, refine the idea, and recommend it convincingly to readers back home. By the time the sagacious French observer Alexis de Tocqueville went to America to research democracy in the 1830s, the United States had, for its time, an exemplary democratic franchise (except in Rhode Island, where property qualifications for voters were still fairly stringent) in the sense that almost all adult white males had the vote. Tocqueville was wise enough, however, to realize that democracy meant something deeper and subtler than a wide franchise: ‘a society which all, regarding the law as their work, would love and submit to without trouble’, where ‘a manly confidence and a sort of reciprocal condescension between the classes would be established, as far from haughtiness as from baseness’. Meanwhile, ‘the free association of citizens’ would ‘shelter the state from both tyranny and licence’. Democracy, he concluded, was inevitable. ‘The same democracy reigning in American societies’ was ‘advancing rapidly toward power in Europe’, and the obligation of the old ruling class was to adapt accordingly: ‘to instruct democracy, reanimate its beliefs, purify its mores, regulate its movements’ – in short, to tame it without destroying it.57

  America never perfectly exemplified the theory. Tocqueville was frank about the shortcomings, some of which are still evident: government’s high costs and low efficiency; the venality and ignorance of many public officials; inflated levels of political bombast; the tendency for conformism to counterbalance individualism; the menace of an intellectually feeble pantheism; the peril of the tyranny of the majority; the tension between crass materialism and religious enthusiasm; the threat from a rising, power-hungry plutocracy. James Bryce, Oxford’s Professor of Jurisprudence, reinforced Tocqueville’s message in the 1880s. He pointed out further defects, such as the way the system corrupted judges and sheriffs by making them bargain for votes, but he recommended the US model as both inevitable and desirable. The advantages of democracy outweighed the defects. They could be computed in dollars and cents, and measured in splendid monuments erected in newly transformed wildernesses. Achievements included the strength of civic spirit, the spread of respect for law, the prospect of material progress, and, above all, the liberation of effort and energy that results from equality of opportunity. In the last three decades of the nineteenth century and the first of the twentieth, constitutional reforms would edge most European states, along with Japan and some former European colonies, towards democracy on the representative lines the United States embodied.58

  Revolutionary disillusionment made it plain that liberty and equality were hard to combine. Equality impedes liberty. Liberty produces unequal results. Science, meanwhile, exposed another contradiction at the heart of the Enlightenment: freedom was at odds with the mechanistic view of the universe. While political thinkers released chaotic freedoms in societies and economies, scientists sought order in the universe and tried to decode the workings of the machine: a well-regulated system, in which, if one knew its principles, one could make accurate predictions and even control outcomes.

  Truth and Science

  Until the eighteenth century, most work in the sciences started with the assumption that external realities act on the mind, which registers data perceived through the senses. The limitations of this theory were obvious to thinkers in antiquity. We find it hard to challenge the evidence of our senses, because we have nothing else to go on. But sensations may be the only reality there is. Why suppose that something beyond them is responsible for activating them? Toward the end of the seventeenth century, John Locke dismissed such objections or, rather, refused to take them seriously. He helped found a tradition of British empiricism that asserts simply that what is sensed is real. He summed up his view: ‘No man’s knowledge here’ – by which Locke meant in the world we inhabit – ‘can go beyond his experience.’59

  For most people the tradition has developed into an attitude of commonsensical deference to evidence: instead of starting with the conviction of our own reality and doubting everything else, we should start from the assumption that the world exists. We then have a chance of making sense of it. Does empiricism imply that we can know nothing except by experience? Locke thought so; but it is possible to have a moderately empirical attitude, and hold that while experience is a sound test for knowledge, there may be facts beyond the reach of such a test. Locke’s way of formulating the theory dominated eighteenth-century thinking about how to distinguish truth from falsehood. It jostled and survived among competing notions in the nineteenth. In the twentieth, Locke’s philosophy enjoyed a new vogue, especially among Logical Positivists, whose school, founded in Vienna in the 1920s, demanded verification by sense-data for any claim deemed meaningful. The main influence, however, of the tradition Locke launched has been on science rather than philosophy: the eighteenth-century leap forward in science was boosted by the respect sense-data commanded. Scientists have generally favoured an empirical (in Locke’s sense) approach to knowledge ever since.60

  Science extended the reach of the senses to what had formerly been too remote or too occluded. Galileo spotted the moons of Jupiter through his telescope. Tracking the speed of sound, Marin Mersenne heard harmonics that no one had previously noticed. Robert Hooke sniffed ‘nitre-air’ in the acridity of vapour from a lighted wick, before Antoine Lavoisier proved the existence of oxygen and demonstrated its role in combustion. Antonie van Leeuwenhoek saw microbes through his microscope. Newton could wrest the rainbow from a shaft of light or detect the force that bound the cosmos in the weight of an apple. Luigi Galvani felt the thrill of electricity in his fingertips and made corpses twitch at the touch of current. Franz Anton Mesmer thought hypnotism was a kind of measurable ‘animal magnetism’. Through life-threatening demonstrations with kite and keys, Benjamin Franklin (1706–90) showed that lightning is a kind of electricity. Their triumphs made credible the cry of empiricist philosophers: ‘Nothing unsensed can be present to the mind!’

  Scientific commitment and practical common sense were part of the background of the so-called ‘Industrial Revolution’ – the movement to develop mechanical methods of production and mobilize energy from new sources of power. Although industrialization was not an idea, mechanization, in some sense, was. In part, its origins lie in the surprising human ability to imagine slight sources of power generating enormous force, just as threadlike sinews convey the strength of the body. Steam, the first such power-source in nature to be ‘harnessed’ to replace muscles, was a fairly obvious case: you can see it and feel its heat, even though it takes some imagination to believe that it can work machinery and impel locomotion. In the 1760s James Watt applied a discovery of ‘pure’ science – atmospheric pressure, which is invisible and undetectable except by experiment – to make steam power exploitable.61

  Germs were perhaps the most astonishing of the previously invisible agents that the new science uncovered. Germ theory was an idea equally serviceable to theology and science. It made the origins of life mysterious, but illuminated the causes of decay and disease. Life, if God didn’t devise it, must have arisen from spontaneous generation. At least, that was what everybody – as far as we know – thought for thousands of years, if they thought about it at all. For ancient Egyptians, life came out of the slime of the Nile’s first flood. The standard Mesopotamian narrative resembles the account favoured by many modern scientists: life took shape spontaneously in a swirling primeval soup of cloud and condensation mixed with a mineral element, salt. To envisage the ‘gods begotten by the waters’, Sumerian poets turned to the image of the teeming alluvial mud that flooded up from the Tigris and Euphrates: the language is sacral, the concept scientific. Even when theology challenged it, common sense continued to suggest that the mould and worms of putrescence generate spontaneously.

  When microbes became visible under the microscope of Antonie van Leeuwenhoek, it therefore hardly seemed worth asking where they came from. The microbial world, with its apparent evidence of spontaneous generation, cheered atheists. The very existence of God – or at least, the validity of claims about his unique power to create life – was at stake. Then, in 1799, with the aid of a powerful microscope, Lazzaro Spallanzani observed fission: cells reproduced by splitting. He demonstrated that if bacteria – or animalculi, to use the term favoured at the time, or germs, as he called them – were killed by heating, they could not reappear in a sealed environment. ‘It appeared’, as Louis Pasteur later put it, ‘that the ferments, properly so-called, are living beings, that the germs of microscopic organisms abound in the surface of all objects, in the air and in water; that the theory of spontaneous generation is chimerical.’62 Spallanzani concluded that living organisms did not appear from nowhere: they could only germinate in an environment where they were already present. No known case of spontaneous generation of life was left in the world.

  Science is still grappling with the consequences. As far as we know, the one-celled life forms called archaea were the first on our planet. The earliest evidence of them dates from at least half a billion years after the planet began. They were not always around. So where did they come from? Egyptian and Sumerian science postulated a chemical accident. Scrutineers are still looking for the evidence, but, so far, to no avail.

  Germ theory also had enormous practical consequences: almost at once, it transformed the food industry by suggesting a new way of preserving foods in sealed containers. In the longer term, it opened the way for the conquest of many diseases. Germs, it became clear, sickened bodies as well as corrupting food.63

  To some extent, the success of science encouraged mistrust of religion. The evidence of the senses was all true. Real objects caused it (with exceptions that concerned sound and colour and that experiments could confirm). Jangling, for instance, is proof of the bell, as heat is of the proximity of fire, or stench of the presence of gas. From Locke, eighteenth-century radicals inherited the conviction that it was ‘fiddling’ to waste time thinking about what, if anything, lay beyond the scientifically observed world. But this attitude, which we would now call scientism, did not satisfy all its practitioners. The Scottish philosopher David Hume, born a few years after Locke’s death, agreed that it is nonsense to speak of the reality of anything imperceptible, but pointed out that sensations are not really evidence of anything except themselves – that objects cause them is an unverifiable assumption. In the 1730s the renowned London preacher, Isaac Watts, adapted Locke’s work for religious readers, exalting wordless ‘judgement’ – detectable in feeling yet inexpressible in thought – alongside material perceptions. Towards the end of the century, Kant inferred that the structure of the mind, rather than any reality outside it, determines the only world we can know. Meanwhile, many scientists, like Maupertuis, drifted back from atheism toward religion, or became more interested in speculation about truths beyond the reach of science. Spallanzani’s work restored to God a place in launching life. Churches, moreover, knew how to defeat unbelievers. Censorship did not work. But appeals, over the intellectuals’ heads, to ordinary people did. Despite the hostility of the Enlightenment, the eighteenth century was a time of tremendous religious revival in the West.

  Religious and Romantic Reactions

  Christianity reached a new public. In 1722, Nikolaus Ludwig, Count von Zinzendorf, experienced an unusual sense of vocation. On his estate in eastern Germany he built the village of Herrnhut (‘the Lord’s keeping’) as a refuge where persecuted Christians could share a sense of the love of God. It became a centre from which radiated evangelical fervour – or ‘enthusiasm’, as they called it. Zinzendorf’s was only one of innumerable movements in the eighteenth century to offer ordinary people an affective, unintellectual solution to the problems of life: proof that, in their way, feelings are stronger than reason and that for some people religion is more satisfying than science. As one of the great inspirers of Christian revivalism, Jonathan Edwards of Massachusetts, said, ‘Our people do not so much need … heads stored, as … hearts touched.’ His congregations purged their emotions in ways intellectuals found repellent. ‘There was a great moaning’, observed a witness to one of Edwards’s sermons, ‘so that the minister was obliged to desist – the shrieks and cries were piercing and amazing.’64

  Preaching was the information technology of all these movements. In 1738, with a ‘heart strangely warmed’, John Wesley launched a mission to workers in England and Wales. He travelled eight thousand miles a year and preached to congregations of thousands at a time in the open air. He communicated a mood rather than a message – a sense of how Jesus can change lives by imparting love. Isaac Watts’s friend George Whitefield held meetings in Britain’s American colonies, where ‘many wept enthusiastically like persons that were hungering and thirsting after righteousness’ and made Boston seem ‘the gate of heaven’.65 Catholic evangelism adopted similar means to target the same enemies: materialism, rationalism, apathy, and formalized religion. Among the poor in Naples Alfonso Maria de Liguori seemed like a biblical prophet. In 1765 the pope authorized devotion to the Sacred Heart of Jesus – a bleeding symbol of divine love. Cynically, some European monarchs collaborated with religious revival as a means to distract people from politics and to exploit churches as agents of social control. King Frederick the Great of Prussia, a freethinker who liked to have philosophers at his dinner table, favoured religion for his people and his troops. In founding hundreds of military chaplaincies and requiring religious teaching in schools, he was applying the recommendation of his sometime friend, Voltaire: ‘If God did not exist, it would be necessary to invent him.’ Voltaire was more concerned that God should constrain kings than commoners, but to ‘leave hopes and fears intact’ was, he realized, the best formula for social peace.66

 
