Out of Our Minds

by Felipe Fernandez-Armesto


  Bergson insisted on further insights that helped shape mainstream thinking for much of the twentieth century. He observed, for instance, that reality and experience are identical. ‘There is change’, he said, ‘but there are no “things” that change’, just as a melody is independent of the strings that play it or the stave on which it is written. Change exists, but only because we experience it. And experience, Bergson argued, in common with most philosophers and in defiance of materialists, is a mental process. Our senses transmit it; our brains register it; but it happens elsewhere in a transcendent part of the self that we call ‘mind’. Even more significantly for the future, Bergson prepared the way for Einstein and made his paths straight. It is hard to imagine a theory as subversive as that of relativity penetrating undisarmed minds. Bergson accustomed his readers to the idea that time might not be the absolute, external reality scientists and philosophers had formerly supposed. It might be ‘all in the mind’. In a world jarred by Bergson’s thinking, Einstein’s idea that time could change with the speed of the observer was only a little more shocking. Bergson also anticipated many of the inessential tics of Einstein’s thinking, down to his fondness for analogies with trains. Explaining duration, for instance, he pointed out that we tend ‘to think of change as a series of states that succeed each other’, like travellers in a train who think they have stopped, because another train is passing at the same speed in the opposite direction. A false perception seems to arrest a continuous process.

  The French mathematician Henri Poincaré, Bergson’s ally in the exposure of chaos, supplied the link that led to Einstein. Poincaré shook the underpinnings of the Newtonian cosmos when, in the late 1890s, he sketched the beginnings of a new scientific paradigm. He was working on one of the problems modern science had been unable to solve: how to model the motions of more than two interdependent celestial bodies. His solution exposed the inadequacy of Newtonian assumptions. He proposed a double wave bending infinitely back on itself and intersecting infinitely. He prefigured the way science came to represent the cosmos more than half a century after his time, in the 1960s and 1970s, when, as we shall see, chaos theory and work on fractals recalled Poincaré’s discovery with stunning immediacy. Poincaré nudged science towards complex, recursive, and chaotic pictures of how nature works.
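
  Poincaré’s point about unpredictability can be illustrated with a toy model, though the example is mine, not his: the logistic map, a one-line recurrence that modern chaos theory uses to demonstrate sensitive dependence on initial conditions. In the Python sketch below (all starting values and parameters are assumed for illustration), two trajectories that begin a billionth apart soon disagree completely, even though every step is perfectly deterministic. That is the predicament Poincaré met in the three-body problem: determinism without predictability.

```python
# Illustrative sketch only: the logistic map, a standard toy model of chaos,
# not Poincare's celestial mechanics. Starting points and the parameter r
# are assumed for demonstration.

def logistic(x, r=4.0):
    """One step of the logistic map: x -> r * x * (1 - x)."""
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9  # two almost identical initial conditions
for step in range(1, 61):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a = {a:.6f}  b = {b:.6f}  gap = {abs(a - b):.2e}")
```

  By around step forty the gap is of order one: the two futures have nothing in common, and no refinement of measurement can postpone the divergence forever.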

  He went on to question the basic assumption of scientific method: the link between hypothesis and evidence. He pointed out that scientists have their own agendas. Any number of hypotheses could fit experimental results. Scientists chose among them by convention, or even according to ‘the idiosyncrasies of the individual’.8 Poincaré cited Newton’s laws among his examples, along with the traditional notions of space and time. To impugn Newton was shocking enough. To call out space and time was even more perplexing, because they had pretty much always been accepted as part of the fixtures and fittings of the universe. For St Augustine, constant time was the framework of creation. Newton assumed that the same chronometers and yardsticks could measure time and space all over the universe. When Kant developed his theory of intuition at the end of the eighteenth century, his key examples of what we know to be true, independently of reason, were the absolute nature of time and space. Like a heretic picking apart a creed, Poincaré provided reasons for doubting everything formerly regarded as demonstrable. He likened Newton to ‘an embarrassed theologian … chained’ to contradictory propositions.9

  He became an international celebrity, widely sought, widely reported. His books sold in scores of thousands. He frequented popular stages, like a modern-day tele-expert haunting the chat shows. As usual when a subtle thinker becomes a public darling, audiences seemed to hear more than he said. Unsurprisingly, he claimed to be misunderstood. Readers misinterpreted Poincaré to mean – to quote his own disavowal – that ‘scientific fact was created by the scientist’ and that ‘science consists only of conventions … Science therefore can teach us nothing of the truth; it can only serve us as a rule of action.’10 Science, reverberating in the eardrums of Poincaré’s audiences, seemed to yield insights no more verifiable than, say, poetry or myth. But the history of science is full of fruitful misunderstandings: Poincaré was important for how people read him, not for what he failed to communicate. Between them, Bergson and Poincaré startled the world into uncertainty and softened resistance to radical responses. One of the beneficiaries of the new mood was Albert Einstein.

  Poincaré published his critique of traditional scientific thinking in 1902. Three years later, Einstein emerged from the obscurity of his dead-end job, like a burrower from a mine, to detonate a terrible charge. He worked as a second-class technical officer in the Swiss Patent Office. Donnish jealousy had excluded him from an academic career. This was perhaps just as well. Einstein owed no debts to sycophancy and felt no obligation to defend established professors’ mistakes. Independence from academic constraints freed him to be original. In the world Bergson and Poincaré created he was assured of an audience.

  The theory of relativity changed the world by changing the way we picture it. In the 1890s, experiments had detected perplexing anomalies in the behaviour of light: measured against moving objects, the speed of light never seemed to vary, no matter how fast or slow the motion of the source from which it was beamed. Most interpreters blamed rogue results. If you release a missile, its speed increases with the force of propulsion; so how could light evade the same variability? Einstein produced a theoretical solution: if the speed of light is constant, he inferred, time and distance must be relative to it. At speeds approaching that of light, time slows, distances shorten. The inference was logical, but it was so counterintuitive and so different from what almost everyone formerly thought that it might have been dodged or dismissed had Poincaré not opened minds to the possibility of thinking about space and time in fresh ways. Even so, Einstein’s claim was hugely defiant and its success hugely disturbing. He exposed as assumptions what had previously seemed unquestionable truths: the assumption that space and time are absolute had prevailed only because, compared with light, we never go very fast. Einstein’s most graphic example was a paradox he extemporized in reply to a question from the floor at one of his public lectures: a twin who left on an ultra-fast journey would return home younger than the one who stayed behind.
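
  The twin paradox can be made quantitative with the time-dilation formula of special relativity. The formula is standard; the figures below are assumed for illustration and ignore the accelerations at turn-around.

```latex
% Time dilation: t is the time elapsed for the stay-at-home twin,
% t' the time elapsed for the traveller moving at speed v;
% c is the speed of light.
\[
  t' = t \sqrt{1 - \frac{v^2}{c^2}}
\]
% Assumed example: a ten-year round trip (Earth time) at v = 0.8c gives
\[
  t' = 10 \sqrt{1 - 0.8^2} = 10 \times 0.6 = 6 \text{ years,}
\]
% so the travelling twin returns four years younger.
```

  At everyday speeds the square root is indistinguishable from 1, which is why the assumption of absolute time could prevail unchallenged for so long.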

  In Einstein’s universe every appearance deceived. Mass and energy were mutually convertible. Parallel lines met. Notions of order that had prevailed since Newton turned out to be misleading. Commonsense perceptions vanished as if down a rabbit hole to Wonderland. Yet every experiment Einstein’s theory inspired seemed to confirm its validity. According to C. P. Snow, who did as much as anyone to make cutting-edge science universally intelligible, ‘Einstein … sprang into public consciousness … as the symbol of science, the master of the twentieth-century intellect … the spokesman of hope.’11 He transformed the way people perceived reality and measured the universe. For good and ill, he made possible practical research in the conversion of mass into energy. Nuclear power was among the long-term results.12
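
  The mutual convertibility of mass and energy is the content of Einstein’s most famous equation; the worked figure below is an illustration of mine, not an example from the text.

```latex
% Mass-energy equivalence.
\[
  E = mc^2
\]
% Assumed example: one gram of matter converted entirely into energy,
% with c = 3 x 10^8 m/s:
\[
  E = (10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^2
    = 9 \times 10^{13}\,\mathrm{J},
\]
% roughly the yield of a twenty-kiloton nuclear explosion, which is why
% the equation pointed the way to nuclear power.
```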

  Relativity helped, moreover, to unlock new paradoxes. While Einstein reimagined the big picture of the cosmos, other scientists worked on the minutiae that make it up. In work published in 1911, Ernest Rutherford dissected the atom, revealing even tinier particles and exhibiting their dynamism, which earlier atomic explorers had barely suspected: the electrons that seem to slide erratically around a nucleus in patterns impossible to track or predict with the physics of the past. Physicists were already struggling to cope with the apparently dual nature of light: did it consist of waves or particles? The only way to comprehend all the evidence was to admit that it behaved as if it were both. The new discourse of ‘quantum mechanics’ dispelled old notions of coherence. The Danish Nobel Prize winner Niels Bohr described quanta as sharing the apparently self-contradictory nature of light.
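
  The duality the physicists faced has a compact formal expression in the Planck–Einstein relation, which the passage alludes to without stating; the relation itself is standard textbook physics.

```latex
% Planck-Einstein relation: light of frequency nu (a wave property)
% arrives in discrete quanta, each carrying energy E (a particle property).
\[
  E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\,\mathrm{J\,s}
\]
% A single relation binds the wave picture to the particle picture,
% the formally self-contradictory behaviour Bohr described.
```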

  From Relativity to Relativism

  While relativity warped the world-picture, philosophical malaise eroded confidence in the traditional framework of every kind of thought: notions about language, reality, and the links between them. The drift to relativism began, however, with a self-subverting doctrine in the service of certainty: pragmatism.

  In everyday language, ‘pragmatism’ just means a practical approach to life. In late-nineteenth-century America, William James elevated practical efficiency to be a criterion not just of usefulness, but of morality and truth. Along with Bergson and Poincaré, he became one of the most widely read intellectuals of the first decade of the twentieth century. From an enterprising grandfather James’s father inherited more wealth than was good for him, dabbling in mysticism and socialism, and taking long naps in London at the Athenaeum, in an armchair of smooth green leather, next to Herbert Spencer’s. Like his father, James was a contemplative and, like his grandfather, a capitalist. He felt guilty when he was not earning his own living. He wanted a distinctively American philosophy, reflecting the values of business and hustle and bustle. The Anglophilia for which his novelist brother, Henry, was famous – or notorious – bothered William. He recommended patriotism, resisted Henry’s attempts to Europeanize him, and always skedaddled back thankfully to ‘my own country’.

  He was a polymath who could not stick to any vocation. He qualified as a physician but recoiled from practice, succumbing to his own sickliness and denouncing medicine as quackery. He achieved renown as a psychologist, while struggling against self-diagnosed symptoms of insanity. He tried painting but bad eyesight obliged him to give it up. He was a workaholic who knew he could only be saved by rest. He advocated ‘tough-minded’ philosophy but flirted with Christian Science, engaged in psychical research, wrote rhapsodical prose, and indulged in spasms of mysticism. He extolled reason and paraded sentiment, but preferred fact. On the rebound from the sublime and ineffable, he turned to the grimy world of Mr Gradgrind, ‘to fact, nothing more’. Pragmatism, which incorporated a lot of his prejudices, including Americanism, practicality, vague religiosity, and deference to facts, was as close as he came to a consistent view of the world. In his bestseller of 1907 he developed and popularized ‘old ways of thinking’ first formulated in the 1870s by Charles Sanders Peirce: philosophy should be useful. Usefulness, said James in effect, makes truth true and rightness right. ‘A pragmatist turns … towards concreteness and adequacy, towards facts, towards action, and towards power.’13 Bergson hailed him for discovering ‘the philosophy of the future’.14

  James never wanted to be a radical. He was looking for reasons to believe in God, arguing that ‘if the hypothesis of God works satisfactorily in the widest sense of the word, it is true’.15 But what works for one individual or group may be useless to others. By reducing truth to conformity with a particular purpose, James abjured what, until then, had been the agreed basis of all knowledge: the assumption that truth and reality match. He set out to vindicate Christianity; in the end he subverted it by relativizing truth.16

  Almost in secret, at first, without publicity, even without publication, linguistics took a similar route away from solid ground onto intellectual quicksand. For those of us who want to tell the truth, language is our attempt to refer to reality. Developments in twentieth-century linguistics, however, seemed to suggest, at least for a while, that the attempt is doomed. In lectures he began in Geneva in January 1907, the year James published Pragmatism, Ferdinand de Saussure shoved linguistics in a new direction. He introduced the distinction between parole – the individual utterances speakers address to one another – and langue: the underlying social system of language that makes those utterances intelligible. His character affected the way he communicated. He lectured like Aristotle, notelessly, with an engaging air of spontaneity. His students’ notes are the only surviving record of what he said, leaving experts room to bicker over their accuracy. Generally, his audience understood him to claim that the effect of language arises from the relationships of each term in a text or speech with all other terms. Particular terms have no significance except in combination with each other. What gives language sense is the structure of these relationships, which extends beyond any particular text into the rest of language. Meaning therefore is beyond authorial control. It is never complete, because language is always changing and relationships between terms are always re-forming. Meaning is constructed by culture, not rooted in reality. Readers are autonomous and can re-forge and distort text as they process it between page and memory. It took a long time for Saussure’s thinking to get beyond the classroom into print and pedagogy, but it gradually became linguistic orthodoxy. Most readers reached his work through a series of editorial reconstructions – the scholarly equivalent of Chinese whispers. As we shall see in the next chapter, the message they usually got was that language does not say anything reliable about reality, or about anything except itself.17

  Put this reading of Saussure together with popular interpretations of Poincaré, Bergson, William James, Einstein, and quantum mechanics: there is no fixed space or time; you cannot rely on scientific claims; the basic matter of the universe behaves in unpredictable and inexplicable fashion; truth is relative; and language is divorced from reality. While certainty unravelled, relativism and relativity entwined.

  Science and philosophy between them undermined inherited orthodoxy. Anthropology and psychology, meanwhile, produced equally devastating heresies. The revolution in anthropology spread gradually from America, where Franz Boas started it. This undersung hero of the Western liberal tradition was a German Jew who became the doyen and presiding spirit of anthropology in America. He overturned an assumption in which scientists invested belief, empires effort, and financiers cash: the superior evolutionary status of some peoples and some societies. Like Darwin, he learned from people Westerners dismissed as primitive. But whereas the Fuegians disgusted Darwin, the Inuit inspired Boas. Working among them on Baffin Island in the 1880s, he came to appreciate their practical wisdom and creative imaginations. He turned his insight into a precept for fieldworkers, which also works well as a rule of life: empathy is the heart of understanding. In order to see the intriguing peculiarities of different cultures, anthropologists must strive to share the outlook of the people among whom they are embedded. Determinism of every kind then becomes unappealing and risky generalizations unconvincing, because no single explanation seems adequate to account for the observed divergences.

  Boas was a fieldworker who became a museum curator, always in touch with the people and artefacts he sought to understand. He sent pupils to study Native American peoples along the railway lines that stretched westwards from his classroom in New York. The results proved that there was no such thing as what earlier and contemporary anthropologists called ‘the savage mind’. We all share the same kind of mental equipment, irrespective of the material conditions, technological prowess, social complexity, or sophistication that surround us. Jared Diamond has a neat way of putting it: ‘there are as many geniuses in New Guinea as New York’.18 Boas exposed the fallacies of racist craniology – which alleged that some races had skulls better adapted for intelligence than others. From the biggest, fastest-growing, and most influential national school of anthropologists in the world, he outlawed the notion that peoples could be ranked according to how ‘developed’ their thinking supposedly was. People, he concluded, think differently in different cultures not because some have better brainpower, but because every mind reflects the traditions it inherits, the society that surrounds it, and the environment to which it is exposed. In lectures he gave in 1911, Boas summarized the findings of the research he conducted or supervised:

  The mental attitude of individuals who … develop the beliefs of a tribe is exactly that of the civilized philosopher … The value which we attribute to our own civilization is due to the fact that we participate in this civilization, and that it has been controlling all our actions since the time of our birth; but it is certainly conceivable that there may be other civilizations, based perhaps on different traditions and on a different equilibrium of emotion and reason, which are of no less value than ours, although it may be impossible for us to appreciate their values without having grown up under their influence … The general theory of valuation of human activities, as developed by anthropological research, teaches us a higher tolerance than the one we now profess.19

  That was putting it mildly. It became impossible to make the traditional case for racism or imperialism – that race condemned some people to inescapable inferiority or that empires were custodianships like those of parents over children or guardians over imbeciles. Conversely, Boas made it possible to re-evaluate relationships between cultures. Cultural relativism, as we now call it, became the only credible basis on which a serious study of human societies could be pursued. Some cultures may be better than others, but such a judgement can only be made when the compared cultures share similar values. This is a rarely fulfilled condition. Every culture, says cultural relativism, has to be judged on its own terms.

  Anthropological fieldwork reinforced the relativistic tendency by piling up enormous quantities of diverse data, intractable to the crudely hierarchical schemes of the nineteenth century, but cultural relativism took a while to spread beyond the circles Boas directly influenced. British anthropologists were the first foreigners to absorb his lessons, as early as the first decade of the century. France, where anthropologists commanded the greatest worldwide prestige, soon began to respond positively, and relativism radiated from there. It helped to undermine empires and build multicultural societies, but it threw up intellectual and practical problems that remain unsolved. If no culture is objectively better than another, what happens when their understandings of morality conflict? Can cannibalism, infanticide, widow burning, gender discrimination, headhunting, incest, abortion, female circumcision, and arranged marriage all shelter under the rubric of cultural relativism? How and where does one draw the line?20

 
