The Age of Absurdity: Why Modern Life Makes it Hard to Be Happy (2010)
The problem with religions is that the inspirational founders become an embarrassment to the small-minded followers who turn ideas into dogma, principles into regulations and initiatives into ritual. The founders reject kin worship; the followers revere family. The founders go forth; the followers remain at home. The founders are tormented by doubt; the followers bask in certainty. The founders seek authority; the followers seek power. The founders attract and convince; the followers confront and coerce.
Frequently the followers are so successful at distortion that their message becomes exactly the opposite of the original. In the Irish Catholic culture I knew as a boy, the faithful – both clerical and lay – violated the principles of the New Testament so comprehensively and precisely that it almost seemed as though they had read it.
A perfect example was my mother’s repeated and vehement command always to go up to the front at Mass. Respectable people sat at the front, the lower orders in the rows behind – and only the worst kind of corner boy stood at the back. What would she have said if I had shown her Matthew 23:6, with Christ’s denunciation of the Pharisees for loving ‘the chief seats in the synagogues’? She would have become even angrier at this further example of smart-alec cheek. Slave all your life to give your children a higher station in life and what thanks do you get for it? Nothing but smart remarks.
The concept of the quest permeates all culture, religious and secular, early and late, low and high. Many of the greatest literary works of the twentieth century were quest stories. James Joyce’s Ulysses is the story of one ordinary hero’s going forth, trials and tribulations and return home to rebirth. Proust’s huge novel À la Recherche du Temps Perdu is the story of a lifelong quest for meaning, in which the meaning is eventually revealed to be the writing of the story. And Kafka gave the quest saga a modern twist by making the quest always futile and the prize always out of reach: ‘There is a destination but no way there.’120 So K., despite all his efforts, never manages to get into the Castle. Yet he never gives up – and neither do any of Kafka’s other frustrated seekers. In ‘Before the Law’ – a mere page and a half – a man comes from the country seeking admittance to the Law but is barred by a boorish Doorkeeper. The man tries various stratagems – wheedling, bribing, seeking intimacy – but none succeeds. The Doorkeeper remains adamant. Years pass and eventually the seeker, realizing that he is dying, puts one final question: ‘Everyone strives to reach the Law, so how does it happen that for all these many years no one but myself has ever begged for admittance?’ The Doorkeeper recognizes that the man has reached his end and, to let his failing senses catch the words, roars in his ear: ‘No one else could ever be admitted here, since this gate was made only for you. I am now going to shut it.’121
In fact, many versions of the quest acknowledge that the striving is endless – though not futile. The twelfth-century Sufi poem The Conference of the Birds, by Farid Ud-Din Attar, is like a parable by Kafka. The birds of the world meet for a conference, which turns fractious. A hoopoe rises and, quelling the multitude with natural authority, suggests that what the birds lack is a spiritual leader, a Simorgh, to show them an alternative to aggressive craving. They must all fly off in search of this Simorgh. But many birds are deterred by the prospect of a long and arduous quest. The hawks prefer the power of worldly princes, the herons their desolate shoreline, the ducks their cosy pond. The finches fear for their frailty, the nightingales for their song. But eventually a group sets off, traversing seven valleys – the Valley of the Quest, the Valley of Love, the Valley of Insight into Mystery, the Valley of Detachment, the Valley of Unity, the Valley of Bewilderment and the Valley of Poverty and Nothingness. In each valley they endure dangers, vicissitudes and temptations, and are told stories of exemplary characters. These include Jesus, who says, ‘The man who lives and does not strive is lost,’ and Socrates, who replies to the disciples enquiring where to bury him, ‘If you can find me you are certainly clever, for I never found myself.’122
When the birds finally arrive at the court of the Simorgh, only thirty remain, and they are ageing, exhausted, bedraggled and soiled. A haughty palace herald flies out and, contemptuous of their shabby appearance, tells them they are unworthy and must return whence they came. But the birds demand entry and are finally admitted. The palace is indeed glorious – but empty, save for mirrors. Around and around they fly, frustrated and heartsick – to have come so far and endured so much for no reward. But, bit by bit, a strange feeling of joy steals over the birds. Suddenly they realize the significance of the mirrors. They have found the Simorgh after all. They are looking at the Simorgh in the mirrors. Because they are the Simorgh (which in Persian also means thirty birds). The Simorgh is them.
This expresses a profound truth – that the search for meaning is itself the meaning, the Way is the destination, the quest is the grail.
Many others have discovered this over the centuries and expressed it in a variety of ways.123 One of the most memorable formulations was by C.P. Cavafy in his poem ‘Ithaca’.
Always keep Ithaca fixed in your mind.
To arrive there is what you are destined for.
But don’t hurry the journey at all.
Better if it lasts for many years,
So you’re old by the time you reach the isle,
Wealthy with all you have gained on the way
And not expecting Ithaca to make you rich.
Ithaca gave you the beautiful journey.
Without her you’d never have set out.
But she has nothing more to give you now.
And if you find her poor, Ithaca won’t have deceived you.
Wise as you have become, after so much experience,
You’ll have understood by then what these Ithacas mean.124
PART III
The Strategies
6
The Undermining of Responsibility
A student fails to submit a project on time and then misses an appointment with his supervisor to discuss the problem. The university sends the student a letter informing him that he has been given a mark of zero for the project. Now the student not only comes to see the supervisor but barges into his office without an appointment.
‘This project must be accepted late,’ he demands.
‘Why is that?’
‘Because I’m suffering from TCD.’
‘Which is?’
‘Time-Constraint Disorder – a chemical imbalance in the brain that means I can’t meet deadlines or turn up in time for appointments.’
I invented TCD as a joke, forgetting that it is impossible to satirize the contemporary world, and then discovered that a Professor Joseph Ferrari of DePaul University genuinely wants procrastination recognized as a clinical disorder125 and included in the standard reference work for mental-health professionals, the Diagnostic and Statistical Manual of Mental Disorders (DSM). This tome has already been through four editions, accumulating new disorders in each, with 297 defined in DSM-IV – and many more due in DSM-V. Consider, for example, Antisocial Personality Disorder (APD), which is defined as a ‘pervasive pattern of disregard for and violation of the rights of others that begins in childhood or early adolescence and continues into adulthood’ – in other words, the vice formerly known as selfishness. So the key to indulging a vice is to redefine it as a Disorder and give it a resonant acronym. ‘It’s a condition,’ you then announce with aggressive outrage if your behaviour is challenged, ‘a Disorder.’ Those who spend too much time online will be glad to know that surfing the web has just been identified as a clinical disorder by Dr Jerald Block of the Oregon Health and Science University: ‘Internet addiction appears to be a common disorder that merits inclusion in DSM-V.’126
My own candidate for inclusion in DSM-V is Disorder Addiction Disorder (DAD), an uncontrollable compulsion to classify all undesirable human behaviour as Disorders.
These new ‘Disorders’ are of course welcomed by Big Pharma because sufferers can be encouraged to buy drugs. But, in a classic example of cultural-conditioning feedback, the pharmaceutical companies also create their own Disorders by redefining previously normal states (a practice known as ‘condition branding’). So Social Anxiety Disorder, the attribute previously known as shyness, is now a ‘condition’ requiring GlaxoSmithKline’s drug Paxil or Pfizer’s Zoloft. Paxil and Zoloft were just two more anti-depressants until their manufacturers launched major campaigns to promote them as cures for Social Anxiety Disorder. Sales immediately soared. A major company may well seize on Time-Constraint Disorder (TCD) and promote one of its poor-selling products as a miracle drug that activates the urgency centres of the brain.
But the Disorder phenomenon is only one consequence of a contemporary desire to evade personal responsibility. No one is prepared to accept blame any more. Instead, everyone wants to be a victim – and frequently succeeds, even in the most unpromising circumstances. When Newham Council in East London pursued Z-Un Noon for non-payment of a series of parking fines, Noon was so outraged that he took the council to court for causing him ‘emotional distress’.127 Better still, he won his case and was awarded £5,000 for the distress caused by each of the four tickets, making a total of £20,000. And when the incredulous council ignored this ruling, bailiffs turned up at the council offices with a ‘notice of seizure’ and began to disconnect and take away computers. Faced with the prospect of total paralysis, the council paid up.
When was the last time anyone said, ‘It’s my fault’? Already it seems like centuries since Sartre declared, ‘Man is fully responsible for his nature and his choices.’128 Now the opposite is true. Man is responsible neither for his nature nor for his choices.
How has this come about? The concept of personal responsibility – that we can and should decide our own destinies – is at the heart of modern society and considered axiomatic by most of its citizens. Yet this concept is now being steadily undermined, from both above and below, from both high and low culture – from scientists, philosophers and writers denying free will and from the age of entitlement denying obligation. In science there is the Holy Trinity of Determinism – genetics (behaviour is determined by genes); evolutionary psychology (behaviour is determined by evolved survival mechanisms); and neuroscience (behaviour is determined by the modules of a hard-wired brain). Of course, many scientists have expressed reservations and qualifications, but the subtleties tend to be in the small print – it is easier to remember headlines announcing the discovery of genes for depression, obesity, criminality, homosexuality and, the latest, anxiety129 and male infidelity.130
And here is a contemporary philosopher with no reservations – John Gray, until recently Professor of European Thought at the London School of Economics: ‘There are many reasons for rejecting the idea of free will, some of them decisive. If our actions are caused then we cannot act otherwise than we do. In that case we cannot be responsible for them. We can be free agents only if we are authors of our acts; but we are ourselves products of chance and necessity. We cannot choose to be what we are born. In that case, we cannot be responsible for what we do.’131
Gray also attacks the idea of progress, rejects as illusory the concepts of morality, justice and truth, dismisses any possibility of dealing with the world’s problems and asserts that the world is inexorably bound for tyranny, anarchy, famine, pestilence and the eventual extinction of the human race. This is a contemporary version of the old ‘original sin’ concept in its extreme Manichean form. The human creature is fatally flawed and the world is rushing to inevitable ruin. All that has changed is the nature of the flaw: once it was implanted by God as a punishment; now it is the animal nature inherited from our ancestors. The program in the genes is the new original sin.
This determinism is attractive to many at each end of the social scale. For an authoritarian elite it justifies firm control of the essentially evil human brute, and for the individual it justifies self-indulgence as inevitable in a fallen creature. Both are absolved of obligation. Attempts to improve either social conditions or personal behaviour would be equally futile.
But has anyone ever argued that behaving well is determined? Has anyone ever protested: ‘Hey, it’s my nature, I just can’t help being good’? Determinism is invoked only to excuse behaving badly. No, on second thoughts, I recall reading somewhere about a criminal using genetic determinism as a legal defence. The wise old judge nodded affably: ‘I can quite accept that you are genetically determined to break the law. The problem is that I am genetically determined to uphold it.’ And he smiled apologetically. ‘So I have no choice but to impose on you the maximum sentence.’
John Gray bases his rejection of personal responsibility on the theory that action is unconscious, citing the work of neuroscientist Benjamin Libet, who claimed to have discovered that action takes place half a second before the brain makes a conscious decision to act. It is certainly true that much, perhaps even most, of what we do involves no conscious thought. This may even be true of decision-making, where conscious control is assumed to be essential. For several years I taught a course called Decision Theory, which explained various mathematical techniques for weighing the effects of a complex set of factors on an outcome. But gradually there grew the suspicion, which I did not reveal to students or colleagues, that this was merely superstitious mumbo-jumbo, another example of physics envy. And, finally, I slipped into the heresy that not only did no manager ever use these techniques, but the business of decision-making was barely rational at all. This was confirmed by a rare experience of decision-making in practice. As a teacher of database theory I was co-opted on to a team with responsibility for choosing a new Database Management System, which would be used in all database teaching and for the university’s own information systems. There were three major database contenders and the team went to each corporation, sat through lengthy demonstrations and asked probing questions. But, in the end, without anyone publicly admitting it, we were exhausted by technical detail and opted for the presenters we liked best. In fact, their database became the market leader whereas the other two died. Hence, a useful technique – assess the vendor, not the product.
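For the curious, here is a minimal sketch of the kind of weighted-scoring technique such courses teach – a weighted-additive decision matrix. The criteria, weights and vendor scores below are hypothetical, invented purely for illustration:

    # A weighted-additive decision matrix: score each option on each
    # criterion, multiply by the criterion's weight, and sum.
    # All names and numbers here are hypothetical illustrations.
    weights = {"performance": 0.4, "cost": 0.3, "support": 0.3}

    vendors = {
        "Vendor A": {"performance": 8, "cost": 5, "support": 9},
        "Vendor B": {"performance": 7, "cost": 8, "support": 6},
        "Vendor C": {"performance": 9, "cost": 4, "support": 5},
    }

    # The highest weighted total is, in theory, the rational choice.
    for name, scores in vendors.items():
        total = sum(weights[c] * scores[c] for c in weights)
        print(f"{name}: {total:.2f}")

The arithmetic duly produces a winner – but, as the episode above suggests, the weights themselves are guesses, so in practice the numbers tend merely to ratify a choice already made on feeling.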
This emotional basis for decision-making has also been demonstrated by Antonio Damasio, who discovered in the 1990s that some brain-damaged patients could no longer feel emotion, though their intelligence and ability to apply reason and logic were unimpaired.132 Delivered from the maelstrom of emotion, these people should have been able to make lucid, rational decisions based on a logical analysis of choices. In fact, it was just the opposite. They were unable to make any decisions, even the most simple. They could analyse the pros and cons of each possibility but, without feeling, they were unable to choose one over the others. So intuition or ‘gut feeling’ is not merely a part of the process but an essential feature of it.
Building on Damasio’s discovery, Joseph LeDoux proposed that the brain has two routes to decision, the ‘low road’ and the ‘high road’.133 The low road involves no conscious reasoning or awareness and processes sensory data in the amygdala, the brain’s emotion centre. This route to action is instantaneous, overwhelmingly powerful and immensely difficult to control – and it is the route supporting the Gray/Libet theory. But there is also a high road to action via the prefrontal cortex, the centre for analysing, planning and conscious decision-making. This centre is connected directly to the amygdala so there is always an emotional input to the reasoning, as Damasio realized. But, according to LeDoux, the prefrontal cortex can – and frequently does – override the amygdala’s primitive desires and drives. And awareness of the emotional brain increases the power of the prefrontal cortex.
Damasio makes the same point: ‘We can be wise to the fact that our brain still carries the machinery to react in the way it did in a very different context ages ago. And we can learn to disregard such reactions and persuade others to do the same.’134 Unusually for a scientist, Damasio has a thrillingly specific suggestion for the exercise of free will: ‘We humans…can wilfully strive to control our emotions. We can decide which objects and situations we allow in our environment and on which objects and situations we lavish time and attention. We can, for example, decide not to watch commercial television, and advocate its eternal banishment from the households of intelligent citizens.’135
So the neuroscientific view of human behaviour is entirely consistent with the Buddha/Spinoza/Freud model of the self and Sartre’s insistence on personal responsibility and choice.
Neuroscience even rescues us from the tyranny of the genes. Matt Ridley, the genetics writer, states: ‘By far the most important discovery of recent years in brain science is that genes are at the mercy of actions as well as vice versa…They are cogs responding to experience as mediated through the senses. Their promoters are designed to be switched off and on by events.’136
Ridley’s conclusion is magnificently unequivocal: ‘Free will is entirely compatible with a brain exquisitely prespecified by, and run by, genes.’ And, en route, he dispels the idea of genetics as an evil science endorsing selfishness, ruthlessness and brute strength. For instance, there is a question that may have vexed visitors to the zoo: if chimpanzees are only a fraction of the size of gorillas, how come their testicles are sixteen times as large? Since male gorillas have harems, which they need to protect, they have developed impressive size and fearsome appearance – but they do not need prodigious equipment to fertilize since they have no competition. In other words, they need only appear to have balls. Chimp females, however, are promiscuous, so the males who ejaculate frequently and copiously are more likely to have offspring. Male chimps really do need big balls. Which is better – to look big and ferocious but have modest testicles and limited orgasms, or to be small and unintimidating but have huge balls and come like an exploding supernova? Nature seems to be teaching the same lesson as Stoic philosophers – that the little guy unconcerned with appearances has lots more fun.