
The Age of Absurdity: Why Modern Life Makes it Hard to Be Happy (2010)


by Michael Foley


  As for evolutionary psychology (EP), the theory is that behaviour is determined by the human brain evolving in certain ways as a result of natural selection during the Pleistocene period, with no development since because there has not been enough time. EP explanations of behaviour can seem dauntingly scientific but often the reasoning is dubious and the evidence thin or nonexistent. This is a self-validating theory like that of divine will. If everything that happens is planned by God, then the task of establishing meaning is to provide plausible divine intentions – and these, in turn, validate the original theory. Similarly, if everything we think, feel and do is the result of a survival adaptation, then the task is to suggest plausible adaptation stories – and this requires only imagination since there is little evidence of what went on in the Pleistocene period. For instance, I could argue that imagination itself evolved because the ability to con the gullible greatly improved survival prospects.

  There is a Steve on each side of the determinism debate – so both sides can say, ‘Our Steve is smarter than your Steve’. The psychologist and determinist Steven Pinker has argued that, as a consequence of evolution in the African savannah, humans have a universal preference for art depicting green landscapes and water.137 But I can supply evidence disproving the ‘universal preference’ – Duke Ellington hated grass because it reminded him of graves.

  The biologist Steven Rose suggests that the preference for greenery, if indeed it exists, is as likely to be due to the pastoral nostalgia of urban societies. I can also propose evidence for this theory – the hunger for landscape art is overwhelmingly strong in southern England, which also happens to be nearly all motorways and concrete.

  Rose argues that living organisms are not mere passive vehicles separating genes and environment. ‘Rather, organisms actively engage in constructing their environments, constantly choosing, absorbing and transforming the world around them. Every living creature is in constant flux, always at the same time both being and becoming.’138

  Neuroscientists have also challenged the ‘hard-wired brain’ theory, suggesting instead that the human brain is extraordinarily plastic. Far from being fixed millions of years ago, the individual brain constantly rewires itself throughout a lifetime in response to experience. It is true that, broadly speaking, specific functions are carried out by specific parts of the brain, but the detailed processing is likely to be different in each brain. And, if a functional area is damaged, the brain may be able to rewire itself to process the function in a different way. More importantly for everyday life, almost any form of persistent, attentive activity produces new brain configurations. The neurons that fire together wire together. Musicians who play stringed instruments have larger brain maps for the left hand, taxi drivers have larger hippocampi (the area that stores spatial information), experienced meditators have bigger and thicker prefrontal cortices (the area responsible for attention and concentration). The bad news is that less desirable activities – anxieties, obsessions, compulsions, addictions, bad habits – also develop their own dedicated brain networks, which become efficient and self-sustaining and difficult to change.139

  So, the ‘chemical imbalance in the brain’ form of determinism, the supposed cause of ‘disorders’, may be a confusion of cause and effect. If certain chemical brain states correlate with behaviour, it may be the behaviour that has produced the states rather than the other way round. For instance, there is a high correlation between attention disorders and television viewing in early childhood.140

  There is no justification for the old excuse, ‘This is just how I am.’ Individual temperament (formed by a combination of genetic inheritance, family influences and cultural factors) certainly encourages attitudes, behaviours and moods and is extremely hard to override, never mind permanently change. This is what psychology defines as the ‘set point’. But, as well as temperament, there is character. Temperament is what you are – but character is what you do. Temperament is a given; character may be forged. We can choose to oppose the dictates of temperament and, if we act differently in a certain way for long enough, the new behaviour will establish its own brain connections. As Hamlet says to his weak mother: ‘Use, almost, can change the stamp of nature.’141 That ‘almost’ is the touch of genius. Shakespeare seems to have understood even the nature versus nurture debate – and, as always, avoided taking sides.

  The word ‘character’, however, has a fatally old-fashioned ring. The age of entitlement does not seek character, which demands obligation, but identity, which demands rights. Identity can be sought in money, status or celebrity but is most easily conferred by belonging to a group – usually based on ethnicity, race, religion or sexual orientation. The group will be especially attractive if it can claim to have suffered injustice. Then its members can be victims and enjoy the luxury of having someone else to blame.

  And blame is the new solution to the contemporary inability to accept random bad luck. Once misfortune was explained as the mysterious ways of God – the suffering had a purpose, which would be revealed in the fullness of time. Now, what makes misfortune meaningful is culpability. Someone must be to blame and it is never the victim. Shit happens – but it is always some other shit’s fault. Just as the pharmaceutical industry is happy to cash in on blaming disorders, the legal profession is more than willing to be paid for blaming the other shit. The medical profession is also willing to oblige. The British Medical Journal recently initiated an extraordinary project aimed at removing the word ‘accident’ from the English language. ‘Purging a common term from our lexicon will not be easy’, conceded the august journal. Nevertheless, ‘The BMJ has decided to ban the word accident.’142

  This inability to accept randomness is what makes conspiracy theories so attractive. Such theories invest the banal and random with glamour and significance, and put the blame for personal irresponsibility on secret, sinister forces. It is so much more satisfactory to believe that Diana, Princess of Wales, was murdered in a car crash engineered by the Duke of Edinburgh and that Marilyn Monroe was killed by a poisoned suppository inserted on the orders of Robert Kennedy. The pitiful truth would expose too much personal irresponsibility – that one was killed by a drunk driver and the other by alcohol and drug abuse.

  The problem with shifting blame is that now no one is prepared to accept blame. Here is a twenty-first century story. A 37-year-old man, Gary Hart, divorced from his first wife and separated from his second, meets a woman called Kristeen Panter in an internet chat room and stays up until 5 a.m. talking to her online. Then he sets off on a 145-mile journey driving a Land Rover with a trailer. But he falls asleep at the wheel and runs off the road down an embankment into the path of a train, causing a crash that kills ten people and injures seventy-six more. Hart (‘My life is 1,000 miles per hour – it’s just the way I live’143) is charged with dangerous driving but, at the trial, denies falling asleep and claims that the problem was mechanical failure, though a meticulous reconstruction of the Land Rover has revealed no problems. Hart is convicted and given a five-year prison sentence. When he gets out of prison he appears on a television documentary about the incident, denies any responsibility and claims that he, too, is a victim. Invited to consider photographs of the carnage, Hart does express sadness – but for the mangled wreckage of his Land Rover: ‘I loved that old truck.’144

  Of course, this story is only one extreme example – but it would be difficult to imagine an attitude like Hart’s in an earlier era.

  Parallel to the refusals of responsibility are the claims to deserve. Everyone now deserves a holiday (meaning not just a break but a trip abroad to a desirable location); students invariably deserve higher grades (regardless of assessment criteria, the argument is always, ‘but I spent x hours on this’); employees deserve promotion (even when they meet none of the requirements for the new level); artistes deserve more recognition (everything written deserves to be published, everything painted deserves to be exhibited, every performer deserves a stage); lovers deserve a dream partner next time (not despite but because of all the past failures they themselves probably caused but for which they accept no responsibility). Failure is an obsolete concept. No one is willing to accept that few are worthy of high grades or artistic recognition and that there is no such thing as a dream partner. So failure is the new taboo F-word. In an initiative comparable to that of the British Medical Journal, my own university has come up with an imaginative solution. Students achieving less than 40 per cent in a module are recorded as having not taken it – which not only avoids the F-word but suggests that the embarrassing lapse never happened at all.

  This sense of deserving has surely been a factor in the growth of debt. The development of entitlement since the 1970s coincides exactly with a steady rise in personal debt. If you are entitled to a certain lifestyle then borrowing the money to fund it is simply claiming what is rightfully yours – and there is no obligation to pay it back. So the lender attempting to recover money is an ugly bully harassing an innocent victim. Attitudes to debt are a great example of how cultural conditioning can change: not so long ago debt was a sin, then an unpleasant necessity for buying a home, then the way to fund a deserved lifestyle and finally something so obviously good that only a fool would refuse it. At this stage the debt house of cards became so ridiculously huge that the removal of one card was almost enough to destroy the world’s financial systems. And, of course, everyone blamed the bankers for the disastrous consequences. Drag out the bankers and hang them!

  The problem with an overwhelming sense of entitlement is that it promises satisfaction but usually delivers its opposite. Entitlement encourages all three of Albert Ellis’s disastrous ‘musts’ – ‘I must succeed’, ‘Everyone must treat me well’, ‘The world must be easy’. And when none of these happens, the conclusion is not that the demands were unjustified but that malign, powerful, hidden forces are denying them. So the sense of entitlement becomes a sense of bitter grievance.

  Another consequence of entitlement is the contemporary worship of ‘diversity’ and the, often concomitant, belief that the demands of all groups are equally valid. The problem is that there are two types of diversity – diversity of opportunity, which is a question of rights, and diversity of ethics, which is a question of values – and the necessity of recognizing the first has led to unthinking acceptance of the second. Demanding justice for minorities who have suffered discrimination on the basis of race, ethnicity, gender or sexual orientation is entirely valid. But ethical diversity is a contradiction in terms. If the values of others are valid, then one’s own must be equally arbitrary and therefore without value. The inevitable consequence of this relativism is a fatal loss of nerve – it becomes impossible to uphold values and make value judgements. On contentious issues we murmur that there is much to be said on both sides. About political conflicts we say, ‘One side is as bad as the other’, and about politicians, ‘One is as bad as the other’. We see people making foolish decisions that will inevitably lead to disaster but we say nothing to them; to ourselves we say, ‘I have no right to intervene’, ‘The advice would be rejected’, ‘It would only cause divisiveness’, and ‘What do I know about anything anyway?’

  This leads to abdication of authority and the bizarre but common reversals of children bullying their parents, students assessing their teachers and employees exploiting their bosses.

  And, in the absence of values and principles, ethics becomes merely legalism, restricted to situations and transactions, a matter of resolving dilemmas and drawing up contracts – I agree to do this if you agree to do that.

  Another problem with ‘celebrating diversity’ is that it aims to promote inclusiveness but often promotes its opposite, separatism. The groups who feel deprived of rights blame other groups and demand to be separated from them. If the group is ethnically or religiously based it will demand its own country. And, if not its very own country, then at least a substantial chunk of some other group’s country. But separatism, rather than easing divisions, reinforces and exacerbates them. Sartre described the ugly consequences of Us-and-Them consciousness, and psychology experiments have demonstrated that even artificial and arbitrary separation can cause conflict. In fact, the resulting conflicts were so serious that this type of experiment is now considered too dangerous. One of the last was undertaken in 1966 by Muzafer Sherif on a group of eleven- and twelve-year-olds living harmoniously in a large cabin at a holiday camp. Sherif divided this group into two, with pairs of friends deliberately split up, and put each of the new groups in a separate cabin. Soon there was tension between the cabins, with taunting and insults becoming common, and even former friends coming to hate each other. Over time aggressive leaders emerged in each cabin.145 So an entirely arbitrary separation produced a division, which became increasingly bitter. The lesson is that separatism causes the very problems it is supposed to prevent, which is then used as evidence for the bigotry that motivated the separation in the first place, so making the separatism even more strident.

  But perhaps the worst consequence of entitlement is a sense of grievance – which encourages the human tendency to whinge. To my knowledge no major thinker has ever recommended or endorsed whingeing. Philosophy from the Stoics to the existentialists rings with denunciations of complaint. Has anyone ever become happier by whingeing?

  There is often a temptation to think that one could be happier if only responsibility could be evaded or transferred to someone else, which explains the growing numbers of consultants, advisors, instructors, gurus, therapists, counsellors, personal trainers and, the inevitable development, life coaches. In Don DeLillo’s satirical novel White Noise, the narrator’s wife teaches an adult course on ‘Standing, Sitting and Walking’, which is such a success that she is asked to develop another course on ‘Eating and Drinking’. When the narrator suggests that this might involve labouring the obvious, she explains that people need to be reassured by someone in a position of authority.146

  Being instructed may seem a luxury, but philosophers and psychologists agree that only personal responsibility brings fulfilment. This was demonstrated by a famous study in which elderly residents on two floors of a care home were given plants for their rooms. On one floor residents were permitted to choose and water plants; on the other floor the plants were distributed and maintained by staff. On the floor with control the residents became happier, more active and alert and required less medication. And similar results were observed in other studies involving choice of films and timing of visits from volunteers. Conversely, loss of control caused unhappiness and depression. (But would it have been different if they had been card-carrying stoics? Is awareness and acceptance of lack of control itself a form of control?) Even more surprisingly, it was discovered in the six-month follow-up to the plant study that twice as many no-control as control residents had died (30 per cent compared to 15 per cent). So personal responsibility may be a matter of life and death.147

  The less personal responsibility is exercised, the greater the likelihood of conformity. A series of classic experiments on conforming was conducted in 1955 by the psychologist Solomon Asch. Volunteers were required to perform a simple matching test. When left alone, they got it right 99 per cent of the time. But when assigned to a group (all the experimenter’s accomplices except for the single volunteer) that every so often gave a unanimous wrong answer, volunteers agreed with the incorrect group answer 70 per cent of the time. And, when informed of the deception afterwards and invited to estimate the extent of their conformity, all the volunteers underestimated it.148

  The interesting question, of course, is the mental process of conforming – how do people convince themselves to accept things they would otherwise reject as wrong? When the Asch experiments were repeated recently using brain imaging on the volunteers, the incorrect group-influenced judgements caused changes in the brain areas dedicated to vision and spatial awareness but no changes in the areas for monitoring and resolving conflict. So the alarming conclusion was that no self-convincing seemed to be needed – the volunteers actually saw what the group only claimed to see. As Gregory Berns, the neuroscientist who conducted the new research, concluded, ‘We like to think that seeing is believing, but the study’s findings show that seeing is believing what the group tells you to believe.’149 As for independent judgements consciously disagreeing with those of the group, these caused activity in the brain area associated with emotion, suggesting that autonomy and opposition are stressful. This stress was shown to be justified in other experiments simulating jury discussions where a minority opposed a majority verdict, the scenario in the film 12 Angry Men. The minority view prevailed if it was expressed consistently, confidently and undogmatically – but no one liked the minority people. This is evidence of the crank effect: promoting principle and truth may eventually be effective, but the promoters will be dismissed as cranks.

  Even more shocking were the Milgram experiments on obedience where, at the behest of a grave authoritarian figure in a laboratory coat, volunteers administered what they believed to be electric shocks of increasing intensity to people (middle-aged and mild-mannered) who answered questions incorrectly. Before the experiment Stanley Milgram invited forty psychiatrists to estimate the level of volunteer compliance. Their view was that only 1 per cent of sadists would continue to the maximum shock level. In fact 65 per cent went all the way to 450 volts, despite hearing what they believed were appeals to desist and even screams of pain. And, if the volunteers were permitted to delegate the actual operation of the shock lever to someone else, the compliance level rose to 90 per cent. The only good news was that the level could be reduced to 10 per cent if volunteers saw someone else refusing to administer shocks.150 These variations demonstrate once again the power of example, good and bad.

 
