
The Idiot Brain


by Dean Burnett


  Crucially, this also means that many of our earliest memories are formed in a world that is seemingly organised and controlled by powerful figures who are hard to understand (rather than a world that is just random or chaotic). Such notions can be deeply entrenched, and that belief system can be carried into adulthood. It is more comforting for some adults to believe that the world is organised according to the plans of powerful authority figures, be they wealthy tycoons, alien lizards with a penchant for human flesh, or scientists.

  The previous paragraph may suggest that people who believe in conspiracy theories are insecure, immature individuals, subconsciously yearning for parental approval that was never forthcoming as they grew up. And no doubt some of them are, but then so are countless people who aren’t into conspiracy theories; I’m not going to ramble on for several paragraphs about the risks of making ill-founded connections between two unrelated things and then do exactly that myself. What’s been said is just a way of suggesting means by which the development of the brain may make conspiracy theories more ‘plausible’.

  But one prominent consequence (or it might be a cause) of our tendency to look for patterns is that the brain really doesn’t handle randomness well. The brain seems to struggle with the idea that something can happen for no discernible reason other than chance. It might be yet another consequence of our brains seeking danger everywhere – if there’s no real cause for an occurrence then there’s nothing that can be done about it if it ends up being dangerous, and that’s not tolerable. Or it might be something else entirely. Maybe the brain’s opposition to anything random is just a chance mutation that proved useful. That would be a cruel irony, if nothing else.

  Whatever the cause, the rejection of randomness has numerous knock-on consequences, one of which is the reflex assumption that everything that happens does so for a reason, often referred to as ‘fate’. In reality, some people are just unfortunate, but that’s not an acceptable explanation for the brain, so it has to find one and attach a flimsy rationale. Having a lot of bad luck? Must be that mirror you broke, which contained your soul, which is now fractured. Or maybe it’s that you’re being visited by mischievous fairies; they hate iron, so keep a horseshoe around, that’ll keep them away.

  You could argue that conspiracy theorists are convinced that sinister organisations are running the world because that’s better than the alternative! The idea that all of human society is just bumbling along due to haphazard occurrences and luck is, in many ways, more distressing than there being a shadowy elite running things, even if it is for its own ends. Better a drunk pilot at the controls than nobody at all.

  In personality studies, this concept is called the ‘locus of control’ and refers to the extent to which individuals believe they can control the events affecting them.6 The greater your locus of control, the more ‘in control’ you believe you are (the extent to which you really are in control of events is irrelevant). Exactly why some people feel more in control than others is a poorly understood area; some studies have linked an enlarged hippocampus to a greater locus of control,7 but the stress hormone cortisol can apparently shrink the hippocampus, and people who feel less in control tend to be more easily stressed, so the hippocampus size may be a consequence rather than a cause of the locus of control.8 The brain never makes anything easy for us.

  Anyway, a greater locus of control means you may end up feeling you can influence the cause of these occurrences (a cause which doesn’t actually exist, but no matter). If it’s superstition, you throw salt over your shoulder or touch wood or avoid ladders and black cats, and are thus reassured that your actions have prevented catastrophe via means that defy all rational explanation.

  Individuals with an even greater locus of control try to undermine the ‘conspiracy’ they see by spreading awareness of it, looking ‘deeper’ into the details (reliability of the source is rarely a concern) and pointing them out to anyone who’ll listen, and declaring all those who don’t to be ‘mindless sheep’ or some variation thereof. Superstitions tend to be more passive; people can just adhere to them and go about their day as normal. Conspiracy theories tend to involve a lot more dedication and effort. When was the last time someone tried to convince you of the hidden truth behind why rabbit’s feet are lucky?

  Overall, it seems the brain’s love of patterns and hatred of randomness leads many people to make some pretty extreme conclusions. This wouldn’t really be an issue, but the brain also makes it very hard to convince someone that their deeply held views and conclusions are wrong, no matter how much evidence you have. The superstitious and the conspiracy theorists maintain their bizarre beliefs despite everything the rational world throws at them. And it’s all thanks to our idiot brains.

  Or is it? Everything I’ve said here is based on the current understanding provided by neuroscience and psychology, but then that understanding is rather limited. The very subject matter alone is so hard to pin down. What is a superstition, in the psychological sense? What would one look like in the terms of brain activity? Is it a belief? An idea? We might have advanced to the point where we can scan for activity in the working brain, but just because we can see activity doesn’t mean we understand what it represents, any more than being able to see a piano’s keys means we can play Mozart.

  Not that scientists haven’t tried. For example, Marjaana Lindeman and colleagues performed fMRI scans of twelve self-described supernatural believers and eleven sceptics.9 The subjects were told to imagine a critical life situation (such as imminent job loss or relationship breakdown) and were then shown ‘emotionally charged pictures of lifeless objects and scenery (for example, two red cherries bound together)’ – the sort of images you’d see on motivational posters; a spectacular mountain top, that sort of thing. Supernatural believers reported seeing hints and signs of how their personal situation would resolve in the image; if imagining a relationship breakdown, they would feel it would be all right because the two cherries bound together signified firm ties and commitment. The sceptics, as you’d expect, didn’t do this.

  The interesting element of this study is that viewing the pictures activated the left inferior temporal gyrus in all subjects, a region associated with image processing. In the supernatural believers, much less activity was seen in the right inferior temporal gyrus when compared with the sceptics. This region has been associated with cognitive inhibition, meaning it modulates and reduces other cognitive processes.10 In this case, it may be suppressing the activity that leads to forming illogical patterns and connections, which would explain why some people are quick to believe in irrational or unlikely occurrences while others require serious convincing; if the right inferior temporal gyrus is weak, the more irrational-leaning processes in the brain exert more influence.

  This is far from a conclusive experiment though, for many reasons. For one, it’s a very small number of subjects, but, mainly, how does one measure or determine one’s ‘supernatural leanings’? This isn’t something covered by the metric system. Some people like to believe they’re totally rational, but this itself may be an ironic self-delusion.

  It’s even worse studying conspiracy theories. The same rules apply, but it’s harder to get willing subjects, given the subject matter. Conspiracy theorists tend to be secretive, paranoid and distrustful of recognised authorities, so if a scientist were to say to one, ‘Would you like to come to our secure facility and let us experiment on you? It may involve being confined in a metal tube so we can scan your brain’, the answer is unlikely to be yes. So all that’s included in this section is a reasonable set of theories and assumptions based on the data we currently have available.

  But then, I would say that, wouldn’t I? This whole chapter could be part of the conspiracy to keep people in the dark …

  Some people would rather wrestle a wildcat than sing karaoke

  (Phobias, social anxieties and their numerous manifestations)

  Karaoke is a globally popular pastime. Some people love getting up in front of (usually quite intoxicated) strangers and singing a song that they’re often only vaguely familiar with, regardless of their singing ability. There haven’t been experiments on this but I’d posit there is an inverse relationship between enthusiasm and ability. Consumption of alcohol is almost certainly a factor in this trend. And in these days of the televised talent contest, people can sing in front of millions of strangers rather than a small crowd of uninterested drunks.

  To some of us, this is a terrifying prospect. The stuff nightmares are made of, in fact. You ask certain people if they want to get up and sing in front of a crowd and they’ll react as if you’ve just told them they’ve got to juggle live grenades in the nude while all their ex-partners are watching. The colour will drain from their faces, they’ll tense up, start breathing rapidly, and exhibit many other classic indicators of the fight-or-flight response. Given the choice between singing and taking part in combat, they’ll happily engage in a fight to the death (unless there’s an audience for that, too).

  What’s going on there? Whatever you think of karaoke, it’s risk free, unless the crowd is made up of steroid-abusing music lovers. Sure, it can go badly; you might mangle a tune so awfully that everyone listening ends up begging for the sweet relief of death. But so what? So a few people you’ll never meet again consider your singing abilities to be below par. Where’s the harm in that? But as far as our brains are concerned, there is harm in that. Shame, embarrassment, public humiliation; these are all intense negative sensations that nobody but the most dedicated deviant actively seeks out. The mere possibility of any (or all) of these occurring is enough to put people off most things.

  There are many things people are afraid of that are far more mundane than karaoke: talking on the telephone (something I myself avoid wherever possible), paying for something with a queue behind you, remembering a round of drinks, giving presentations, getting a haircut – things millions of people do every day without incident but that still fill some people with dread and panic.

  These are social anxieties. Practically everyone has them to some extent, but if they get to the point where they are actually disruptive and debilitating to a person’s functioning, they can be classed as a social phobia. Social phobias are the most common of several manifestations of phobias, so to understand the underlying neuroscience let’s step back a bit and look at phobias in general.

  A phobia is an irrational fear of something. If a spider lands on your hand unexpectedly and you yelp and flail a bit, people would understand; a creepy-crawly surprised you, people don’t like insects touching them, so your reaction is justifiable. If a spider lands on your hand and you scream uncontrollably while knocking tables over before scrubbing your hand in bleach, burning all your clothes then refusing to leave your house for a month, then this may be considered ‘irrational’. It’s just a spider, after all.

  An interesting thing about phobias is that people who have them are usually completely aware of how illogical they are.11 People with arachnophobia know, on a conscious level, that a spider no bigger than a penny poses no danger to them, but they can’t help their excessive fear reaction. This is why the stock phrases used in response to someone’s phobia (‘It won’t hurt you’) are well meant but utterly pointless. Knowing that something isn’t dangerous doesn’t make much difference, so the fear we associate with the trigger obviously goes deeper than the conscious level, which is why phobias can be so tricky and persistent.

  Phobias can be classed as specific (or ‘simple’) or complex. Both of these labels refer to the source of the phobia. Simple phobias involve a certain object (for example, knives), animal (spiders, rats), situation (being in a lift) or thing (blood, vomiting). As long as the individual avoids these things, they’re able to go about their business. Sometimes it’s impossible to avoid the triggers completely, but they’re usually transient; you might be scared of lifts, but a typical lift journey lasts seconds, unless you’re Willy Wonka.

  There are a variety of reasons for exactly how these phobias originate. At the most fundamental level, we have associative learning, attaching a specific response (such as a fear reaction) to a specific stimulus (such as a spider). Even the most neurologically uncomplicated creatures seem capable of it, such as Aplysia, aka the California sea slug, a very simple metre-long aquatic gastropod that was used in the 1970s in the earliest experiments to monitor neuronal changes occurring in learning.12 They may be simple and have a rudimentary nervous system by human standards, but they can show associative learning and, more importantly, have massive neurons, big enough to stick electrodes in to record what’s going on. Aplysia neurons can have axons (the long ‘trunk’ part of a neuron) up to a millimetre in diameter. This might not sound like much, but it’s comparatively vast. If human neuron axons were the size of a drinking straw, Aplysia axons would be the size of the Channel Tunnel.

  Big neurons wouldn’t be of any use if the creatures couldn’t show associative learning, which is the point here. We’ve hinted at this before; in the section on diet and appetite in Chapter 1, it was observed how the brain can make the cake–illness association and you feel sick just thinking about it. The same mechanism can apply to phobias and fears.

  If you get warned against something (meeting strangers, electrical wiring, rats, germs), your brain is going to extrapolate all the bad things that could happen if you encounter it. Then you do encounter it, your brain calls up all these ‘likely’ scenarios, and it activates the fight-or-flight response. The amygdala, responsible for encoding the fear component of memory, attaches a danger label to memories of the encounter. So, the next time you encounter this thing, you’ll remember danger, and have the same reaction. When we learn to be wary of something, we end up fearing it. In some people, this can end up as a phobia.

  This process implies that literally anything can become the focus of a phobia, and if you’ve ever seen a list of existing phobias this seems to be the case. Notable examples include turophobia (fear of cheese), xanthophobia (fear of the colour yellow, which has obvious overlaps with turophobia), hippopotomonstrosesquipedaliophobia (fear of long words, because psychologists are basically evil) and phobophobia (fear of having a phobia, because the brain regularly turns to the concept of logic and says, ‘Shut up, you’re not my real dad!’). However, some phobias are considerably more common than others, suggesting that there are other factors at play.

  We have evolved to fear certain things. One behavioural study taught chimps to be afraid of snakes. This is a relatively straightforward task, usually involving showing them a snake and following this with an unpleasant sensation, like a mild electric shock or unpleasant food, just something they want to avoid if possible. The interesting part is that when other chimps saw them react fearfully to snakes, they quickly learned to fear snakes too, without having been trained.13 This is often described as ‘social learning’.*

  Social learning and cues are incredibly powerful, and the brain’s ‘better safe than sorry’ approach when it comes to dangers means if we see someone being afraid of something, there’s a good chance we’ll be afraid of it too. This is especially true during childhood, when our understanding of the world is still developing, largely via the input of others who we assume know more than we do. So if our parents have a particularly strong phobia, there’s a good chance we’ll end up with it, like a particularly unsettling hand-me-down. It makes sense: if a child sees a parent, or their primary educator/teacher/provider/role model, start shrieking and flapping because they’ve seen a mouse, this is bound to be a vivid and unsettling experience, one that makes an impression on a young mind.

  The brain’s fear response means phobias are hard to get rid of. Most learned associations can be removed eventually via a process established in Pavlov’s famous dogs experiment. A bell was associated with food, prompting a learned response (salivation) whenever it was heard, but if the bell was then rung repeatedly in the continued absence of food, eventually the association faded. This same procedure can be used in many contexts, and is known as extinction (not to be confused with what happened to the dinosaurs).16 The brain learns that a stimulus, such as the bell, isn’t associated with anything and therefore doesn’t require a specific response.

  You’d think that phobias would be subject to a similar process, given how almost every encounter with their cause results in no harm whatsoever. But here’s the tricky part: the fear response triggered by the phobia justifies it. In a masterpiece of circular logic, the brain decides that something is dangerous, and as a result it sets off the fight-or-flight response when it encounters it. This causes all the usual physical reactions, flooding our systems with adrenalin, making us tense and panicked and so on. The fight-or-flight response is biologically demanding and draining and often unpleasant to experience, so the brain remembers this as ‘The last time I met that thing, the body went haywire, so I was right; it is dangerous!’ and thus the phobia is reinforced, not diminished, regardless of how little actual harm the individual came to.

  The nature of the phobia also plays a part. Thus far we’ve described the simple phobias (phobias triggered by specific things or objects, having an easily identified and avoidable source), but there are also complex ones (phobias triggered by more complicated things such as contexts or situations). Agoraphobia is a type of complex phobia, generally misunderstood as fear of open spaces. More precisely, agoraphobia is a fear of being in a situation where escape would be impossible or help would be absent.17 Technically, this can be anywhere outside the person’s home, hence severe agoraphobia prevents people from leaving the house, leading to the ‘fear of open spaces’ misconception.

  Agoraphobia is strongly associated with panic disorder. Panic attacks can happen to anyone – the fear response overwhelms us and we can’t do anything about it and we feel distressed/terrified/can’t breathe/sick/head spins/trapped. The symptoms vary from person to person, and an interesting article by Lindsay Holmes and Alissa Scheller for the Huffington Post in 2014 entitled ‘This is what a panic attack feels like’ collected some personal descriptions from sufferers, one of which was: ‘Mine are like I can’t stand up, I can’t speak. All I feel is an intense amount of pain all over, like something is just squeezing me into this little ball. If it is really bad I can’t breathe, I start to hyperventilate and I throw up.’

 
