After each answer, she immediately told her volunteers what the real chances were of such an event happening. So, if someone thought they had a 10 per cent chance of developing cancer, she would reveal that the real probability was 30 per cent – or quite a lot worse.
What Sharot discovered was that when her subjects were given bad news, news that should have led them to be more concerned than they were previously, the part of their brains that should have fixed the mismatch between their prediction and the true chance of disaster showed only low-level activation.
However, when a subject was given information that was better than expected – for example, if someone thought they had a 50 per cent chance of being burgled, but was then told that their real chance of being burgled was only 30 per cent – the part of their brain that processed the information went wild. What’s more, when the volunteers were asked to go through the list of unpleasant events again, they actively changed their beliefs when the information they had been given improved their prospects. For example, if they found out that they were actually less likely to suffer from infertility than they had thought, they adjusted to the new risk percentages presented. But if the new information pointed to there being an even higher chance of something bad happening to them, they simply ignored it.
When it comes to things that affect us directly, it seems that many of us dismiss information that suggests that bad things will happen to us, and only pay attention to the good stuff. No matter what evidence we’re confronted with, unconscious processes in our brains are determined to show us a rosy glow.
There are obvious dangers to this when it comes to making decisions. If your unconscious belief is that you won’t get lung cancer from smoking, then you’re unlikely to choose to quit. For every warning from an anti-smoking campaigner, your brain will be giving a lot more weight to that story of the ninety-nine-year-old lady who smokes fifty cigarettes a day but is still going strong. You’re not doing this consciously, but it is happening.
Similarly, if you’re a trader buying and selling stocks and shares, or an investor looking to buy another property, you’ll be paying more attention to evidence of sustained growth and stories of rising prices than to the nay-sayers who predict a crash – a partial explanation for financial and housing bubbles.30
The inability to properly process news that suggests something bad may happen to us is clearly a dangerous trait for nearly all decision-makers – not just traders or smokers or property speculators. Dr Sharot’s research reveals that 80 per cent of us are very vulnerable to this mental lapse.31 Interestingly, however, there is one group of people who, it turns out, update their beliefs in a more balanced way: people with mild depression appear to be better at balancing the good and the bad when they receive information.
If you’re not depressed yourself, however, don’t despair. Being aware that you’re prone to this thinking error is a start – it means you can challenge your immediate reactions and reflect upon how your decisions would be affected if your optimism were overstated. You might also, now that you are cognisant of your ability to trip yourself up, want to take out insurance against the worst happening. As Sharot advises, do carry an umbrella even if you’re sure the sun will shine, and do buy health insurance even if you’re certain nothing bad could ever happen to you.32
How Not to Spot Aspirin Poisoning
It’s not just bad news that we have a tendency to unwittingly dismiss.
Take what happened to Dr Harrison J. Alter, then an emergency-room physician in Tuba City, Arizona, a small town in the Painted Desert just a sixty-mile drive from the Grand Canyon National Park.33
Alter isn’t your usual medical-school type. Born in Chicago, he attended the prestigious Francis Parker prep school before heading off to Brown to get an undergraduate degree in comparative literature and art history.34 He still lists reading among his interests, along with his children,35 but his area of professional expertise is medical, and in particular the hullabaloo of the emergency room.
Alter’s emergency room was a focal point of medical care for the Navajo Nation, and during the winter of 2003 he took in a routine admission: Blanche Begaye, a Navajo woman in her sixties. Blanche explained that she was having trouble breathing. At first she thought she just had ‘a bad head cold’, and did what most of us would do: she drank lots of orange juice and tea, popped a few aspirin, and expected that to be the end of it. But it wasn’t. She got worse.
Dr Alter knew that Blanche worked in a grocery store on the reservation. He also knew that something was amiss within her community. Lately, his emergency room had been full of people with similar symptoms to Blanche’s. He’d diagnosed them all with viral pneumonia, a nasty lung infection that can knock you out for weeks. Alter couldn’t help noticing that Blanche had quite similar symptoms.
After Blanche was admitted, Alter went through the normal procedures. First came the observations. He noticed that her respiratory rate was almost twice normal. On top of that, she had a low-grade fever – her temperature was up, but it wasn’t through the roof. Next came the tests – in particular, the bloods. The first thing he checked was the white-blood-cell count, a typical marker of infection. It wasn’t actually raised, but her blood chemistry showed that the acid-base balance of her blood was now weighted towards acid, a red flag for a major infection.
Alter totted up the signs, and plumped for a diagnosis of ‘subclinical viral pneumonia’. Mrs Begaye’s X-ray showed that she didn’t have the classic white streaks across her lungs, but he reasoned that this was because her illness was in its early stages. From here on it was routine: admit her, work on getting that fever down, and keep an eye on her heart rate. He had her hooked up to some intravenous fluids and medicine, and put her under observation. Then he moved on to the next bed.
Case closed.
Or so he thought.
A few minutes after Alter had begun evaluating his next patient, the junior doctor to whom he had passed Mrs Begaye’s case attracted his attention. Thankfully for Mrs Begaye, this wasn’t the type of subordinate who was too scared to speak up when he disagreed with his boss. ‘That’s not a case of viral pneumonia,’ he told Alter. ‘She has aspirin toxicity.’
How could Dr Alter, who had not only studied medicine at the prestigious University of California at Berkeley, but following his residency had gone to the University of Washington in Seattle to study medical decision-making, have got his diagnosis so wrong? As the always open-to-learning doctor later reflected, ‘Aspirin poisoning, bread-and-butter toxicology … She was an absolutely classic case – the rapid breathing, the shift in her blood electrolytes … and I missed it.’
It wasn’t that he hadn’t been asking the right questions, or conducting the right tests to get to this diagnosis. He had.
Dr Alter had all the information he needed to make the correct diagnosis right in front of him. The trouble was that even though the information had been plain to see, he hadn’t taken it in. And he hadn’t taken it in because when he saw Mrs Begaye, what came to mind were all the recent cases of pneumonia he’d been seeing.
This meant that instead of treating her story as an independent one, and focusing on all the information she gave him, he had zoomed in on the symptoms that fitted the pneumonia diagnosis, and ignored or reasoned away the information that didn’t.
This is a common thinking error we are all prone to. In fact, it turns out that we actually get a dopamine rush when we find confirming data, similar to the one we get if we eat chocolate, have sex, or fall in love.36