Why We Make Bad Decisions

Six years ago I was struck down with a mystery illness. My weight dropped by 30 pounds in three months. I experienced searing stomach pain, felt utterly exhausted and no matter how much I ate, I couldn’t gain an ounce.

I went from slim to thin to emaciated. The pain got worse, a white heat in my belly that made me double up unexpectedly in public and in private. Delivering on my academic and professional commitments became increasingly challenging.

It was terrifying. I did not know whether I had an illness that would kill me or stay with me for the rest of my life, or whether it was something that could be cured, if only I could find out what on earth it was.

Trying to find the answer, I saw doctors in London, New York, Minnesota and Chicago.

I was offered a vast range of potential diagnoses. Cancer was quickly and thankfully ruled out. But many other possibilities remained on the table, from autoimmune diseases to rare viruses to spinal conditions to debilitating neural illnesses.

Treatments suggested ranged from a five-hour, high-risk surgery to remove a portion of my stomach, to lumbar spine injections to numb nerve paths, to a prescription of antidepressants.

Faced with all these confusing and conflicting opinions, I had to work out which expert to trust, whom to believe and whose advice to follow. As an economist specializing in the global economy, international trade and debt, I have spent most of my career helping others make big decisions — prime ministers, presidents and chief executives — and so I’m all too aware of the risks and dangers of poor choices in the public as well as the private sphere. But up until then I hadn’t thought much about the process of decision making. So in between M.R.I.’s, CT scans and spinal taps, I dove into the academic literature on decision making. Not just in my field but also in neuroscience, psychology, sociology, information science, political science and history.

What did I learn?

Physicians do get things wrong, remarkably often. Studies have shown that up to one in five patients are misdiagnosed. In the United States and Canada it is estimated that 50,000 hospital deaths each year could have been prevented if the real cause of illness had been correctly identified.

Yet people are loath to challenge experts. In a 2009 experiment carried out at Emory University, a group of adults was asked to make a decision while contemplating the claims of an expert, in this case a financial expert. A functional M.R.I. scanner gauged their brain activity as they did so. The results were extraordinary: when confronted with the expert, it was as if the independent decision-making parts of many subjects' brains pretty much switched off. They simply ceded their power to decide to the expert.

If we are to control our own destinies, we have to switch our brains back on and come to our medical consultations with plenty of research done, able to use the relevant jargon. If we can’t do this ourselves we need to identify someone in our social or family network who can do so on our behalf.

Anxiety, stress and fear — emotions that are part and parcel of serious illness — can distort our choices. Stress makes us prone to tunnel vision, less likely to take in the information we need. Anxiety makes us more risk-averse than we would otherwise be, and more deferential.

We need to know how we are feeling. Mindfully acknowledging our feelings serves as an “emotional thermostat” that recalibrates our decision making. It’s not that we can’t be anxious, it’s that we need to acknowledge to ourselves that we are.

It is also crucial to ask probing questions not only of the experts but of ourselves. This is because we bring into our decision-making process flaws and errors of our own. All of us show bias when it comes to what information we take in. We typically focus on anything that agrees with the outcome we want.

We need to be aware of our natural born optimism, for that harms good decision making, too. The neuroscientist Tali Sharot conducted a study in which she asked volunteers what they believed the chances were of various unpleasant events' occurring — events like being robbed or developing Parkinson's disease. She then told them the actual chances of each event happening. What she discovered was fascinating. When the volunteers were given information that was better than they had hoped or expected — say, that the risk of complications in surgery was only 10 percent when they thought it was 30 percent — they adjusted their estimates toward the new figures. But if the information was worse, they tended to ignore it.

This could explain why smokers often persist with smoking despite the overwhelming evidence that it’s bad for them. If their unconscious belief is that they won’t get lung cancer, for every warning from an antismoking campaigner, their brain is giving a lot more weight to that story of the 99-year-old lady who smokes 50 cigarettes a day but is still going strong.

We need to acknowledge our tendency to incorrectly process challenging news and actively push ourselves to hear the bad as well as the good. It felt great when I stumbled across information that implied I didn’t need any serious treatment at all. When we find data that supports our hopes we appear to get a dopamine rush similar to the one we get if we eat chocolate, have sex or fall in love. But it’s often information that challenges our existing opinions or wishful desires that yields the greatest insights. I was lucky that my boyfriend alerted me to my most dopamine-drugged moments. The dangerous allure of the information we want to hear is something we need to be more vigilant about, in the medical consulting room and beyond.

My own health story had a happy ending. I was finally given a diagnosis of a rare lymphatic vessel condition, and decided that surgery made sense. Not the five-hour surgical intervention that would have left me in bed recovering for more than three months, but a much less intrusive keyhole surgery with a quick recovery. I chose a surgeon who wasn’t overly confident. I’d learned in my research that the super-confident, doctor-as-god types did not always perform well. One study of radiologists, for example, reveals that those who perform poorly on diagnostic tests are also those most confident in their diagnostic prowess.

My surgery went well. The pain subsided, the pounds gradually came back on. I am now cured.

With brain switched on and eyes wide open, we can’t always guarantee a positive outcome when it comes to a medical decision, but we can at least stack the odds in our favor.

Noreena Hertz is a professor of economics at University College London and the author of "Eyes Wide Open: How to Make Smart Decisions in a Confusing World."
