Coronavirus: are psychological biases causing politicians to make bad choices?

Crises rarely see human decision-making operating at its best. Politicians and policymakers have to make important decisions in unfamiliar circumstances, with vast gaps in the available information, and all in the full glare of public scrutiny. The psychology of decision making doesn’t just tell us a lot about the potential pitfalls in our own thinking – it alerts us to ways in which some of the world’s governments may go astray.

The power of precedent

Our minds tackle the future by referring to the past. The question of what to think or do is mostly answered by asking: what do I (or other people like me) normally think and do? This tends to make all of us, politicians included, assume nothing too dramatic is happening in the early stages of an epidemic. It also encourages an initial tendency to carry on with business as usual – at least until the crisis becomes visible.

We don’t like to be disturbed from our comfortable status quo, so we tend to ignore, downplay or simply fail to collect information that might conflict with this picture. Many governments initially denied the existence of COVID-19, attempted to silence those raising the alarm, or took few steps to search for cases. Many may still be downplaying the severity of the crisis.

As the crisis gets going, we search for analogies in past experience of similar-looking crises. Perhaps COVID-19 is like seasonal flu, something we take no drastic action to cope with. Perhaps it is like the 1918 flu pandemic, with its particularly deadly second peak. Or is it more like Sars (another coronavirus), which infected about 8,000 people in 2003 before being stamped out by aggressive infection control?

The power of stories

We reason about the world by constructing narratives. And the choice of narrative will be crucial. Suppose we think we are replaying the 1918 flu pandemic. Then we may reason that resistance is futile – the only way the pandemic will burn out is through most of the population becoming infected, when we will attain so-called herd immunity. So the goal of policy is then to spread infections as evenly as possible across time.

The narrative is one of stoical fatalism – we must accept a large death toll, especially among the elderly and vulnerable, but manage it as best we can. The possible figures are sobering: if herd immunity requires 60% to 80% of the population to be infected, and assuming a very conservative death rate of 1 in 200, the death toll among the UK's 66 million people, for example, would be about 200,000. Scaled up to the more than 7 billion people on the planet, the death toll would be over 20 million – and probably far higher.
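The arithmetic behind these figures is easy to check. Below is a minimal sketch of the back-of-envelope calculation, using only the assumptions stated above (a 60% to 80% infected share and a 1-in-200 death rate); the population figures are the rounded ones quoted in the text.

```python
# Back-of-envelope death-toll estimate under the herd-immunity narrative.
# Assumptions, taken from the text: 60-80% of the population infected,
# and a deliberately conservative infection fatality rate of 1 in 200.

def herd_immunity_deaths(population, infected_fraction, fatality_rate=1/200):
    """Estimated deaths if infected_fraction of the population is infected."""
    return population * infected_fraction * fatality_rate

uk = 66_000_000        # approximate UK population
world = 7_000_000_000  # "more than 7 billion people on the planet"

for share in (0.6, 0.8):
    print(f"{share:.0%} infected: "
          f"UK ~{herd_immunity_deaths(uk, share):,.0f} deaths, "
          f"world ~{herd_immunity_deaths(world, share):,.0f} deaths")
```

At the lower 60% share this gives roughly 198,000 UK deaths and 21 million worldwide, matching the rounded figures above; the 80% share, or any death rate above the deliberately conservative 1 in 200, pushes both totals higher still.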

If, instead, we think we are replaying the Sars outbreak, albeit with a far more infectious virus, then the narrative is very different: with suitably drastic actions (social distancing, isolation, hand-washing, intensive testing, contact tracing and more), the infection can be beaten back. This is the narrative that has driven China and South Korea to radically reduce their numbers of cases.

Of course, on the first narrative, this may represent only a temporary reprieve – perhaps the disease will surge again, and perhaps be even more deadly than before. Or perhaps herculean national and global efforts can nonetheless stamp it out, or more likely hold it at bay until a vaccine or cure is developed.

One-model thinking

The psychologist Philip Johnson-Laird once memorably remarked that the tendency to see only one possible model of a highly ambiguous and uncertain situation is perhaps the most pervasive and important error in human thinking. Looking at the famous duck-rabbit image, we see either a duck or a rabbit, but never both at once.

What do you see? A duck or a rabbit? Wikimedia Commons

Similarly, it is hard to wrench ourselves from our current narrative (say, stoical delay) and switch to another (say, aggressive countermeasures). This is particularly hard for politicians and policymakers, who are often accused of inconsistency, even when reacting to changed circumstances or evidence.

Overconfidence all round

A rigid focus on our own model of the world leads all of us – citizens, scientists, governments – to be overconfident. We see the duck-rabbit as a rabbit and are astonished to hear it quack. Indeed, we may even deny that it was a quack and stick to the “rabbit” theory.

But making good decisions requires accepting that our narratives are incomplete and quite possibly plain wrong. Suppose, for example, that aggressive countermeasures can work, as they appear to have done in China and South Korea; if so, many millions of lives might be saved across the world. If, instead, those measures turn out to be futile, the cost is significant and perhaps unnecessary economic disruption (though surely a far lesser evil). Whatever we think is the right story, we are almost certainly more sure than we should be, whether we are politicians, epidemiologists or concerned citizens.

The natural human tendency is, then, to ask first: what is the one true story? And second, assuming this is the right story, what is the best thing to do? For example, if we think resistance is futile, we will recommend against early, aggressive action. If we believe that people can't spread COVID-19 while asymptomatic, we may see no reason to cancel mass events.

This is very dangerous. In extreme uncertainty, we need to take actions that are robust, that work pretty well, even if our narrative turns out to be wrong. Sometimes, just buying ourselves time may be vital, while we find out more. That would suggest clamping down on the virus as much as possible, as a precautionary first step. Perhaps, for some reason, this has its own dangers – but surely it is at least the right starting point for debate.
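To make "robust" concrete, here is a hypothetical sketch of the kind of reasoning involved, in the spirit of a maximin decision rule: score each policy under each candidate narrative, then prefer the policy whose worst case is least bad, rather than the one that looks best under your favourite story. The policies, narratives and loss numbers below are invented purely for illustration and come from no epidemiological model.

```python
# Hypothetical illustration of a robust (maximin) choice under model uncertainty.
# Losses are invented, unitless scores for illustration only; lower is better.

losses = {
    # policy:                 {narrative: loss under that narrative}
    "business as usual":      {"flu-like, resistance futile": 60,
                               "Sars-like, controllable": 100},
    "aggressive suppression": {"flu-like, resistance futile": 40,
                               "Sars-like, controllable": 20},
}

# A robust decision-maker compares worst cases across narratives,
# instead of betting everything on the single most plausible-seeming story.
robust_choice = min(losses, key=lambda policy: max(losses[policy].values()))
print(robust_choice)  # -> "aggressive suppression"
```

Under these invented numbers, aggressive suppression is the robust choice: its worst case is far better than the worst case of carrying on as usual, even for someone who privately finds the fatalistic narrative more plausible.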

But when it comes to making decisions, the only real counter to our psychological biases is transparency – then we can fix the holes in each other’s thinking. Governments across the world need now, more than ever, to explain their assumptions, their plans, and what they expect may happen, however alarming. In short, governments must open their thinking for public scrutiny and critical debate – both to help make the right decisions and to get us, the people, to back them.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Nick Chater receives funding from EPSRC and ESRC. He is Professor of Behavioural Science at Warwick Business School, a member of the UK Committee on Climate Change, and a director of Decision Technology, Ltd.