Overconfidence in a Complex World: The Dunning-Kruger Effect
In one of the greatest works of science fiction, the Foundation series, Isaac Asimov described a technologically advanced and prosperous galactic empire entering a period of severe economic, scientific and political degeneration. His futuristic reimagining of the Dark Ages raised the question, ‘could this ever happen to our world?’
So lofty a subject requires far more thorough treatment than I could offer here. But without being drawn one way or the other, I note that the growing hostility and derision directed towards experts provides ammunition for those who argue that it could.
There is ample evidence of this anti-expert trend: a Pew Research Center study published in November 2023, for example, found that US public trust in scientists has been in steady decline since 2019. In his book ‘The Death of Expertise’, Thomas Nichols argues that we have gone beyond a healthy scepticism of experts and now seem to ‘actively resent them’. He despairs that we have reached a farcical position where ‘we’re proud of not knowing things’.
Incidentally, Keir Starmer’s decision to appoint a smattering of experts from outside the world of politics into his government represents a notable, and commendable, bucking of this trend. But the direction of travel more broadly remains clear; Starmer is the exception, not the rule.
Whilst there are myriad contributing factors to this trend – not least the perceived failure of experts during the economic crisis of 2008 and the Covid-19 pandemic – an experiment conducted in 1999 by Justin Kruger and David Dunning provides insight into a mechanism that may be driving it at a deep psychological level.
The pair asked participants to complete a range of competency-based tasks, for example a logical reasoning test, and then to predict how they had scored relative to the other participants. Their seminal finding was that those most likely to overestimate their ability were those who performed worst, while those most likely to underestimate their ability were those who performed best.
They had given empirical support to the line, sometimes attributed to Bertrand Russell, that ‘fools and fanatics are always so certain of themselves, and wiser people so full of doubts’. Dunning and Kruger argued that poor performers overstate their ability because ‘the skills that engender competence in a particular domain are often the very same skills necessary to evaluate competence in that domain’. In other words, poor performers don’t know what they don’t know.
This phenomenon, termed the Dunning-Kruger effect, has the capacity to interact dangerously with the rapid pace of scientific and technological innovation that is making our world increasingly complex and esoteric. As the issues in society become more complicated, we are all, in effect, becoming less competent. We find ourselves in a world that is drifting ever further from the one our brains evolved to function in.
It is a non-starter to expect our knowledge to keep pace with developments in all of the fields that impact society today. The only rational response is to increase our reliance on experts in their respective fields. But the Dunning-Kruger effect suggests that as we become less competent, we develop an increasingly misplaced sense of confidence.
There also seems to be an innate frustration in society regarding our dwindling grasp of the world we live in. This may be why the reaction to experts, who act as an almost visceral reminder of how little we understand, is often not just neglect or indifference, but overt anger.
Worryingly, given that the world is likely to continue to get more complicated, the issue of the Dunning-Kruger effect may become more pronounced. Indeed, the only way for people to re-establish their intellectual grasp of the world would be for an Asimov-style Dark Age to wind back the clock.
However, as is often the case with psychological tendencies, we have the capacity to counter this effect by making a conscious effort to do so. Indeed, simply being aware of the issue is likely to make one more humble and circumspect when dealing in unfamiliar territory. But I also propose three recommendations.
Firstly, actively seeking and embracing objective feedback. Instead of operating within protective echo chambers, people should value critical feedback and view it as an opportunity for growth rather than a psychological threat.
Secondly, becoming a lifelong learner. With powerful search engines at our fingertips, there is a tendency to underestimate the value of developing an internal understanding of the world. Many people, for example, rely on online maps to such an extent that they can hardly navigate the area they live in. It is important that people recognise both the pragmatic benefits and the intrinsic value of the pursuit of knowledge.
Thirdly, becoming comfortable with saying ‘I don’t know’. Today we are often encouraged to present strong opinions; putting one’s stake firmly in the ground is lauded, whilst admitting uncertainty is taken to indicate a lack of confidence. But on this I am reminded of a lecturer who once told us that ‘whilst in politics everyone’s opinion matters, in science no one’s opinion matters’. His view was that there must be matters on which not everyone is entitled to advance an opinion. On many issues we would do well to follow the adage, ‘better to remain silent and be thought a fool than to speak out and remove all doubt’.