One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision.
Every successful leader has one thing in common: they trust their judgement. And why not? Their intuitions must have proved their worth; otherwise they wouldn’t be successful, right?
Well, maybe not. Psychologist Daniel Kahneman suggests that “the amount of success it takes for leaders to become overconfident isn’t terribly large.” Kahneman’s paper, co-authored with Gary Klein, Conditions for Intuitive Expertise: A Failure to Disagree, argues that overconfidence is at the root of most poor decisions.
In this interview, the two psychologists discuss their thoughts about how to improve judgements.
Kahneman argues “Overconfidence is a powerful source of illusions, primarily determined by the quality and coherence of the story that you can construct, not by its validity.” The stories we tell ourselves are often post-hoc rationalisations constructed after the fact: results went up because of the excellent decisions I made. There may be a whole host of alternative explanations, but the most compelling narrative wins out.
But why are we so seduced by the narrative of overconfidence?
Most of the time we are almost pathologically averse to uncertainty. We hate not knowing and would prefer to be wrong than unsure. Klein points out that we like leaders who “are good at making others feel confident in their judgment, even if there’s no strong basis for the judgment.” Kahneman tells us that very often leaders “achieve a reputation for great successes when in fact all they have done is take chances that reasonable people wouldn’t take.”
Let that sink in for a moment. Leaders seem to possess two important qualities: firstly they are lucky risk takers, and secondly they make the rest of us feel good about taking these risks. Our desire for certainty overrules our need to make rational, informed decisions, even when the stakes are high.
And here’s the kicker: lucky risk takers use hindsight to reinforce their feeling that their gut is very wise. Hindsight creates a powerful illusion that the situation was clearer than it really was and that the outcome was always certain. We are relieved to have a strong leader to cut through uncertainty and make bold decisions. After all, the risks have paid off, right? Well, yes, but for how long? Unless you believe in the supernatural, luck is just a matter of probability. Sooner or later even the luckiest guesser is going to be wrong.
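The arithmetic behind this is straightforward. If every bold call were an independent 50/50 gamble (a purely illustrative assumption — real decisions are not coin flips), the chance of an unbroken winning streak collapses quickly with each new call. A minimal sketch:

```python
# Probability that a "lucky" leader's streak of bold calls stays unbroken,
# assuming each call is an independent gamble with a fixed success rate.
# The 50% rate is an illustrative assumption, not a figure from the interview.
def streak_probability(n_calls: int, p_success: float = 0.5) -> float:
    return p_success ** n_calls

for n in (1, 5, 10):
    print(f"{n:2d} calls: {streak_probability(n):.4f}")
```

Ten lucky calls in a row happens, on these toy numbers, to roughly one leader in a thousand — common enough across a large population of risk takers, and vanishingly unlikely to continue.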
This intolerance of uncertainty favours the overconfident. As I wrote in my new book:
Certainty seems to indicate a lack of nuance and sophistication in our thinking. The Dunning-Kruger effect is the finding that the poorest performers are the least aware of their own incompetence. Or put more crudely: stupid people are too stupid to recognise their own stupidity.
After comparing participants’ test results with their self-assessment of their performance in such diverse fields as sense of humour, grammar and logic, Dunning and Kruger proposed that, for a given skill, the incompetent not only fail to recognise their own lack of skill but also fail to recognise genuine skill in others. Encouragingly, they also found that if incompetents are given training in an area at which they are identified as being unskilled, they are able to recognise and acknowledge their own previous lack of skill.[i] As Dunning observes, “If you’re incompetent, you can’t know you’re incompetent… the skills you need to produce a right answer are exactly the skills you need to recognize what a right answer is.”[ii]
This is not to say that all leaders are foolish; clearly this is not the case. But it does suggest that most leaders will have a tendency towards overconfidence, and this can easily make them blind to their own biases and prejudices.
Klein and Kahneman suggest a suite of tools for overcoming overconfidence and exploiting the advantages of maintaining uncertainty:
1. Conducting a premortem
If we want to overcome the pernicious effects of overconfidence we should think ourselves into the following scenario: Imagine you are one year down the line from introducing the new policy or project and it has gone spectacularly and horribly wrong. Spend five minutes detailing all the things that contributed to the project’s failure. This process unleashes our imaginations to work exactly where they should: on the uncontrollable and the external. Kahneman says, “in general, doing a premortem on a plan that is about to be adopted won’t cause it to be abandoned. But it will probably be tweaked in ways that everybody will recognize as beneficial. So the premortem is a low-cost, high-payoff kind of thing.”
2. Using checklists
Klein admits that he’s not an uncritical advocate of checklists, warning that complex and ambiguous situations require greater levels of expertise:
Checklists are about if/then statements. The checklist tells you the “then” but you need expertise to determine the “if”—has the condition been satisfied? In a dynamic, ambiguous environment, this requires judgment, and it’s hard to put that into checklists.
Kahneman disagrees, arguing that these are precisely the situations where checklists are most needed: “The checklist doesn’t guarantee that you won’t make errors when the situation is uncertain. But it may prevent you from being overconfident.” He suggests that the most useful checklists force us to think about process rather than content: not so much, Have we ticked off a list of things to include? More a case of, Have we considered all the ways we might be making a mistake?
Items to include on such a list might be:
- What is the quality and independence of our information?
- Is it coming from multiple sources or just one source that’s being regurgitated in different ways?
- Is there a possibility of groupthink?
- Does the leader have an opinion that seems to be influencing others?
3. Postponing consensus
Putting off the moment where everyone, however reluctantly or enthusiastically, agrees on a course of action might be a good thing. As Kahneman says, “Fragmenting problems and keeping judgments independent helps decorrelate errors of judgment.”
When people are asked to estimate the number of coins in a jar, the accuracy of the average estimate goes up with the number of guesses, but only as long as each guesser is ignorant of every other guess. When the guessing is public, the first guess anchors all subsequent guesses and accuracy doesn’t improve.
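The anchoring effect can be illustrated with a toy simulation. The jar size, the noise in each private guess, and the 80/20 anchoring weight below are all invented for illustration — they are not parameters from any study Kahneman cites:

```python
import random

TRUE_COUNT = 1000  # hypothetical number of coins in the jar

def independent_guesses(n):
    # Each guesser errs independently around the true count (toy noise model).
    return [TRUE_COUNT + random.gauss(0, 300) for _ in range(n)]

def anchored_guesses(n):
    # Public guessing: everyone shades heavily towards the first guess,
    # mixing in only a little of their own private estimate.
    first = TRUE_COUNT + random.gauss(0, 300)
    return [0.8 * first + 0.2 * (TRUE_COUNT + random.gauss(0, 300))
            for _ in range(n)]

for label, guesses in [("independent", independent_guesses(200)),
                       ("anchored", anchored_guesses(200))]:
    avg = sum(guesses) / len(guesses)
    print(f"{label:12s} average error: {abs(avg - TRUE_COUNT):.1f}")
```

Run repeatedly, the average of independent guesses lands far closer to the true count than the average of anchored guesses, whose collective error is dominated by whatever the first guess happened to be.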
To put off the moment where decision making is anchored on a particular course of action, Kahneman suggests that everyone involved in the decision-making process should write out their proposal before a meeting and then each idea can be discussed on its merits.
4. Educating gossip
Kahneman advises that the only reliable way to confront our biases is to learn to critique other people. Using the language of psychology – terms like anchoring, availability and overconfidence – when discussing our plans and processes might elevate institutional gossip to a level where we might learn from each other. But don’t underestimate how hard it might be to encourage this kind of culture. Kahneman says, “Leaders know that any procedure they put in place is going to cause their judgment to be questioned. And whether they’re fully aware of it or not, they’re really not in the market to have their decisions and choices questioned.”
5. Improving meetings
Kahneman’s best advice for improving meetings is to make them shorter. He suggests that people should be given as much information as possible in advance and asked to arrive at an independent judgement before the meeting begins.
There is a caveat to this: if everyone knows the leader’s preference in advance, this will influence the way in which decisions are researched and data is collected. A wise leader will go to great lengths to prevent others from trying to confirm her biases. As Kahneman puts it, “You want to create the possibility that people can discover that an idea is a lousy one early in the game, before the whole machinery is committed to it.”
Klein issues the following warning:
The tendency to marginalize people who disagree with you at meetings. There’s too much intolerance for challenge. As a leader, you can say the right things—for instance, everybody should share their opinions. But people are too smart to do that, because it’s risky. So when people raise an idea that doesn’t make sense to you as a leader, rather than ask what’s wrong with them, you should be curious about why they’re taking the position. Curiosity is a counterforce for contempt when people are making unpopular statements.
All this can be boiled down to the exhortation, seek to explore rather than confirm your biases.
[i] Kruger, Justin; Dunning, David (1999). “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments”. Journal of Personality and Social Psychology 77 (6): 1121–34. doi:10.1037/0022-
[ii] Interview with David Dunning, The New York Times, 20 June 2010.