Everybody knows that overconfidence can be foolhardy. But a study reveals that having an overly positive self-image might confer an evolutionary advantage if the rewards outweigh the risks. See Letter p.317
Ask anyone with a driver's licence to rate their own abilities behind the wheel, and most people will report that they are above average1. The same is true for self-assessments of performance in cognitive tasks2, of attractiveness3 (by men, not by women) and of the healthiness of our behaviour4: people typically place themselves higher on the ladder than they really are. In a survey of 1 million high-school students5, a solid 70% rated themselves as above-average leaders (versus 2% who thought of themselves as below average), and a spectacular 94% of college professors possess teaching abilities that are above average — according to themselves6.
Obviously they cannot all be right, but that does not make them dysfunctional or mentally unhealthy. In fact, one way to get self-assessments to obey some minimal aggregate consistency is to restrict surveys to sufficiently depressed people7 (although this finding has been questioned8,9). Mentally healthy people blissfully suffer from what are called positive illusions: they overestimate their abilities, as well as their control over events, and they underestimate their vulnerability to risk10. Of course, one can overrate oneself too much, as do sufferers from narcissistic personality disorder or megalomania, but healthy people's estimates of their own abilities seem to start just a little above where they really are. Reporting on page 317 of this issue, Johnson and Fowler11 describe a model that might explain why this is so.
An obvious question is how overconfidence survives the process of natural selection. The prevalence of rose-tinted self-assessments suggests that it might even be adaptive to be overconfident, in contrast to schizophrenia, for instance, which is maladaptive but nonetheless persists at a moderate frequency in humans. But how can it be adaptive to misjudge how you compare with others? You would think that an incorrect assessment of one's own capabilities can induce only misguided decisions.
One suggested explanation is that there is a benefit in having others think that you're great. And as there is no better way of being a strong persuader than firmly believing in yourself, this would lead to an upward bias in how people perceive themselves compared with others12. That may lead to a mistake here and there, but the benefits of the esteem of others could outweigh that (Fig. 1).
Johnson and Fowler11 suggest a remarkable alternative explanation. According to their model, a biased self-belief can actually lead people to make the right decision, whereas an unbiased self-image would lead to a suboptimal decision. That sounds counterintuitive, but the key lies in the authors' departure from what could be called the 'naive economist's' idea of how humans arrive at decisions ('naive' because many economists are not that naive at all).
The authors' model envisages a valuable resource that two individuals can decide to claim or not. If both claim it, then they will fight over it — which is costly for both. The stronger individual will win the fight and gain access to the resource. If only one of them claims the resource, it goes to that person. If neither claims it, no one gets it.
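The pay-off structure of this claim-or-concede game can be sketched in a few lines of Python. This is only an illustrative pay-off function with invented values for the resource (`v`) and the fighting cost (`c`), not the authors' exact formulation:

```python
# A minimal sketch of the claim-or-concede game described above.
# If both claim, they fight: both pay cost c and the stronger one
# takes the resource of value v. An unopposed claim wins v for free.

def payoffs(claim1, claim2, strength1, strength2, v=2.0, c=1.0):
    """Return (payoff1, payoff2) for one round of the game."""
    if claim1 and claim2:                  # both claim -> costly fight
        win1 = strength1 > strength2
        return (v - c if win1 else -c, -c if win1 else v - c)
    if claim1:                             # unopposed claim
        return (v, 0.0)
    if claim2:
        return (0.0, v)
    return (0.0, 0.0)                      # neither claims

print(payoffs(True, True, 3.0, 1.0))   # stronger claimant wins: (1.0, -1.0)
print(payoffs(True, False, 1.0, 3.0))  # weaker wins by default: (2.0, 0.0)
```

Note that even the weaker individual pockets the full value whenever its opponent fails to claim, which is the lever the authors' argument turns on.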
Now if both contenders could simply assess the fighting strength of the other with perfect accuracy, the optimal strategy would be a no-brainer: fight if you are stronger, concede if you are weaker. But it gets interesting if the contestants have imperfect information about each other's strength. In this situation, contestants might back off because they think their opponent is stronger than is really the case. A weaker contestant could then win the resource by claiming it while the opponent backs off.
This situation can be dealt with within the realm of what economists call perfect rationality, which assumes that both parties understand all aspects of their situation, and that they correctly anticipate the odds that the other player will claim the resource. But Johnson and Fowler suggest that there is a short cut to the right decision. The short cut combines a simple heuristic — fight if you think you're stronger — with a bias. If the resource is valuable relative to the cost of fighting, then the risk of an extra battle here and there is outweighed by the gains made when otherwise unclaimed resources are won, which makes overestimating one's own fighting abilities worthwhile. If the cost of fighting is large relative to the value of the resource, then it is better to underestimate one's own strength. The behaviours predicted by the authors' model are actually richer than this summary suggests: the model also allows populations to evolve, for instance, to a stable mixture of both over- and under-confident people.
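Why an upward bias pays off when the resource is valuable relative to the cost of fighting can be seen in a direct simulation of the heuristic. The sketch below is a loose illustration under assumed parameters (uniform strengths, Gaussian assessment noise), not a reimplementation of Johnson and Fowler's model: a focal player adds a fixed `bias` to its self-assessment and claims whenever it judges itself the stronger party.

```python
import random

def expected_payoff(bias, v=10.0, c=1.0, noise=1.0, trials=20000, seed=1):
    """Average pay-off of a focal player who adds `bias` to its
    self-assessment and claims whenever (own strength + bias) exceeds
    a noisy estimate of the rival, who uses the same rule unbiased."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s1, s2 = rng.random(), rng.random()      # true strengths
        est_opp1 = s2 + rng.gauss(0, noise)      # focal's noisy view of rival
        est_opp2 = s1 + rng.gauss(0, noise)      # rival's noisy view of focal
        claim1 = s1 + bias > est_opp1
        claim2 = s2 > est_opp2
        if claim1 and claim2:
            total += (v - c) if s1 > s2 else -c  # fight: stronger wins
        elif claim1:
            total += v                           # unopposed claim
    return total / trials

print(expected_payoff(bias=0.0))  # unbiased heuristic
print(expected_payoff(bias=1.0))  # overconfident heuristic does better here
```

With these made-up numbers (a resource worth ten times the cost of fighting), the extra fights the biased player picks cost far less than the otherwise unclaimed resources it sweeps up; shrinking `v` relative to `c` reverses the ranking.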
Another evolutionary explanation is the following: overconfidence could reduce average pay-off, but top performers will still come from the group of overconfident individuals. For example, overconfidence about roulette-playing 'abilities' will lead to overall losses from this game, but the best performers will have played often. Strong selection — as in 'winner takes all' — should favour overconfidence.
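This 'winner takes all' argument is easy to illustrate with a quick simulation. The sketch below uses invented parameters (even-money roulette bets with an 18/38 chance of winning) purely to make the point that higher variance buys the top ranks even while it lowers the mean:

```python
import random

def roulette_totals(n_players, bets_per_player, p_win=18/38, rng=None):
    """Total winnings for each of n_players making even-money bets."""
    rng = rng or random.Random(0)
    return [sum(1 if rng.random() < p_win else -1
                for _ in range(bets_per_player))
            for _ in range(n_players)]

rng = random.Random(42)
cautious = roulette_totals(1000, 5, rng=rng)     # play rarely
confident = roulette_totals(1000, 200, rng=rng)  # overconfident: play a lot

# Overconfident players do worse on average...
print(sum(confident) / 1000 < sum(cautious) / 1000)   # True
# ...yet the single best performer is an overconfident one.
print(max(confident) > max(cautious))                  # True
```

Under strong selection on rank rather than on average pay-off, this kind of variance-seeking behaviour is exactly what evolution would reward.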
Johnson and Fowler's study11 prompts a variety of interesting questions. The 'winning strategy' (for low fighting costs) can be wired into the brain in two ways. The first involves a simple heuristic plus overconfidence: only fight when you think you are stronger, but overestimate your strength. The second way involves perfect rationality without overconfidence: given some uncertainty, the winning strategy can be to fight opponents even if they seem slightly stronger than you. Future empirical and theoretical studies might help to decide which of these two describes us best.
It would also be interesting to establish a link between the authors' findings and overconfidence in trading behaviour13, the willingness to buy overly complex financial products (which are thought to have led to the current crisis in the banking system), political decisions that lead to war14, and the evolution of fighting behaviour in animals15. Given that 94% of college professors rate themselves as above average, there should be enough overconfidence around to tackle all the natural follow-up questions.
1. Svenson, O. Acta Psychol. 47, 143–148 (1981).
2. Kruger, J. & Dunning, D. J. Pers. Soc. Psychol. 77, 1121–1134 (1999).
3. Gabriel, M. T., Critelli, J. W. & Ee, J. S. J. Pers. 62, 143–155 (1994).
4. Hoorens, V. & Harris, P. Psychol. Health 13, 451–466 (1998).
5. Alicke, M. D. & Govorun, O. in The Self in Social Judgment (eds Alicke, M. D., Dunning, D. A. & Krueger, J. I.) 85–106 (Psychology, 2005).
6. Cross, P. New Directions Higher Educ. 17, 1–15 (1977).
7. Taylor, S. E. & Brown, J. D. Psychol. Bull. 103, 193–210 (1988).
8. Shedler, J. et al. Am. Psychol. 48, 1117–1131 (1993).
9. Colvin, C. R. & Block, J. Psychol. Bull. 116, 3–20 (1994).
10. Sharot, T. The Optimism Bias: A Tour of the Irrationally Positive Brain (Pantheon, 2011).
11. Johnson, D. D. P. & Fowler, J. H. Nature 477, 317–320 (2011).
12. Trivers, R. Deceit and Self-Deception: Fooling Yourself the Better to Fool Others (Allen Lane, 2011).
13. Barber, B. M. & Odean, T. Q. J. Econ. 116, 261–292 (2001).
14. Johnson, D. D. P. Overconfidence and War: The Havoc and Glory of Positive Illusions (Harvard Univ. Press, 2004).
15. Enquist, M. & Leimar, O. J. Theor. Biol. 127, 187–205 (1987).