Letter | Published: 26 January 2017

A solution to the single-question crowd wisdom problem

Nature volume 541, pages 532–535 (2017)


Once considered provocative [1], the notion that the wisdom of the crowd is superior to any individual has become itself a piece of crowd wisdom, leading to speculation that online voting may soon put credentialed experts out of business [2,3]. Recent applications include political and economic forecasting [4,5], evaluating nuclear safety [6], public policy [7], the quality of chemical probes [8], and possible responses to a restless volcano [9]. Algorithms for extracting wisdom from the crowd are typically based on a democratic voting procedure. They are simple to apply and preserve the independence of personal judgment [10]. However, democratic methods have serious limitations. They are biased for shallow, lowest-common-denominator information, at the expense of novel or specialized knowledge that is not widely shared [11,12]. Adjustments based on measuring confidence do not solve this problem reliably [13]. Here we propose the following alternative to a democratic vote: select the answer that is more popular than people predict. We show that this principle yields the best answer under reasonable assumptions about voter behaviour, while the standard ‘most popular’ or ‘most confident’ principles fail under exactly those same assumptions. Like traditional voting, the principle accepts unique problems, such as panel decisions about scientific or artistic merit, and legal or historical disputes. The potential application domain is thus broader than that covered by machine learning and psychometric methods, which require data across multiple questions [14–20].
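The selection principle stated above ("select the answer that is more popular than people predict") can be sketched in a few lines of Python. This is a minimal illustrative sketch, not the authors' implementation: it assumes each respondent supplies both a vote and a predicted distribution of vote shares, and the function name, data layout, and yes/no example are all assumptions introduced here for clarity.

```python
# Sketch of the "surprisingly popular" selection principle: each respondent
# casts a vote and predicts how popular each answer will be; the winner is
# the answer whose actual vote share most exceeds its average predicted share.
from collections import Counter

def surprisingly_popular(votes, predictions):
    """votes: list of chosen answers, one per respondent.
    predictions: list of dicts, one per respondent, mapping each answer
    to that respondent's predicted vote share for it."""
    n = len(votes)
    actual = {a: c / n for a, c in Counter(votes).items()}
    answers = set(actual)
    for p in predictions:
        answers.update(p)
    # Average predicted vote share of each answer across respondents.
    predicted = {a: sum(p.get(a, 0.0) for p in predictions) / len(predictions)
                 for a in answers}
    # Select the answer whose actual popularity exceeds prediction the most.
    return max(answers, key=lambda a: actual.get(a, 0.0) - predicted[a])

# Illustrative data: most respondents vote "yes", but even "yes" voters
# predict "yes" will be overwhelmingly popular, so "no" turns out to be
# more popular than predicted and is selected.
votes = ["yes", "yes", "yes", "no", "no"]
predictions = [{"yes": 0.9, "no": 0.1}, {"yes": 0.8, "no": 0.2},
               {"yes": 0.9, "no": 0.1}, {"yes": 0.7, "no": 0.3},
               {"yes": 0.8, "no": 0.2}]
print(surprisingly_popular(votes, predictions))  # → no
```

Here "yes" receives 60% of the votes but was predicted to receive 82%, while "no" receives 40% against a predicted 18%, so the principle selects "no" even though "yes" wins a simple majority vote.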



References

1. Vox populi. Nature 75, 450–451 (1907)
2. Infotopia: How Many Minds Produce Knowledge (Oxford University Press, 2006)
3. The Wisdom of Crowds (Anchor, 2005)
4. Identifying expertise to extract the wisdom of crowds. Manage. Sci. 61, 267–280 (2014)
5. Psychological strategies for winning a geopolitical forecasting tournament. Psychol. Sci. 25, 1106–1115 (2014)
6. TU Delft expert judgment data base. Reliab. Eng. Syst. Saf. 93, 657–674 (2008)
7. Use (and abuse) of expert elicitation in support of decision making for public policy. Proc. Natl Acad. Sci. USA 111, 7176–7184 (2014)
8. A crowdsourcing evaluation of the NIH chemical probes. Nat. Chem. Biol. 5, 441–447 (2009)
9. A route to more tractable expert advice. Nature 463, 294–295 (2010)
10. How social influence can undermine the wisdom of crowd effect. Proc. Natl Acad. Sci. USA 108, 9020–9025 (2011)
11. Eliminating public knowledge biases in information-aggregation mechanisms. Manage. Sci. 50, 983–994 (2004)
12. Intuitive biases in choice versus estimation: implications for the wisdom of crowds. J. Consum. Res. 38, 1–15 (2011)
13. Tapping into the wisdom of the crowd – with confidence. Science 336, 303–304 (2012)
14. Test theory without an answer key. Psychometrika 53, 71–92 (1988)
15. Inferring expertise in knowledge and prediction ranking tasks. Top. Cogn. Sci. 4, 151–163 (2012)
16. The wisdom of the crowd in combinatorial problems. Cogn. Sci. 36, 452–470 (2012)
17. Using cognitive models to combine probability estimates. Judgm. Decis. Mak. 9, 259–273 (2014)
18. Cultural consensus theory for multiple consensus truths. J. Math. Psychol. 56, 452–469 (2012)
19. Hierarchical Bayesian modeling for test theory without an answer key. Psychometrika 80, 341–364 (2015)
20. A decision-theoretic generalization of on-line learning and an application to boosting. J. Comput. Syst. Sci. 55, 119–139 (1997)
21. Models of ecological rationality: the recognition heuristic. Psychol. Rev. 109, 75–90 (2002)
22. Experts in Uncertainty: Opinion and Subjective Probability in Science (Oxford University Press, 1991)
23. When are two heads better than one and why? Science 336, 360–362 (2012)
24. A Bayesian truth serum for subjective data. Science 306, 462–466 (2004)
25. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol. Sci. 23, 524–532 (2012)
26. The promise of prediction markets. Science 320, 877–878 (2008)
27. Automatic integration of confidence in the brain valuation signal. Nat. Neurosci. 18, 1159–1167 (2015)



Acknowledgements

We thank M. Alam, A. Huang and D. Mijovic-Prelec for help with designing and conducting Study 3, and D. Suh for help with designing and conducting Study 4b. Supported by NSF SES-0519141, the Institute for Advanced Study (Prelec), and the Intelligence Advanced Research Projects Activity (IARPA) via Department of Interior National Business Center contract number D11PC20058. The US Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright annotation thereon. The views and conclusions expressed herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of IARPA, DoI/NBC, or the US Government.

Author information


  1. Sloan School of Management, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA

    • Dražen Prelec
  2. Department of Economics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA

    • Dražen Prelec
  3. Department of Brain & Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA

    • Dražen Prelec
    •  & John McCoy
  4. Princeton Neuroscience Institute and Computer Science Department, Princeton University, Princeton, New Jersey 08544, USA

    • H. Sebastian Seung



Contributions

All authors contributed extensively to the work presented in this paper.

Competing interests

The authors declare no competing financial interests.

Corresponding author

Correspondence to Dražen Prelec.

Reviewer Information Nature thanks A. Baillon, D. Helbing and the other anonymous reviewer(s) for their contribution to the peer review of this work.


Supplementary information

1. Supplementary Information (PDF)

   This file contains Supplementary Text and Data sections 1–3; see the contents page for details.


