We only use 10% of our brains. Advertising executives know it, self-help authors know it, and so do many ordinary people. Indeed, the persistence of this idea seems to be a global phenomenon; according to a recent survey in Brazil, about half the college-educated people in Rio de Janeiro believe it to be true1. Academics may be tempted to wonder whether those who continue to spread this stubborn myth really do use less of their brains than the rest of us. A more optimistic view, however, is that brain myths reflect people's deep interest in understanding how their own brains work, and that beyond the misconceptions lies an opportunity to convert enthusiasm into knowledge.

The Brazilian survey, one of the most detailed from any country, was conducted by Suzana Herculano-Houzel of the Rio Museum of Life Science, in an attempt to target the museum's exhibits to the local community. Some facts were widely known—that brain regions are specialized for different functions, and that emotions occur in the brain rather than the heart. Similarly, almost all respondents knew that addictive drugs act on the brain. Other ideas—that language is not hereditary, for instance—seemed to depend on education: they were understood by respondents with a college or graduate education, but not by those still in high school. About half of the participants did not know that learning occurs through changes in brain connections. In addition to the 10% myth, widespread misconceptions included beliefs that the brain has a single memory system, that coma is similar to sleep, and that damaged brains do not show functional reorganization. One wonders, of course, whether similar results would be obtained in other countries, but it seems likely that many of the misconceptions prevalent in Rio will be familiar to neuroscientists throughout the world.

Where do brain myths come from, and why are they so persistent? The origin of the 10% claim remains uncertain, despite considerable research2. It is often attributed to William James, who expressed a similar idea in a 1906 speech to the American Psychological Association: “Compared to what we ought to be, we are only half awake. We are making use of only a small part of our physical and mental resources.” But the 10% number has not been found in any of James' writings. Alternatively, the myth may have originated from an early misinterpretation of interneurons as undeveloped neurons, leading to the speculation that they might be a reserve pool for neural replacement later in life. Another potential source of this myth is the difficulty encountered by early neurophysiologists, notably Karl Lashley, in identifying functional deficits caused by lesions of particular brain regions. Indeed, the term 'silent cortex' was once commonly used to describe regions without a clear sensory or motor function, and this could easily have been misinterpreted to mean 'unused cortex'.

Whatever its origin, the 10% myth has been a staple of the self-help movement for at least a century. There is an obvious appeal to the idea that one's own brain is a vast reservoir of untapped potential, and those who make a living by selling the promise of self-improvement have not hesitated to use this myth to add a veneer of scientific plausibility to their claims. Similarly, the 10% myth is a favorite justification for paranormal beliefs—psychic powers being located in the 90% of the brain whose function remains beyond the reach of science.

The exploitation by charlatans of popular misconceptions about the brain is of course not new. Phrenology, for example (which Ambrose Bierce memorably defined as “the science of picking the pocket through the scalp”), was spread through Europe and the United States in the early 19th century largely by itinerant lecturers who claimed to read people's personalities from the bumps on their heads3. Our predecessors' enthusiasm for phrenology may seem laughable today, but many current myths will probably seem equally absurd to future generations. One of the most widespread is the idea, promoted by journalists, educators and even politicians, that early environmental experiences, such as listening to music, can make babies smarter by promoting synapse formation4. In some versions of this myth, listening to classical music or more active forms of 'brain exercise' can make adults smarter as well. Another persistent neuroscience myth, the division between the logical left brain and the emotional right brain, is also kept alive by the self-help literature, despite considerable scientific criticism. The idea of latent talents is prominent in these accounts, with the dormant right hemisphere cast as the source of untapped creative powers.

Artemus Ward wrote (and many others have since repeated), “It ain't so much the things we don't know that get us in trouble. It's the things we know that ain't so.” Most people would consider brain myths a relatively harmless exception to this rule, deserving amusement rather than censure. On the other hand, their persistence reflects an interest in understanding the brain that could give neuroscientists an opportunity to engage and educate the public about the field. So the next time the guy sitting next to you on an airplane says, “Hey, you study the brain? I've heard we only use 10% of our brains”, instead of rolling your eyes, consider telling him some true stories about how the brain works.