Washington DC

A scan of Terri Schiavo's brain (right) shows just how damaged it was compared with a normal one. Credit: P. R. WOLPE

Images have power — and Paul Root Wolpe knows just how persuasive they can be. A bioethicist at the University of Pennsylvania, Wolpe found himself confronted by angry members of the public during the debate over the fate of brain-damaged patient Terri Schiavo earlier this year.

People were outraged that Schiavo's husband wanted to remove her feeding tube, Wolpe recalls. But when he showed them computed tomography scans that compared Schiavo's atrophied brain with a normal brain, they changed their minds. “People paused, and they said: ‘Maybe Terri isn't there’,” Wolpe explains.

Last week, scientists and ethicists met in Washington DC to discuss the growing power of brain images to sway public opinion in areas such as medicine, crime and human rights. At the workshop, organized by a consortium of groups including the Dana Foundation, which funds research in neuroscience, they warned that neuroscientists must learn to use such emotive images responsibly, and to be more honest about the limitations of their work.

Researchers are flocking to the field of brain imaging, and they are probing brain activity not only in diseases such as depression, but also in basic processes such as consciousness, decision-making and deception. Imaging studies are almost ten times as popular now as they were a decade ago (J. Illes et al. Nature Neurosci. 6, 205; 2003).

Mind games: these composite scans show areas of the brain that are activated when someone doesn't tell the truth — but should such images form the basis of a lie detector? Credit: KRT/NEWSCOM

The images have been a hit with the public too, because they are easier to understand than complex information about neurons, synapses and neural architecture. But this is causing concern about how the pictures might be used.

“These images are quite seductive,” says Marcus Raichle, one of the pioneers of brain imaging at the Washington University School of Medicine in St Louis, Missouri. “It's intuitively easy to relate to a picture, and that's both good and bad.”

Because brain images seem to tell a cut-and-dried story, the public often accepts research findings long before the investigators themselves feel they really understand what the results mean. Daniel Schacter at Harvard University, for instance, has used functional magnetic resonance imaging (fMRI) to image the brain regions that are activated when we remember things. He found that when we have ‘false memories’ — when we do our best to tell the truth but remember incorrectly — the images show different patterns of brain activation from those seen for true memories (S. D. Slotnick and D. L. Schacter Nature Neurosci. 7, 664–672; 2004).

Schacter says his work is far from definitive, but that probably won't stop criminal investigators adopting the technique as a ‘lie detector’ test. “There's enormous public pressure to use these things,” adds Wolpe.

Neuroimaging studies also raise issues relating to fundamental human freedoms. For instance, fMRI research supports the idea that we use both emotions and reason when we make decisions. But people with certain kinds of brain damage rely almost exclusively on reason — and often make what most would see as disastrous errors of judgment. So images of decision-making pathways could be used to label certain people as sociopaths.

But this could violate long-held societal standards, such as the ‘freedom of thought’ guaranteed in the First Amendment to the US Constitution, says Stacey Tovino, a lawyer at the University of Houston Law Center in Texas. “The privacy issues are very difficult,” she notes.

The issues related to brain imaging are part of a larger debate about the implications of neuroscience research (see Nature 433, 185; 2005). In 2002, the Dana Foundation sponsored a meeting that is widely regarded as having given birth to the field of neuroethics. Since then, the science has advanced rapidly — and the ethical problems have grown just as fast.

Everyone at last week's meeting seemed to agree that neuroscientists now have a responsibility to explain the limitations of brain-imaging studies to the public. For example, most non-scientists don't understand that the images in research publications are generally composites of data from many individuals, not from a single person. Or that the images are not photos of what's actually happening in the brain — they are highly processed representations of data.

Some delegates went further, arguing that scientists have an additional obligation to be more rigorous in their research. Many neuroimaging studies involve only a small number of subjects, and their findings are often not replicated by other investigators.

“We need to have a new sense of accountability and reproducibility,” says Judy Illes, director of the neuroethics programme at Stanford University. After all, she argues, with the public so eager for information about the brain, the least scientists can do is deliver the most accurate picture they can.