Editorial

Nature 441, 907 (22 June 2006) | doi:10.1038/441907a; Published online 21 June 2006

Neuroethics needed

Researchers should speak out on claims made on behalf of their science.

How would you feel if you had to tell the truth, and nothing but the truth, for a day? Did your wife really look so good in that dress? Were you honestly late just because the traffic was bad? Did you actually do all the necessary controls for that experiment?

Society would be a different place if all our lies, however trivial, were abandoned in favour of blunt honesty. In some areas, such as criminal prosecution, this might be advantageous. In others, where little white lies help life run smoothly, knowing all the facts might be uncomfortable.

These thoughts are brought to the fore by the arrival of two US start-up companies, No Lie MRI and Cephos, which are about to offer functional magnetic resonance imaging (fMRI) brain scans in order to detect lies. The companies, which plan to launch their services later this year, say their goal is to help exonerate the innocent, and to replace the widely discredited polygraph machines used by US government agencies for screening their staff (see page 918).

Many neuroscientists think the claims being made for fMRI are overblown. They warn that there is scant evidence that it can reliably distinguish a lie from the truth in any individual case, especially in the real-life, high-stakes situations in which it might be applied.

Ethicists worry even more about what would happen if one day the scanning technique could be used to accurately discern people's inner secrets. Society would, for the first time, hold in its hands a reliable tool with which to finger deceit, and this could have a profound impact on individual privacy and human rights.

It is too early to tell if fMRI will ever be able to pinpoint liars in anything but a few, tightly controlled circumstances. Studies so far have been conducted under such conditions, and do not reflect the many types of lies and the situations they may be used to investigate. Some will argue that the mere threat of an accurate lie test could be used to extract valuable information — the same argument, in essence, that led to the use of polygraphs in the United States.

The question of how far such approaches should be taken is just one of a number of pressing ethical issues raised by the rapid recent progress of neuroscience. So it is appropriate that last month, a group of prominent scientists, ethicists and lawyers gathered at the Asilomar conference centre in California to found the Neuroethics Society, which will address these issues.

This effort is to be applauded, but there is a lot of work to do before it can engage a rapidly expanding neuroscience community that has been relatively slow to recognize its own responsibility to address potential abuses of knowledge.

Geneticists held their own landmark Asilomar meeting more than thirty years ago to discuss the possible regulation of recombinant DNA. Today, the ethical and social repercussions of genetics are a standard component of undergraduate education. But ethics is a long way from attaining corresponding status within neuroscience. One prominent bioethicist reports that his own lecture on neuroethics was cancelled, falling victim to timetable pressure.

Neuroscientists have reasons for their reluctance to wade into ethics. The questions raised are likely to be open-ended, and their arrival in the world outside the laboratory may be some way off. Whereas a genetic test can say something definitive about a particular genetic make-up, and therefore about predisposition to disease, for example, an fMRI scan is just an indirect measure of neural activity based on oxygenated blood flow. For now, neuroscientists have only the most basic grasp of what this says about how the brain processes information.

Even so, the arrival of No Lie MRI and Cephos suggests that fMRI is entering the 'real world', whether neuroscientists consider it ready or not. The community needs to broadcast its doubts about this situation from the rooftops — and prepare for a prolonged, complex and occasionally frustrating engagement with the public on the ethical ramifications of its work.
