Several years ago, Tom Wolfe wrote an essay about the rise of cognitive neuroscience, which he entitled “Sorry, but your soul just died”. Although reports of the soul's death may be premature, Wolfe was surely right in suggesting that advances in understanding how our brains work will pose an unprecedented challenge to our sense of who we are.

Last month, some of these challenges were explored at an unusual meeting hosted by the New York Academy of Sciences. “The Self: From Soul to Brain” brought together a range of experts in neuroscience, psychology, philosophy, theology and anthropology to discuss the extent to which our sense of self can be explained in the language of neuroscience. A few years ago, a conference like this would have been considered a fringe event, but the questions it sought to address are now firmly within the mainstream of scientific inquiry.

Cognitive neuroscience is of course in its infancy, but there can be no disputing its relevance to understanding our sense of self; as philosopher Daniel Dennett memorably put it, a brain transplant is the only operation for which it's better to be the donor. Identifying the brain as the place to look is a good start, but the field is still a very long way from being able to answer the deep questions of human existence. Perhaps the most significant general conclusion to be drawn from the current state of knowledge is that our intuitions are a very poor guide to how our minds actually work. Most of us share a strong intuition that our own self is an irreducible whole, that there must be some place in our brains where our perceptions and thoughts all come together and where our future actions are decided. Yet this view is now known to be incorrect: different mental processes are mediated by different brain regions, and there is nothing to suggest the existence of any central controller. The study of split-brain patients reinforces the point by showing that perception and action can be mediated independently by the two hemispheres. If, for example, a different visual stimulus is presented to each hemisphere, the right hemisphere will direct an appropriate behavioral response to the stimulus it sees, while the left hemisphere, which controls language but has no access to that stimulus, will confabulate a plausible explanation for the behavior based on the unrelated stimulus to which it does have access. Clearly, the idea of a single locus of perception and decision-making is untenable.

If there is no single brain structure that embodies the self, how can the field progress, and indeed what questions should it be asking? Although there is no consensus yet, some critical issues are emerging, including the neural basis of perception and decision-making, the origin of our sense of free will, the storage and retrieval of autobiographical memories, the origin of ethical precepts, and the basis of self-awareness, including the ability to explain one's own actions. One particularly promising approach to this last question, highlighted by the condition of autism, may be to explore the basis of social cognition. Autism has been called 'mind-blindness', and is often characterized as a lack of insight into the minds of other people. But autistic people also show a lack of insight into their own minds, and it seems plausible that the two deficits are causally linked; understanding one's own mind may also be an important source of knowledge about other people. It is possible to ask what brain regions are specifically activated by tasks that require such insights, and recent evidence suggests that self-knowledge and knowledge of others both involve at least some of the same regions, notably the anterior cingulate cortex.

These questions are of more than purely intellectual interest. As discussed by Martha Farah in a commentary on page 1123 of this issue, our growing ability to understand and manipulate our own brains is creating a wide range of new ethical dilemmas that will require practical answers. How society resolves these questions will depend in large part on the public's conception of neuroscience and of its ability to address questions about self and soul. There is an obvious parallel with the debate over stem cells and cloning. Among the general public, it is widely believed that the soul arises at conception, despite the obvious counter-arguments about twinning, nuclear transplantation and so forth. This discrepancy between what most biologists believe and what many lay people believe has led (particularly in the United States) to bitter public controversy, culminating in regulations for human stem cell research that most biologists regard as inappropriately restrictive.

It is understandable that people are drawn to a simple account in which the appearance of a new individual corresponds to the sharply defined event of fertilization. The alternative is to accept that the self is not an indivisible, all-or-nothing entity, but that it instead emerges gradually, and that its origin must be explained in terms of the complexities of developmental neuroscience and psychology. That is not an easy message to convey even to highly educated people, so it is encouraging that an increasing number of prominent neuroscientists and psychologists are willing to make the attempt through popular books (including Joe LeDoux, the organizer of the NYAS conference, whose book Synaptic Self is reviewed on page 1115). That our own identity can be dissected into its component parts, that these components can be studied separately, and that many of our intuitions about our own mental lives will prove wrong: these are revolutionary ideas that will require patient explanation, and we should not expect them to be accepted easily or quickly.