Arising from: M. Koenigs et al. Nature 446, 908–911 (2007); see also: Koenigs et al. reply
Neuroscience has recently turned to the study of utilitarian and non-utilitarian moral judgement. Koenigs et al.1 examine the responses of normal subjects and those with ventromedial–prefrontal–cortex (VMPC) damage to moral scenarios drawn from functional magnetic resonance imaging studies by Greene et al.2,3,4, and claim that patients with VMPC damage have an abnormally “utilitarian” pattern of moral judgement. It is crucial to the claims of Koenigs et al. that the scenarios of Greene et al. pose a conflict between utilitarian consequence and duty; however, many of them do not meet this condition. Because of this methodological problem, it is too early to claim that VMPC patients have a utilitarian bias.
Greene et al. reported that brain areas typically associated with affect are activated when subjects make moral judgements about ‘personal’ scenarios, where one alternative requires directly causing serious harm to persons. They found that, in the minority who judge such choices to be appropriate, areas associated with cognition and cognitive conflict are activated as well. On the basis of a later study reporting similar results in responses to ‘difficult’ personal scenarios, Greene suggested that the controversies between utilitarian and non-utilitarian views of morality “might reflect an underlying tension between competing subsystems in the brain”4, a claim taken up by leading ethicists5.
Koenigs et al. draw on the battery of moral scenarios of Greene et al. to compare normal subjects with six subjects who have focal bilateral damage to the VMPC, a brain region associated with the normal generation of emotions and, in particular, social emotions. They report that these patients “produce an abnormally ‘utilitarian’ pattern of judgements on [personal] moral dilemmas… In contrast, the VMPC patients’ judgements were normal in other classes of moral dilemmas”1. These claims are based on VMPC patients’ pattern of response to ‘high-conflict’ scenarios, a subset of personal scenarios on which normal subjects tended to disagree and that elicited greater response times.
However, the methodology used by Koenigs et al. cannot support claims about a utilitarian bias. Data from the categorization of the scenarios by five professional moral philosophers show that many are not of the required type. Only 45% of their impersonal scenarios and 48% of the personal ones were classified as involving a choice between utilitarian and non-utilitarian options. The distinction by Koenigs et al. between low- and high-conflict scenarios does not correspond to a difference in the scenarios’ content. The high-conflict scenarios are not all clear cases of utilitarian choice and some low-conflict ones are very clear cases of such choice: of the 13 high-conflict scenarios, our judges classified only eight as pure cases of utilitarian versus non-utilitarian choice; conversely, two low-conflict scenarios were classified as such.
The battery of personal scenarios is therefore not an adequate measure of utilitarian choice, and the distinction between low and high conflict reflects only a difference in behavioural response, rather than consistent differences in the content of the scenarios. Thus it is too early to claim that VMPC patients have a bias towards utilitarian judgement. Furthermore, whereas Koenigs et al. found that normal subjects rated personal scenarios as having significantly higher emotional salience than impersonal scenarios, they found no such significant difference between low- and high-conflict scenarios. So their proposal that an affective deficit explains the VMPC patients’ abnormal pattern of response to high-conflict scenarios is not clearly true. Similarly, it is unclear that this pattern of response is due to VMPC patients following “explicit social and moral norms”1, as their choices in high-conflict scenarios are contrary to familiar social norms to prevent harm.
In conclusion, to establish that a response pattern manifests a tendency to utilitarian moral judgement, the stimuli used need to be classified in terms of content and not by purely behavioural or emotional criteria as was done here and in other studies such as those of Greene et al.2,4,6.
1. Koenigs, M. et al. Damage to the prefrontal cortex increases utilitarian moral judgements. Nature 446, 908–911 (2007).
2. Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M. & Cohen, J. D. An fMRI investigation of emotional engagement in moral judgment. Science 293, 2105–2108 (2001).
3. Greene, J. D. & Haidt, J. How (and where) does moral judgment work? Trends Cogn. Sci. 6, 517–523 (2002).
4. Greene, J. D., Nystrom, L. E., Engell, A. D., Darley, J. M. & Cohen, J. D. The neural bases of cognitive conflict and control in moral judgment. Neuron 44, 389–400 (2004).
5. Singer, P. Ethics and intuitions. J. Ethics 9, 331–352 (2005).
6. Ciaramelli, E., Muccioli, M., Làdavas, E. & di Pellegrino, G. Selective deficit in personal moral judgment following damage to ventromedial prefrontal cortex. Soc. Cogn. Affect. Neurosci. 2, 84–89 (2007).
Kahane, G. & Shackel, N. Do abnormal responses show utilitarian bias? Nature 452, E5 (2008). https://doi.org/10.1038/nature06785