News & Views

Cognitive science

Staring fear in the face

The unusual case of SM, a person who has a very specific deficit in recognizing fearful expressions on people's faces, is providing intriguing insights into how we perceive emotion.

Charles Darwin thought that the ability of humans to display and perceive emotional states on a face evolved to convey non-verbal signals rapidly1. If an individual's expression could communicate a potential threat, for example, his neighbours would be able to respond quickly and direct their attention to the source of the danger. Thus, a common view is that the perception of fear might guide appropriate visuomotor behaviour2. In a striking reversal of this perspective, work by Adolphs et al. on page 68 of this issue3 suggests that discerning fear in faces may depend on how one scrutinizes them in the first place.

The authors describe a patient (SM) who has bilateral brain lesions in the amygdala, a region of the medial temporal lobe known to be critical for the perception of fear4. SM cannot recognize fear from facial expressions5, and Adolphs et al. show that this is because she fails to look spontaneously towards the eyes on a face. When shown a face displaying an unmistakable expression of terror, she tends to fixate unworriedly on the nose and mouth regions, neglecting to notice the wide, scared eyes. Thus, she erroneously judges that the face has a neutral expression. By contrast, normal people always look immediately at the eye region of a face, and all the more so when the face is fearful6.

SM avoids the eyes of all faces, no matter what their expression. But, remarkably, only her perception of fear is impaired — she can recognize other emotions. This suggests that visual cues provided by the eyes are particularly critical for the recognition of fear; other facial emotions can presumably be recognized without looking at the eyes (happiness can be inferred from a smile, for example). SM was also tested on a ‘bubble’ visual task7, in which she had to discriminate between fearful and happy faces seen through apertures that revealed only small parts of the image. This allowed the investigators to determine which region of the face she used to distinguish the expressions. Again unlike normal individuals, SM failed to use information from the eye area, but she could still take cues from around the mouth.
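To make the logic of the 'bubble' procedure concrete, the short Python sketch below shows one way such stimuli could be generated: Gaussian apertures placed at random locations reveal small parts of a face image while the rest fades to a uniform grey. The function name, bubble count and aperture width are illustrative assumptions, not details of the published protocol7; across many trials, correct classifications can then be related to which facial regions the apertures happened to expose.

    # Minimal sketch of a 'bubbles'-style stimulus (in the spirit of ref. 7).
    # All names and parameter values are illustrative assumptions, not the
    # authors' actual experimental code.

    import numpy as np

    def bubbles_mask(height, width, n_bubbles=10, sigma=12.0, rng=None):
        """Return a [0, 1] mask of Gaussian apertures at random locations."""
        rng = np.random.default_rng() if rng is None else rng
        ys, xs = np.mgrid[0:height, 0:width]      # pixel coordinate grids
        mask = np.zeros((height, width))
        for _ in range(n_bubbles):
            cy, cx = rng.integers(0, height), rng.integers(0, width)
            mask += np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
        return np.clip(mask, 0.0, 1.0)            # cap overlapping apertures at 1

    # Usage: reveal only small parts of a grey-scale face image through the mask,
    # blending hidden regions towards a mid-grey background.
    face = np.random.rand(256, 256)               # stand-in for a face image array
    mask = bubbles_mask(*face.shape, n_bubbles=8, sigma=10.0)
    stimulus = mask * face + (1.0 - mask) * 0.5   # 0.5 = mid-grey background

The design point is that the observer never sees the whole face at once, so the regions that support correct discrimination (eyes, mouth and so on) can be identified from performance alone.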

Most surprisingly, simply instructing SM to “look at the eyes” could restore normal recognition of fearful expressions, indicating that she still knows what fear ‘looks like’ but seems unable to notice scared eyes when she is not prompted to look at them. This ‘rescue’ was short-lived, however, and SM needed to be reminded continually to look at the eyes. These new results unexpectedly reveal that damage to the amygdala might impair attention and exploration strategies, rather than causing a perceptual deficit affecting the visual analysis or categorization of specific facial traits.

Much recent research has focused on the role of the human amygdala in fear recognition. Numerous brain-imaging studies confirm that the amygdala responds more to fearful faces than to faces expressing other emotions, but its exact function during the recognition of facial expressions remains a mystery. Initially, the observation that SM's perception of fear is impaired while her recognition of other emotions remains intact5 was thought to support the idea that different categories of emotion involve distinct neural circuits in the brain8. The findings of Adolphs et al.3 now suggest a very different mechanism, perhaps involving a more general role for the amygdala in modulating visual and attentional processing9. The amygdala is known to be sensitive to perceived gaze direction, responding most when the eyes in a facial image seem to be looking at the observer10. In agreement with Darwin's theory, it would make sense for fear perception to be intimately connected with locating the threat that fearful eyes have spotted11. The simplicity of such a mechanism might allow swift responses to danger, even with poor or crude inputs, or during inattention. Indeed, it was recently found that when a normal subject is shown shapes that look like the whites of a pair of eyes, the amygdala responds more to larger shapes (corresponding to wide, fearful eyes) than to small (happy) shapes12.

However, the amygdala is probably not just an ‘eye detector’, and perception of fearful expressions is unlikely to rely solely on wide eyes. Previous research13,14,15 suggests that processing single ‘diagnostic’ features in faces is not sufficient to appraise their expression fully, and that more global configural information is important (for example, see the composite faces in Fig. 1). Moreover, the bubble task might bias the viewer towards the local details visible through the apertures rather than the configural information that would be used more naturally16, particularly in a dichotomous fearful–happy classification task (for instance, SM might simply check for the presence of a smile, and so never need to look at the eyes to perform this particular task). The demands of particular tasks also influence whether a subject uses local or global visual features during face processing17. Furthermore, brain-imaging data indicate that even though the amygdala might respond to fearful eyes presented alone, it is activated most in response to whole faces18.

Figure 1: How the eyes contribute to facial expressions of fear.
a, b, Examples of fearful (a) and happy (b) faces from a standard image set19. c, d, Composites made from the top and bottom halves of the same faces, with fearful eyes but a happy mouth (c) or happy eyes but a fearful mouth (d). As can be seen, the global configuration of the composite alters the perceived emotion expressed by the face. The study by Adolphs et al.3 sheds new light on how the amygdala in the brain is involved in processing the eyes in such expressions.

Finally, it remains to be determined whether SM's attention to other facial features is normal (only her response to the eye region was recorded), and why she can still recognize expressions of sadness or anger, in which eye information is also important (normal subjects find these emotions more difficult to recognize when the eyes are erased)3.

The intriguing implications of these new findings need to be explored. What are the neural circuits by which the amygdala might guide eye scan-paths? How does SM judge expressions in composite faces such as those in Figure 1? How does she perform on more implicit tests of fear recognition, or using graded rather than dichotomous measures? Does she orient her eyes normally to emotional visual stimuli other than faces, and to emotional voices? What is the amygdala's normal role in exploring social situations and looking at other people, and are these mechanisms altered in diseases such as phobias or autism that are thought to involve the amygdala? We are just beginning to realize how the brain processes emotionally relevant cues in the environment, and the unusual features of SM will provide much food for future thought.

References

1. Darwin, C. The Expression of the Emotions in Man and Animals (Oxford Univ. Press, 1872).
2. Öhman, A. Psychophysiology 23, 123–145 (1986).
3. Adolphs, R. et al. Nature 433, 68–72 (2005).
4. LeDoux, J. E. Annu. Rev. Neurosci. 23, 155–184 (2000).
5. Adolphs, R., Tranel, D., Damasio, H. & Damasio, A. Nature 372, 669–672 (1994).
6. Yarbus, A. L. Eye Movements and Vision (Plenum, New York, 1967).
7. Gosselin, F. & Schyns, P. G. Vision Res. 41, 2261–2271 (2001).
8. Calder, A., Lawrence, A. & Young, A. Nature Rev. Neurosci. 2, 352–363 (2001).
9. Vuilleumier, P., Richardson, M., Armony, J., Driver, J. & Dolan, R. J. Nature Neurosci. 7, 1271–1278 (2004).
10. Kawashima, R. et al. Brain 122, 779–783 (1999).
11. Sander, D., Grafman, J. & Zalla, T. Rev. Neurosci. 14, 303–316 (2003).
12. Whalen, P. J. et al. Science 306, 2061 (2004).
13. McKelvie, S. J. Br. J. Social Psychol. 34, 325–334 (1995).
14. Calder, A. J., Young, A. W., Keane, J. & Dean, M. J. Exp. Psychol. Hum. Percept. Perform. 26, 527–551 (2000).
15. Prkachin, G. C. Br. J. Psychol. 94, 45–62 (2003).
16. Murray, R. F. & Gold, J. M. Vision Res. 44, 461–470 (2004).
17. Schyns, P. G. & Oliva, A. Cognition 69, 243–265 (1999).
18. Morris, J. S., deBonis, M. & Dolan, R. J. Neuroimage 17, 214–222 (2002).
19. Ekman, P. & Friesen, W. Pictures of Facial Affect (Consulting Psychologists Press, Palo Alto, 1976).
