Article | Open Access

Brain response to affective pictures in the chimpanzee

Scientific Reports volume 3, Article number: 1342 (2013)

Abstract

Advancement of non-invasive brain imaging techniques has allowed us to examine details of neural activities involved in affective processing in humans; however, no comparative data are available for chimpanzees, the closest living relatives of humans. In the present study, we measured event-related brain potentials in a fully awake adult chimpanzee as she looked at affective and neutral pictures. The results revealed a differential brain potential appearing 210 ms after presentation of an affective picture, a pattern similar to that in humans. This suggests that at least a part of the affective process is similar between humans and chimpanzees. The results have implications for the evolutionary foundations of emotional phenomena, such as emotional contagion and empathy.

Introduction

Scientists once neglected the presence of emotions in nonhuman animals1. However, people sometimes provide emotional accounts of animal behaviors in ordinary conversation. Brain science has revealed homologous structures and circuits in human and other mammalian central nervous systems; thus, it is reasonable to assume that emotional or affective processes are also shared on some level2,3. From an evolutionary perspective, emotion, or affective processing, is thought to be an effective system for generating a rapid, adaptive response to various environmental inputs2. For example, fear upon seeing a predator is often linked with flight and escape behavior, which is vital for survival.

Several studies of nonhuman animals have used physiological markers to make inferences about their affective states. Monkeys and apes show changes in heart rate, skin surface and tympanic membrane temperature, and skin conductance response after presentation of affective stimuli4,5,6,7,8,9,10. In addition, just as affective stimuli activate neural mechanisms that enhance human memory11, a chimpanzee had better memory for pictures of conspecifics with affective expressions than for those without such expressions, suggesting that humans and chimpanzees share a similar affective processing mechanism12.

In research on humans, advancements in non-invasive brain imaging techniques have allowed us to examine the brain areas and time course involved in processing affect13,14. Studies of scalp-surface potentials using electroencephalography (EEG) or event-related potentials (ERPs) are the best contemporary non-invasive measures for determining an accurate time course, which is important when considering the rapidity of affective processes14. However, no comparative data are available in nonhuman primates. To address this lack of information, we examined ERPs in response to affective pictures in a chimpanzee, the closest extant species to humans.

The present study is an integration of an earlier investigation of brain response in the chimpanzee with the memory study described above. After a long period of step-by-step training, we succeeded in measuring EEG in a fully-awake adult chimpanzee for the first time15,16,17. In the present experiment, we measured ERPs in response to pictures which were a subset of the stimuli used in the chimpanzee memory experiment, in which enhanced memory for affective over neutral pictures was found12. As there has been no similar study in nonhuman primates, the present study was exploratory in nature rather than a test of a specific hypothesis. In sum, the present study tested whether differential brain response to affective and neutral pictures could be found that coincided with evidence of enhanced memory of affective pictures.

Results

We obtained EEG at five scalp positions (Fz, Cz, Pz, T5, and T6). At all channels, the ERP waveforms elicited by affective and neutral pictures showed positive deflections peaking at around 150 ms post-stimulus (termed “P150”), followed by negative deflections peaking at around 250 ms (termed “N250”), and a slow negative wave (termed “SNW”) to affective stimuli after 300 ms (Figure 1). The average peak latencies (averaged over all five channels across all four experimental sessions) were 151.4 ms for P150 and 258.6 ms for N250. Average amplitudes within the time windows of 135–165 ms, 200–300 ms, and 300–800 ms after stimulus onset were calculated for each electrode to represent P150, N250, and SNW, respectively, and the data for the affective and neutral pictures were compared. For P150, there were no significant differences in average amplitudes at any of the five channels: |t(297)| < 0.630, p > 0.52 for all channels. For N250, t-tests revealed significant differences at all five channels, indicating larger negative deflections for affective pictures: Cz, t(297) = 4.465, p = 1.1 × 10−5; Pz, t(297) = 5.006, p = 9.6 × 10−7; Fz, t(297) = 3.829, p = 1.6 × 10−4; T5, t(297) = 4.563, p = 7.4 × 10−6; T6, t(297) = 5.277, p = 2.5 × 10−7. For SNW, there were also significant differences at all channels except Fz, reflecting sustained negative deflections that were dominant in responses to the affective pictures: Cz, t(297) = 4.366, p = 1.8 × 10−5; Pz, t(297) = 5.172, p = 4.3 × 10−7; Fz, t(297) = 3.256, p = 0.0013; T5, t(297) = 4.785, p = 2.7 × 10−6; T6, t(297) = 5.235, p = 3.1 × 10−7.
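The window-averaging and comparison procedure above can be sketched as follows. This is an illustrative reconstruction, not the authors' actual analysis code (which used NeuroScan software); the epoch-array layout, the `T0` onset index, and the function names are assumptions.

```python
import numpy as np
from scipy import stats

# Hypothetical epoch arrays of shape (n_epochs, n_samples), sampled at
# 1 kHz, running from -100 ms to +800 ms around stimulus onset.
T0 = 100  # sample index of stimulus onset (100-ms pre-stimulus baseline)

def window_mean(epochs, start_ms, end_ms):
    """Mean amplitude of each epoch within a post-stimulus window."""
    return epochs[:, T0 + start_ms : T0 + end_ms].mean(axis=1)

def compare_components(aff, neu):
    """Independent t-tests on the P150, N250, and SNW window means."""
    windows = {"P150": (135, 165), "N250": (200, 300), "SNW": (300, 800)}
    return {name: stats.ttest_ind(window_mean(aff, lo, hi),
                                  window_mean(neu, lo, hi))
            for name, (lo, hi) in windows.items()}
```

A negative t statistic for N250 or SNW would correspond to the larger negative deflection for affective pictures reported above.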

Figure 1: Average ERP waveforms to affective and neutral pictures at Fz, Cz, Pz, T5, and T6.
Figure 1

Thick red lines represent ERPs to affective pictures; thin blue lines represent ERPs to neutral pictures.

It is possible that the results reflected other properties inherent in the stimulus set, such as the presence of faces or depictions of social interactions, rather than their affective properties. Therefore, we conducted additional comparisons between ERPs based on different picture categorizations (see Methods for additional explanation). Comparing ERPs to the picture without a visible face versus the pictures with faces, there was no significant difference at any channel with respect to P150, N250, or SNW (|t(297)| < 1.728, p > 0.085). Likewise, comparing ERPs to the picture depicting a grooming social interaction versus the pictures depicting no such social interaction, there was no significant difference at any channel with respect to P150, N250, or SNW (|t(297)| < 1.381, p > 0.168).

An additional analysis was conducted to examine the time at which differences emerged between the ERPs to affective and neutral pictures. T-tests revealed that the ERPs to affective and neutral pictures differed significantly from 209 ms post-stimulus onward at Cz, 208 ms onward at Fz, 213 ms onward at Pz, 211 ms onward at T5, and 206 ms onward at T6.
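One way to implement such an onset search is a pointwise t-test at every post-stimulus sample. The paper does not state its exact onset criterion, so the rule below (earliest sample from which the difference stays significant through the end of the epoch) is an assumption for illustration.

```python
import numpy as np
from scipy import stats

T0 = 100  # sample index of stimulus onset (1 kHz; 100-ms baseline)

def divergence_onset(aff, neu, alpha=0.001):
    """Earliest post-stimulus sample (in ms, at 1 kHz sampling) from
    which pointwise t-tests between affective and neutral epochs remain
    significant through the end of the epoch. A simplified reading of
    the onset analysis; the paper's exact criterion is not given."""
    pvals = np.array([stats.ttest_ind(aff[:, i], neu[:, i]).pvalue
                      for i in range(T0, aff.shape[1])])
    sig = pvals < alpha
    for i in range(len(sig)):
        if sig[i:].all():
            return i
    return None  # no sustained divergence found
```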

Discussion

ERPs to affective pictures obtained from the chimpanzee differed from those to neutral pictures approximately 210 ms post-stimulus. In humans, ERPs to affective stimuli are divided into three latency ranges: early (100–200 ms), middle (200–300 ms), and late (>300 ms)18. Some studies have shown affect-related ERP modulations in the early latency range in humans, but others have not18. In the present study, no differences were observed between the two categories of pictures in the early latency range. ERPs in this latency range are known to be influenced by perceptual characteristics such as complexity of pictures, colors, and spatial frequency. Further investigation of the role of perceptual characteristics is needed to evaluate the ERPs found in the early latency range in the current study against the results of other studies.

Negative deflection of ERPs to affective pictures within the middle latency range (200–300 ms) has been demonstrated in a number of studies with humans19,20. Our findings are similar to these results. Studies with humans suggest that the component in this latency range can be interpreted as reflecting amygdala processing of affective information21. Although it is necessary to consider the possibility that different components exist within this latency range, middle latency ERPs are believed to reflect selective attention to affective images of intrinsic relevance19,20. The present study suggests that this process is similar between humans and chimpanzees. Furthermore, the fact that the chimpanzee's brain responses to affective and neutral pictures differed from 210 ms after stimulus presentation onward supports the results of a previous study in which a chimpanzee showed better memory for affective pictures than for neutral ones; a subset of these pictures was used in the present study.

Positive deflections of ERPs to affective stimuli in the late latency range (>300 ms) are a common finding among studies with human subjects18. Positivity within this latency range is hypothesized to reflect mental resource allocation and enhanced encoding for arousing stimuli. Our results with the chimpanzee showed differential ERPs in this latency range, but with different polarity from that of human data: negativity rather than positivity. One possible reason for this discrepancy is a difference in task demands during the measurement of ERPs. Most of the human studies that have shown positive components in the late latency range employed tasks in which participants were required to make overt responses to stimuli. By contrast, a study that employed a passive viewing paradigm, which is similar to that of our study, showed negative deflections to affective pictures in this latency range22. Therefore, the difference in task demand may account for the discrepancy, although a species difference in brain response and/or brain structure cannot be ruled out as a cause of the polarity reversal between the human and chimpanzee results.

As described above, we believe that the ERPs observed in the middle and late latency ranges reflect selective attentional processing of pictures of chimpanzees showing affective expressions. But because there were fewer affective pictures than neutral pictures (i.e., 3 versus 12), it is necessary to consider whether the difference in ERPs between affective and neutral pictures is truly related to the affective content of the picture or is merely the response to relatively novel stimuli. In the latter case, the response may be similar to the detection of deviance in a visual oddball task, such as mismatch negativity. However, we consider it unlikely that the results we obtained can be interpreted solely as a response to deviance, for the following two reasons. First, the procedure we employed is different from a typical oddball task, in which exactly the same stimulus is repeated and followed by an infrequent novel stimulus. In our study, a total of 15 different types of pictures were used, with an almost equal number of presentations of each picture, and the same picture was not repeated in two or more consecutive trials. Second, the pictures we used could be variously categorized, such that pictures of a certain category were fewer in number than those of the other category (i.e., full face vs. no face, and grooming social interaction vs. no social interaction). The results of the comparisons based on these different categorizations did not reveal a difference in the N250 and SNW components. Therefore, it is plausible to consider that the differences in the N250 and SNW components are related to the affective content of the picture, even if this was influenced by the response to deviance.

The pictures used in our experiment showed affective expressions of chimpanzees who were unfamiliar to the subject chimpanzee. In other words, the pictures did not depict objects that would directly elicit an affective response in the subject chimpanzee. Rather, these pictures depicted other individuals who were experiencing affective states. A phenomenon in which the affective expression of an individual influences the affective state of another individual can be categorized as emotional contagion, or empathy1. A recent hypothesis suggests that the same neural representations may be activated when a subject directly experiences an emotional event and when the subject watches another individual in a similar state. This may constitute the foundation for emotional contagion or empathy1,23. Our study represents a step forward in the understanding of the evolutionary basis of these phenomena.

Methods

Participant

The participant was a female chimpanzee (Pan troglodytes) named Mizuki. She lived with five conspecific group members at the Hayashibara Great Ape Research Institute, Okayama, Japan, in an enriched environment consisting of a 7400 m2 outdoor enclosure and several indoor areas. She had been raised by human caregivers since shortly after birth. Since her arrival at the Great Ape Research Institute at the age of 2 years and 1 month, she had spent her time with other chimpanzees. At the time of the present experiments, Mizuki was 11 years old and had undergone other behavioral and cognitive experiments24,25, as well as earlier ERP experiments15,16,17,26. This research was conducted in accordance with the Guide for the Care and Use of Laboratory Animals of Hayashibara Biochemical Laboratories, Inc., and the Weatherall report, The use of non-human primates in research. The research protocol was approved by the Animal Welfare and Animal Care Committee of Hayashibara Great Ape Research Institute (GARI-051101).

Apparatus and stimuli

The experimental room was 2 m wide, 2.9 m long, and 2.5 m high, and was surrounded by concrete walls and metal bars with a metallic mesh door. The room was moderately lighted during the experiment. The chimpanzee, Mizuki, sat on a wooden draining board covered with an electromagnetic shielding sheet on a concrete platform. An experimenter stood in front of her to keep her still and facing the display. Because of her close relationship of mutual trust with the human experimenters, Mizuki was totally cooperative during the recording sessions. Prior to testing, we took approximately 6 months to gradually habituate her to the experimental device and procedure, and throughout this habituation period, she exhibited no negative responses such as grimacing or screaming15.

A 17-inch CRT display (iiyama LA702U, 1024 × 768 pixels) was set up in front of Mizuki, approximately 40 cm away and at the horizontal level of her head. An infrared video camera was fixed on top of the CRT display to monitor the subject from a frontal view. We used this camera to check whether the subject's gaze was directed at the stimulus display.

Experimental stimuli were color pictures of wild chimpanzees unfamiliar to Mizuki (Figure 2). These included three affective pictures of chimpanzees with affective expressions and 12 neutral pictures of chimpanzees without affective expressions. This set of stimuli was a subset of the 80 pictures used in another study with a different chimpanzee, who had shown better performance in memory tasks for affective pictures than for neutral pictures12. The pictures were still color images of wild chimpanzees taken from video recorded for research purposes in Bossou, Guinea, West Africa27. Each picture was 360 × 240 pixels, the average luminance was equated across pictures, and the pictures were displayed on a black background.

Figure 2: Affective and neutral pictures used for the experiment.
Figure 2

Task procedure

An experimental session consisted of four or five blocks, and each block consisted of 60–75 trials. The experimenter chose whether or not to use blocks containing attention-getting stimuli (short animations 800 ms or 1600 ms in duration, unrelated to the test pictures) between trials, depending on behavioral signs of the chimpanzee's concentration on the monitor. As a result, attention-getting stimuli were absent from the first three blocks but present in all of the remaining 16 blocks. On each trial, one of the 15 stimulus pictures was presented for 800 ms in a semi-randomized order, in which an affective picture was followed by at least two different neutral pictures, and the same picture was not repeated in two or more consecutive trials. Each picture presentation, as well as each attention-getting stimulus, was followed by an 800-ms inter-stimulus interval consisting of an empty black screen.
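The sequencing constraints above (an affective picture followed by at least two different neutral pictures; no picture repeated on consecutive trials) can be sketched with a simple generator. The picture labels and the sampling scheme are illustrative assumptions, not the authors' actual randomization code.

```python
import random

def make_block(n_trials=60, n_affective=3, n_neutral=12, seed=0):
    """Generate one block's trial order: after an affective picture at
    least two neutral pictures must follow, and no picture may repeat
    on consecutive trials. Picture labels are hypothetical."""
    rng = random.Random(seed)
    affective = [f"A{i}" for i in range(1, n_affective + 1)]
    neutral = [f"N{i}" for i in range(1, n_neutral + 1)]
    seq, cooldown, prev = [], 0, None
    while len(seq) < n_trials:
        # while an affective "cooldown" is active, draw neutrals only
        pool = neutral if cooldown > 0 else affective + neutral
        pick = rng.choice([p for p in pool if p != prev])
        cooldown = 2 if pick in affective else max(0, cooldown - 1)
        seq.append(pick)
        prev = pick
    return seq
```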

Between blocks in each session, Mizuki was given a rest of 1 min, during which she was allowed to make substantial body movements and to receive fruit rewards. One session was conducted per day, for a total of four sessions on four different days. In total, Mizuki received 1201 trials in which affective or neutral pictures were presented (approximately 80 presentations for each picture; see Table 1), as well as 262 presentations of attention-getting stimuli of 144 different types, which were inserted between those test trials.

Table 1: The number of epochs used for analysis. The numbering of Picture IDs corresponds to the arrangement of pictures in Figure 2: from left to right in the first row (Affective 1, 2, 3), from left to right in the second row (Neutral 1, 2, 3), and so forth

During the recordings, Mizuki's gaze occasionally averted from the monitor. When this occurred, another experimenter, who monitored the subject's gaze direction online through the infrared camera, manually added a marker to the EEG data via a keyboard connected to the measurement computer.

ERP recording and analysis

The EEG was recorded from Ag/AgCl electrodes attached to five scalp positions (Fz, Cz, Pz, T5, and T6) according to the International 10–20 system for humans (Figure 3). The signals were referenced to the forehead midline (FPz), and a ground electrode was positioned at the left earlobe. The electrodes were filled with Quick GEL and impedances were kept below 6 kΩ. Signals were amplified by a NuAmp-40 amplifier and processed with Acquire 4.3 software (NeuroScan Inc.) at a 1000 Hz sampling rate. A 0.1–30 Hz band-pass filter (24 dB/oct) was applied in the offline analysis. All data were segmented into 900-ms epochs, including a 100-ms pre-stimulus baseline period, based on time markers of stimulus onset. These epochs were baseline-corrected with respect to the mean amplitude over the 100-ms pre-stimulus period. Epochs in which the peak-to-peak amplitude exceeded 100 μV were excluded from analysis, as were epochs containing the “non-looking” markers described above. The numbers of epochs accepted for analysis are shown in Table 1.
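The segmentation, baseline-correction, and rejection steps described above can be sketched for a single channel as follows. This is an illustrative reconstruction; the actual processing used NeuroScan's software, and the function signature and marker representation are assumptions.

```python
import numpy as np

def epoch_and_clean(eeg, onsets, bad_marks, fs=1000,
                    pre_ms=100, post_ms=800, reject_uv=100.0):
    """Cut 900-ms epochs around stimulus onsets, baseline-correct
    against the 100-ms pre-stimulus mean, and drop epochs whose
    peak-to-peak range exceeds 100 uV or that overlap a "non-looking"
    marker. `eeg` is one channel in microvolts; `onsets` and
    `bad_marks` are sample indices."""
    pre, post = fs * pre_ms // 1000, fs * post_ms // 1000
    kept = []
    for t in onsets:
        ep = eeg[t - pre : t + post].astype(float)
        if len(ep) < pre + post:
            continue                      # epoch runs off the recording
        ep -= ep[:pre].mean()             # baseline correction
        if ep.max() - ep.min() > reject_uv:
            continue                      # amplitude artifact
        if any(t - pre <= m < t + post for m in bad_marks):
            continue                      # gaze-aversion ("non-looking") marker
        kept.append(ep)
    return np.array(kept)
```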

Figure 3: Chimpanzee participant, Mizuki, wearing electrodes.
Figure 3

Only four of the seven electrodes are visible (FPz, Fz, Cz, and the earlobe ground); the others (Pz, T5, and T6) are not visible because they are placed behind the vertex.

Independent t-tests were run on the data to compare the mean amplitudes for each time window of each channel separately. Time windows for analysis were chosen based on the averaged waveforms obtained, as described in the Results section. The t-tests assumed equality of variances, as homogeneity of variance was confirmed by Levene's Tests for Equality of Variance (F(2, 297) < 3.22, p > 0.07, for comparison of all pairs of groups). We initially chose an alpha level of 0.05, which was adjusted by use of the Bonferroni correction, yielding an adjusted threshold of significance of 0.001.
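The test logic (Levene's test to justify equal-variance t-tests, then judgment against a Bonferroni-adjusted threshold) can be sketched as below. The divisor of 45 comparisons (5 channels × 3 time windows × 3 categorizations) is an assumption for illustration: the paper does not state how the 0.001 threshold was derived, but 0.05/45 ≈ 0.0011 is consistent with it.

```python
import numpy as np
from scipy import stats

N_COMPARISONS = 45  # 5 channels x 3 windows x 3 categorizations (assumed)

def compare_window(aff_means, neu_means, alpha=0.05):
    """Levene's test for homogeneity of variance, then an independent
    t-test (equal variances assumed when Levene is non-significant),
    judged against a Bonferroni-adjusted threshold."""
    _, p_lev = stats.levene(aff_means, neu_means)
    t, p = stats.ttest_ind(aff_means, neu_means, equal_var=p_lev > alpha)
    return t, p, p < alpha / N_COMPARISONS
```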

The main focus of the study was to compare ERPs to affective versus neutral pictures. However, it could be argued that the difference in ERPs is attributable simply to a response to deviance, as the task could be regarded as an oddball task with a smaller number of affective pictures and a larger number of neutral pictures. It could also be the case that the results reflected other properties inherent in the stimulus set, such as the presence of faces or depictions of social interactions, rather than their affective properties. We therefore conducted the following additional comparisons. The first compared the picture without a visible chimpanzee face (Neutral 1, see Table 1 and Figure 2) to the pictures with chimpanzee faces (all of the other pictures). The second compared the picture depicting a grooming social interaction (Neutral 5, see Table 1 and Figure 2) to the pictures depicting no such social interaction (all of the other pictures).

References

  1. What is animal emotion? Ann. N.Y. Acad. Sci. 1224, 191–206 (2011).
  2. Affective Neuroscience: The Foundations of Human and Animal Emotions (Oxford University Press, Oxford, 1998).
  3. The basic emotional circuits of mammalian brains: Do animals have affective lives? Neurosci. Biobehav. Rev. 35, 1791–1804 (2011).
  4. Heart rate responses to social interactions in free-moving rhesus macaques (Macaca mulatta): a pilot study. J. Comp. Psychol. 113, 59–65 (1999).
  5. Specificity of the cardiac response to conspecific vocalizations in chimpanzees. Behav. Neurosci. 103, 235–245 (1989).
  6. Conspecific screams and laughter: cardiac and behavioral reactions of infant chimpanzees. Dev. Psychobiol. 22, 771–787 (1989).
  7. The use of nasal skin temperature measurements in studying emotion in macaque monkeys. Physiol. Behav. 102, 347–355 (2011).
  8. Behavioral triggers of skin conductance responses and their neural correlates in the primate amygdala. J. Neurophysiol. 101, 1749–1754 (2009).
  9. Cognitive and physiological markers of emotional awareness in chimpanzees (Pan troglodytes). Anim. Cogn. 4, 223–229 (2001).
  10. Brain temperature asymmetries and emotional perception in chimpanzees, Pan troglodytes. Physiol. Behav. 71, 363–371 (2000).
  11. Emotion, cognition, and behavior. Science 298, 1191–1194 (2002).
  12. Enhanced recognition of emotional stimuli in the chimpanzee (Pan troglodytes). Anim. Cogn. 11, 517–524 (2008).
  13. Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI. Neuroimage 16, 331–348 (2002).
  14. Human brain EEG indices of emotions: delineating responses to affective vocalizations by measuring frontal theta event-related synchronization. Neurosci. Biobehav. Rev. 35, 1959–1970 (2011).
  15. Auditory ERPs to stimulus deviance in an awake chimpanzee (Pan troglodytes): towards hominid cognitive neurosciences. PLoS ONE 3, e1442 (2008).
  16. Brain activity in an awake chimpanzee in response to the sound of her own name. Biol. Lett. 6, 311–313 (2010).
  17. Neural correlates of face and object perception in an awake chimpanzee (Pan troglodytes) examined by scalp-surface event-related potentials. PLoS ONE 5, e13366 (2010).
  18. Affective picture processing: an integrative review of ERP findings. Biol. Psychol. 77, 247–265 (2008).
  19. Brain processes in emotional perception: motivated attention. Cogn. Emot. 18, 593–611 (2004).
  20. The selective processing of briefly presented affective pictures: an ERP analysis. Psychophysiol. 41, 441–449 (2004).
  21. Emotional expression boosts early visual processing of the face: ERP recording and its decomposition by independent component analysis. Neuroreport 12, 709–714 (2001).
  22. Are we sensitive to valence differences in emotionally negative stimuli? Electrophysiological evidence from an ERP study. Neuropsychol. 45, 2764–2771 (2007).
  23. Empathy: its ultimate and proximate bases. Behav. Brain Sci. 25, 1–72 (2002).
  24. Chimpanzees (Pan troglodytes) learn to act with other individuals in a cooperative task. Primates 48, 13–21 (2007).
  25. Facial perception of conspecifics: chimpanzees (Pan troglodytes) attend to proper orientation and open eyes. Anim. Cogn. 13, 679–688 (2010).
  26. Event-related potentials in response to subjects' own names: a comparison between humans and a chimpanzee. Commun. Integr. Biol. 4, 321–323 (2011).
  27. The Chimpanzees of Bossou and Nimba (Springer, Tokyo, 2011).


Acknowledgements

This study was supported by the Center for Evolutionary Cognitive Science at The University of Tokyo. This study was also financially supported by Grants-in-Aid for Scientific Research (grant numbers 18200018 to HK, 19300091 and 20002001 to MT, 20220004 to KF) and Grants-in-Aid for Specially Promoted Research (24000001 to TM) from JSPS.

Author information

Author notes

    • Satoshi Hirata

    Current address: Kumamoto Sanctuary, Wildlife Research Center, Kyoto University, 990 Otao, Misumi, Uki, Kumamoto 869-3201 Japan

Affiliations

  1. Great Ape Research Institute, Hayashibara Co. Ltd, Tamano, Okayama, Japan

    • Satoshi Hirata
    • , Koki Fuwa
    • , Keiko Sugama
    •  & Kiyo Kusunoki
  2. Graduate School of Arts and Science, University of Tokyo, Meguro, Tokyo, Japan

    • Goh Matsuda
    • , Kazuo Hiraki
    •  & Toshikazu Hasegawa
  3. School of Human Cultures, University of Shiga Prefecture, Hikone, Shiga, Japan

    • Ari Ueno
  4. Faculty of Sociology, Kansai University, Suita, Osaka, Japan

    • Hirokata Fukushima
  5. Primate Research Institute, Kyoto University, Inuyama, Aichi, Japan

    • Satoshi Hirata
    •  & Masaki Tomonaga

Authors

Satoshi Hirata, Goh Matsuda, Ari Ueno, Hirokata Fukushima, Koki Fuwa, Keiko Sugama, Kiyo Kusunoki, Masaki Tomonaga, Kazuo Hiraki and Toshikazu Hasegawa

Contributions

T.H., K.H., M.T., A.U., K.F., G.M. and S.H. conceived of the study. S.H., K.F., K.S. and K.K. conducted the experiment. G.M. analyzed the data. S.H., G.M. and H.F. wrote the paper. All authors edited the manuscript. All authors read and approved the final manuscript.

Competing interests

The authors declare no competing financial interests.

Corresponding author

Correspondence to Satoshi Hirata.

About this article

DOI

https://doi.org/10.1038/srep01342
