


Shared yet dissociable neural codes across eye gaze, valence and expectation

Abstract

The direction of the eye gaze of others is a prominent social cue in primates and is important for communication1,2,3,4,5,6,7,8,9,10,11. Although gaze can signal threat and elicit anxiety6,12,13, it remains unclear whether it shares neural circuitry with stimulus value. Notably, gaze not only has valence, but can also serve as a predictor of the outcome of a social encounter, which can be either negative or positive2,8,12,13. Here we show that the neural codes for gaze and valence overlap in primates and that they involve two different mechanisms: one for the outcome and another for its expectation. Monkeys participated in the human intruder test13,14, in which a human participant had either a direct or averted gaze, interleaved with blocks of aversive and appetitive conditioning. We find that single neurons in the amygdala encode gaze15, whereas neurons in the anterior cingulate cortex encode the social context16, but not gaze. We identify a shared population in the amygdala for which the neural responses to direct and averted gaze parallel the responses to aversive and appetitive stimuli, respectively. Furthermore, we distinguish between two neural mechanisms—an overall-activity scheme that is used for gaze and the unconditioned stimulus, and a correlated-selectivity scheme that is used for gaze and the conditioned stimulus. These findings provide insights into the origins of the neural mechanisms that underlie the computations of both social interactions and valence, and could help to shed light on mechanisms that underlie social anxiety and the comorbidity between anxiety and impaired social interactions.


Fig. 1: Behaviour during the HIT and affective-conditioning blocks.
Fig. 2: The amygdala encodes gaze and valence, and the ACC mainly encodes valence.
Fig. 3: Shared coding for valence and gaze in amygdala neurons.
Fig. 4: An overall-activity coding for eye gaze and US valence and a correlated-selectivity coding for eye gaze and CS valence.


Data availability

All data supporting the findings of this study are available from the corresponding author upon reasonable request.

Code availability

Custom code for behavioural and electrophysiological tests is available from the corresponding author upon reasonable request.

References

  1. Jones, W. & Klin, A. Attention to eyes is present but in decline in 2–6-month-old infants later diagnosed with autism. Nature 504, 427–431 (2013).

    Article  ADS  CAS  Google Scholar 

  2. Emery, N. J. The eyes have it: the neuroethology, function and evolution of social gaze. Neurosci. Biobehav. Rev. 24, 581–604 (2000).

    Article  CAS  Google Scholar 

  3. Gobel, M. S., Kim, H. S. & Richardson, D. C. The dual function of social gaze. Cognition 136, 359–364 (2015).

    Article  Google Scholar 

  4. Adolphs, R. Neural systems for recognizing emotion. Curr. Opin. Neurobiol. 12, 169–177 (2002).

    Article  CAS  Google Scholar 

  5. Zhou, Y. et al. Atypical behaviour and connectivity in SHANK3-mutant macaques. Nature 570, 326–331 (2019).

    Article  ADS  CAS  Google Scholar 

  6. Schneier, F. R., Kent, J. M., Star, A. & Hirsch, J. Neural circuitry of submissive behavior in social anxiety disorder: a preliminary study of response to direct eye gaze. Psychiatry Res. 173, 248–250 (2009).

    Article  Google Scholar 

  7. Schneier, F. R., Rodebaugh, T. L., Blanco, C., Lewin, H. & Liebowitz, M. R. Fear and avoidance of eye contact in social anxiety disorder. Compr. Psychiatry 52, 81–87 (2011).

    Article  Google Scholar 

  8. Ballesta, S. & Duhamel, J. R. Rudimentary empathy in macaques’ social decision-making. Proc. Natl Acad. Sci. USA 112, 15516–15521 (2015).

    Article  ADS  CAS  Google Scholar 

  9. Gariépy, J. F. et al. Social learning in humans and other animals. Front. Neurosci. 8, 58 (2014).

    Article  Google Scholar 

  10. Rutishauser, U. et al. Single-neuron correlates of atypical face processing in autism. Neuron 80, 887–899 (2013).

    Article  CAS  Google Scholar 

  11. Hadjikhani, N. et al. Look me in the eyes: constraining gaze in the eye-region provokes abnormally high subcortical activation in autism. Sci. Rep. 7, 3163 (2017).

    Article  ADS  Google Scholar 

  12. Shepherd, S. V. Following gaze: gaze-following behavior as a window into social cognition. Front. Integr. Neurosci. 4, 5 (2010).

    PubMed  PubMed Central  Google Scholar 

  13. Kalin, N. H. & Shelton, S. E. Defensive behaviors in infant rhesus monkeys: environmental cues and neurochemical regulation. Science 243, 1718–1721 (1989).

    Article  ADS  CAS  Google Scholar 

  14. Oler, J. A. et al. Amygdalar and hippocampal substrates of anxious temperament differ in their heritability. Nature 466, 864–868 (2010).

    Article  ADS  CAS  Google Scholar 

  15. Mosher, C. P., Zimmerman, P. E. & Gothard, K. M. Neurons in the monkey amygdala detect eye contact during naturalistic social interactions. Curr. Biol. 24, 2459–2464 (2014).

    Article  CAS  Google Scholar 

  16. Haroush, K. & Williams, Z. M. Neuronal prediction of opponent’s behavior during cooperative social interchange in primates. Cell 160, 1233–1245 (2015).

    Article  CAS  Google Scholar 

  17. Gamer, M. & Büchel, C. Amygdala activation predicts gaze toward fearful eyes. J. Neurosci. 29, 9123–9126 (2009).

    Article  CAS  Google Scholar 

  18. Gothard, K. M., Battaglia, F. P., Erickson, C. A., Spitler, K. M. & Amaral, D. G. Neural responses to facial expression and face identity in the monkey amygdala. J. Neurophysiol. 97, 1671–1683 (2007).

    Article  CAS  Google Scholar 

  19. Adolphs, R. What does the amygdala contribute to social cognition? Ann. NY Acad. Sci. 1191, 42–61 (2010).

    Article  ADS  Google Scholar 

  20. Stein, M. B. & Stein, D. J. Social anxiety disorder. Lancet 371, 1115–1125 (2008).

    Article  Google Scholar 

  21. Tovote, P., Fadok, J. P. & Lüthi, A. Neuronal circuits for fear and anxiety. Nat. Rev. Neurosci. 16, 317–331 (2015).

    Article  CAS  Google Scholar 

  22. Herry, C. & Johansen, J. P. Encoding of fear learning and memory in distributed neuronal circuits. Nat. Neurosci. 17, 1644–1654 (2014).

    Article  CAS  Google Scholar 

  23. Duvarci, S. & Pare, D. Amygdala microcircuits controlling learned fear. Neuron 82, 966–980 (2014).

    Article  CAS  Google Scholar 

  24. Janak, P. H. & Tye, K. M. From circuits to behaviour in the amygdala. Nature 517, 284–292 (2015).

    Article  ADS  CAS  Google Scholar 

  25. Putnam, P. T. & Gothard, K. M. Multidimensional neural selectivity in the primate amygdala. eNeuro 6, ENEURO.0153-19.2019 (2019).

    Article  Google Scholar 

  26. Kyriazi, P., Headley, D.B. & Pare, D. Multi-dimensional coding by basolateral amygdala neurons. Neuron 99, 1315–1328 (2018).

    Article  CAS  Google Scholar 

  27. Pryluk, R., Kfir, Y., Gelbard-Sagiv, H., Fried, I. & Paz, R. A tradeoff in the neural code across regions and species. Cell 176, 597–609 (2019).

    Article  CAS  Google Scholar 

  28. Munuera, J., Rigotti, M. & Salzman, C. D. Shared neural coding for social hierarchy and reward value in primate amygdala. Nat. Neurosci. 21, 415–423 (2018).

    Article  CAS  Google Scholar 

  29. Dunbar, R. I. M. The social brain hypothesis. Evol. Anthropol. 6, 178–190 (1998).

    Article  Google Scholar 

  30. Myllyneva, A., Ranta, K. & Hietanen, J. K. Psychophysiological responses to eye contact in adolescents with social anxiety disorder. Biol. Psychol. 109, 151–158 (2015).

    Article  Google Scholar 

  31. Bickart, K. C., Wright, C. I., Dautoff, R. J., Dickerson, B. C. & Barrett, L. F. Amygdala volume and social network size in humans. Nat. Neurosci. 14, 163–164 (2011).

    Article  CAS  Google Scholar 

  32. Sallet, J. et al. Social network size affects neural circuits in macaques. Science 334, 697–700 (2011).

    Article  ADS  CAS  Google Scholar 

  33. Dal Monte, O., Chu, C. C. J., Fagan, N. A. & Chang, S. W. C. Specialized medial prefrontal–amygdala coordination in other-regarding decision preference. Nat. Neurosci. 23, 565–574 (2020).

    Article  CAS  Google Scholar 

  34. Grabenhorst, F., Baez-Mendoza, R., Genest, W., Deco, G. & Schultz, W. Primate amygdala neurons simulate decision processes of social partners. Cell 177, 986–998 (2019).

    Article  CAS  Google Scholar 

  35. Allsop, S.A. et al. Corticoamygdala transfer of socially derived information gates observational learning. Cell 173, 1329–1342 (2018).

    Article  CAS  Google Scholar 

  36. Li, D., Babcock, J. & Parkhurst, D. J. openEyes: a low-cost head-mounted eye-tracking solution. In Proc. 2006 Symposium on Eye Tracking Research & Applications 95–100 (ACM, 2006).

  37. Mitz, A. R., Chacko, R. V., Putnam, P. T., Rudebeck, P. H. & Murray, E. A. Using pupil size and heart rate to infer affective states during behavioral neurophysiology and neuropsychology experiments. J. Neurosci. Methods 279, 1–12 (2017).

    Article  Google Scholar 

  38. Meyers, E. M., Freedman, D. J., Kreiman, G., Miller, E. K. & Poggio, T. Dynamic population coding of category information in inferior temporal and prefrontal cortex. J. Neurophysiol. 100, 1407–1419 (2008).

    Article  Google Scholar 


Acknowledgements

We thank Y. Kfir for scientific and technical advice; E. Kahana and N. Samuel for medical and surgical procedures; D. Goldin for engineering design; and E. Furman-Haran and F. Attar for MRI procedures. This work was supported by ISF 2352/19 and ERC-2016-CoG 724910 grants to R. Paz.

Author information


Contributions

R. Pryluk and R. Paz designed the study. R. Pryluk performed all experiments. Y.S., A.H.T. and A.M. contributed to experiments. R. Pryluk developed the methods and analysed the data. A.M. and D.F. contributed to data analysis and edited the manuscript. R. Pryluk and R. Paz wrote the manuscript.

Corresponding author

Correspondence to Rony Paz.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Nature thanks Jean-René Duhamel, Ziv M. Williams and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data figures and tables

Extended Data Fig. 1 Differential behavioural response to EC and NEC.

a, Same format as in Fig. 1f, but for the all-shutter ROI (and not just the face ROI). The monkeys looked at the face and eyes ROIs mainly during the interactions with the human intruder. Left, the gaze density during all the sessions. b, Same format as in Fig. 1g (left), but aligned to the first time the monkeys looked at the ROI of the intruder’s eyes in each trial separately. c, Same format as in Fig. 1g (right), separately for each monkey.

Extended Data Fig. 2 Extracting differences in facial expression.

a, Examples of three original frames with different expressions, corresponding to the schematic in Fig. 1h. b, For every recording session, we averaged all frames from the baseline period to obtain a mean image. Baseline was taken over the period before any trial, when the monkey was alone in the room with a closed shutter. c, An example of a frame during the EC interaction. d, The mean frame (b) is subtracted from the frame in c during the interaction to obtain a ‘diff’ (or delta) image. Three ROIs were defined manually for every day: upper face, ears and lower face. e, The r.m.s. of every ROI is calculated. Data are mean ± s.e.m. The differences between EC and NEC in the upper part of the face are shown; the other parts and ROIs are shown in Fig. 1. The black line represents a significant difference; P < 0.05, two-sided Student’s t-test; n = 1,480 NEC trials and 1,628 EC trials.
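The frame-differencing procedure described above can be sketched as follows. This is a minimal illustration, not the authors' code; the function name, array shapes and ROI coordinates are all hypothetical:

```python
import numpy as np

def roi_rms_change(baseline_frames, trial_frame, rois):
    """r.m.s. change per ROI between a trial frame and the mean baseline frame.

    baseline_frames: (n_frames, H, W) array from the alone/closed-shutter period.
    trial_frame:     (H, W) frame captured during the interaction.
    rois:            {name: (row0, row1, col0, col1)} manually defined regions.
    """
    mean_frame = baseline_frames.mean(axis=0)        # neutral-expression image
    diff = trial_frame.astype(float) - mean_frame    # the 'diff' (delta) image
    out = {}
    for name, (r0, r1, c0, c1) in rois.items():
        patch = diff[r0:r1, c0:c1]
        out[name] = np.sqrt(np.mean(patch ** 2))     # r.m.s. within the ROI
    return out
```

Per-trial r.m.s. values computed this way can then be averaged within condition (EC versus NEC) and compared with a t-test, as in panel e.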

Extended Data Fig. 3 Reversing valence directionality from NEC–EC to aversive–appetitive.

a, Same format as in Fig. 3a, b. The population decoding accuracy is shown when training on eye gaze (NEC versus EC) and testing on valence (aversive versus appetitive), using CS-related activity. Data are mean ± s.d.; n = 1,000 bootstrap replicates, n = 203 amygdala neurons. b, Same as in a but using US-related activity.
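The cross-condition decoding logic (train on one distinction, test on another) can be illustrated with a simple nearest-centroid decoder. This is a sketch of the general technique only; the paper's decoder, trial structure and bootstrap procedure are more elaborate, and all names here are illustrative:

```python
import numpy as np

def cross_decode(train_X, train_y, test_X, test_y, n_boot=1000, seed=0):
    """Train a nearest-centroid decoder on one label pair (e.g. NEC vs EC)
    and test it on another (e.g. aversive vs appetitive).

    train_X, test_X: (n_trials, n_neurons) population activity.
    train_y, test_y: binary labels (0/1) for each trial.
    Returns the bootstrap mean and s.d. of test accuracy.
    """
    rng = np.random.default_rng(seed)
    c0 = train_X[train_y == 0].mean(axis=0)          # class centroids from
    c1 = train_X[train_y == 1].mean(axis=0)          # the training condition
    pred = (np.linalg.norm(test_X - c1, axis=1)
            < np.linalg.norm(test_X - c0, axis=1)).astype(int)
    correct = pred == test_y
    # bootstrap over test trials to estimate accuracy mean +/- s.d.
    accs = np.array([correct[rng.integers(0, len(correct), len(correct))].mean()
                     for _ in range(n_boot)])
    return accs.mean(), accs.std()
```

Above-chance accuracy under this scheme indicates that the population axis separating the training pair also separates the test pair, which is the sense in which the gaze and valence codes are "shared".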

Extended Data Fig. 4 Single-neuron activity across conditions.

a, If overall activity drives the successful decoding in the US epoch, we expected to find an overall change in the firing rate (increase or decrease) for gaze and US valence. Indeed, we found that in the US epoch there were more valence-positive neurons (increased firing rate in response to the air puff) in the amygdala, and more gaze-positive neurons (increased firing rate in response to EC). Inset graphs show mean ± s.e.m.; ***P < 1 × 10−3; Z-test; n = 203 amygdala neurons, n = 356 ACC neurons. b, Decoding accuracy with and without neurons that encode gaze. Black and red lines represent the mean and median, respectively; n = 203 amygdala neurons, n = 1,000 bootstrap replicates.

Extended Data Fig. 5 Decoding with trial-based alignment to shutter opening.

a, Same format as in Fig. 4h–j. Population decoding accuracy for real and shuffled amygdala neurons. Black and red lines represent the mean and median, respectively; n = 203 amygdala neurons. b, Same as in a for ACC activity. n = 356 ACC neurons. c, Cumulative distribution of the difference in decoding accuracy between real and shuffled neurons. ***P < 1× 10−3; two-sample Kolmogorov–Smirnov test; n = 203 amygdala neurons.

Extended Data Fig. 6 Behavioural differences between EC and NEC do not underlie neural findings.

a, An example of vocalizations during one HIT trial, measured using a microphone placed in close proximity to the monkey (Methods). Inset, the proportion of trials in which vocalizations occurred. Vocalizations occurred in a very small proportion of trials, and this proportion was similar across EC and NEC trials. χ2 test; P = 0.88; n = 1,738 NEC trials and 1,807 EC trials. Owing to the low number of vocalizations, we were not able to characterize different types of vocalizations. In addition, we repeated the analyses after removing trials during which vocalizations occurred, and the main results were unchanged. b, An example of movement in one trial in response to the human intruder, measured using an accelerometer attached to the chair of the monkey (Methods). Movement also occurred during a small number of trials, similarly across EC and NEC trials. In addition, we repeated the analyses after removing these trials, and the main results were unchanged. c, The overall change in facial expressions between EC and NEC (as in Fig. 1i). The r.m.s. of the image change over the whole face (main) and over only the lower half of the face (inset), compared to the neutral expression obtained by averaging over the baseline period when the monkey was alone (Methods). Data are mean ± s.e.m. There is a significant difference (P < 0.05); two-sided Student’s t-test; n = 1,703 NEC trials and 1,765 EC trials. d, Same as in c after applying the thinning method (iteratively selecting trials to obtain a similar distribution of behaviour across EC and NEC; Methods). The same method was also applied to eye movements. e, Decoding accuracy using only trials with similar behaviour across EC and NEC, taken after thinning as described in d. The results remain the same (compare to Fig. 4h). Red and black lines indicate the median and mean, respectively; n = 203 amygdala neurons; n = 1,000 bootstrap replicates.
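The thinning idea, subsampling trials so that a behavioural measure has a matched distribution across conditions, can be sketched as follows. This is an assumption-laden illustration of the general approach (histogram matching by bin), not the authors' implementation:

```python
import numpy as np

def thin_to_match(vals_a, vals_b, bins=10, seed=0):
    """Subsample two sets of per-trial behaviour values (e.g. facial-movement
    r.m.s. in EC vs NEC trials) so that both retain equal trial counts in
    every histogram bin. Returns kept indices into each set."""
    rng = np.random.default_rng(seed)
    edges = np.histogram_bin_edges(np.concatenate([vals_a, vals_b]), bins=bins)
    keep_a, keep_b = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        ia = np.flatnonzero((vals_a >= lo) & (vals_a < hi))
        ib = np.flatnonzero((vals_b >= lo) & (vals_b < hi))
        n = min(len(ia), len(ib))                  # equal counts per bin
        keep_a.extend(rng.choice(ia, n, replace=False))
        keep_b.extend(rng.choice(ib, n, replace=False))
    return np.array(keep_a), np.array(keep_b)
```

Repeating the decoding on only the kept trials tests whether residual behavioural differences, rather than the social manipulation itself, drive the neural result.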

Extended Data Fig. 7 Consistency across stimulus saliency (no within-day adaptation).

a–d, Decoding accuracy was divided into the first and second half of trials, and similar results were obtained. The presentation is a merged format of Fig. 4h, i. CS-related activity (a, b) or US-related activity (c, d) in the first half (a, c) and second half (b, d) of trials is shown. Red and black lines indicate the median and mean, respectively; n = 203 amygdala neurons, n = 356 ACC neurons; n = 1,000 bootstrap replicates.

Extended Data Fig. 8 Neuronal modulation.

a, Left, we divided the amygdala neurons into three groups: the first contains neurons that increased their firing rate (FR) in response to gaze and valence (61/203, positive β values in Fig. 4a); the second group decreased FR in response to both gaze and valence (65/203, negative β values in Fig. 4a); and the third group increased FR in response to one condition and decreased in response to the other (77/203). For the first two groups, the decoding accuracy of valence based on gaze (same analysis as in Fig. 4h for CS-related activity) was significantly higher than chance, indicating that the overall result reported in the main text is based on both increases and decreases in FR. Right, same but for ACC neurons. Red and black lines indicate the median and mean, respectively. b, Amygdala neurons were sorted by degree of modulation (magnitude of βgaze × βvalence), and the decoding accuracy (mean) and its variance were recalculated for increasing group sizes (that is, the 10 neurons with the highest modulation, then 20, and so on). These values were compared to randomly chosen groups of similar size (green inset; notice the linear increase). The decoding accuracy increased until reaching a group size of 120–130 neurons, which is the number of neurons in the first two groups from a (those that either increased or decreased their firing rate, but did not show mixed activity). Bottom, the proportion of neurons from the two groups; both groups contributed to the increased accuracy. These results further support the conclusion that the shared neural mechanisms are not due only to an increased firing rate, as an indication of saliency or alertness. n = 203 amygdala neurons.
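The modulation-sorting step in panel b can be sketched in a few lines. The β arrays and function name below are illustrative, not from the authors' code:

```python
import numpy as np

def modulation_order(beta_gaze, beta_valence):
    """Order neurons by the magnitude of beta_gaze * beta_valence, so that
    decoding can be recomputed on the 10, 20, ... most-modulated neurons
    and compared to randomly chosen groups of the same size."""
    score = np.abs(beta_gaze * beta_valence)
    return np.argsort(score)[::-1]   # neuron indices, most modulated first
```

Slicing `modulation_order(...)[:k]` for increasing k then gives the nested neuron groups on which decoding accuracy is recomputed.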

Extended Data Fig. 9 Neurons encode species, but this coding is not shared with the valence of CS.

a, We included monkey-intruder blocks (top) in a similar way to the HIT trials (bottom). The same neurons reported in the main analysis were recorded during the monkey–monkey interactions. For each recording session, on average two (out of six) monkeys served as intruders. All of the monkeys had lived together for several years. b, Neurons in the amygdala (n = 203), as well as in the ACC (n = 356), code for species and can differentiate between human and monkey intruders. Moreover, neurons differentiate between NEC human trials and monkey-intruder trials. Data are mean and s.e.m. c, In contrast to the findings in Fig. 4a, there was no significant correlation (Pearson’s correlation, r = 0.05, P = 0.45, n = 203) between βspecies and βCSvalence, strongly arguing against a correlated-selectivity mechanism between species and CS. d, Decoding accuracy of CS valence after training the decoder to differentiate species, was not different from chance level and significantly smaller than the decoding accuracy of CS valence based on gaze. Red and black lines indicate the median and mean, respectively. n = 203 amygdala neurons; n = 1,000 bootstrap replicates. e, Differences in heart-rate variability between monkey-intruder and NEC trials (as shown for EC and NEC trials (Fig. 1k)). *P < 0.05; two-sided Student’s t-test; n = 1,703, 1,765 and 1,620 trials in NEC, EC and monkey trials. f, Despite differences in heart-rate variability (e), the findings in d remained similar when using either only EC or only NEC trials of the human intruder (n = 203 amygdala neurons). Red and black lines indicate the median and mean, respectively.

Extended Data Fig. 10 NEC trials are different from neutral trials.

a, We included neutral trials, in which the opening of the shutter (CS) was followed by no outcome. b, The heart rate was significantly lower in neutral trials compared to all other types of trial and, specifically, it was lower than in NEC trials. Top left, difference in heart rate, same as in Fig. 1j. Top right, difference in heart rate in the control days that included neutral trials, showing the same trend for all types of trial and no modulation for neutral trials. Together, these results indicate that the NEC trials were not salience-free, but rather highly salient in a different manner than the EC trials. n = 1,703, 1,765, 1,620, 1,352 and 712 trials for NEC, EC, monkey, reward and air-puff trials, respectively. Data are mean ± s.e.m.; ***P < 1 × 10−3; two-sided Student’s t-test.



About this article


Cite this article

Pryluk, R., Shohat, Y., Morozov, A. et al. Shared yet dissociable neural codes across eye gaze, valence and expectation. Nature 586, 95–100 (2020). https://doi.org/10.1038/s41586-020-2740-8

