
Population coding of affect across stimuli, modalities and individuals

Abstract

It remains unclear how the brain represents external objective sensory events alongside our internal subjective impressions of them—affect. Representational mapping of population activity evoked by complex scenes and basic tastes in humans revealed a neural code supporting a continuous axis of pleasant-to-unpleasant valence. This valence code was distinct from low-level physical and high-level object properties. Although ventral temporal and anterior insular cortices supported valence codes specific to vision and taste, both the medial and lateral orbitofrontal cortices (OFC) maintained a valence code independent of sensory origin. Furthermore, only the OFC code could classify experienced affect across participants. The entire valence spectrum was represented as a collective pattern in regional neural activity as sensory-specific and abstract codes, whereby the subjective quality of affect can be objectively quantified across stimuli, modalities and people.


Figure 1: Parametric modulation analysis (univariate) for independent ratings of positive and negative valence.
Figure 2: Representational geometry of multi-voxel activity patterns in EVC, VTC and OFC.
Figure 3: Population coding of visual, object and affect properties of visual scenes.
Figure 4: Region-specific population coding of visual features, object animacy and valence in visual scenes.
Figure 5: Visual, gustatory and cross-modal affect codes.
Figure 6: Cross-participant classification of items and affect.


Acknowledgements

We thank T. Schmitz, M. Taylor, D. Hamilton and K. Gardhouse for technical collaboration and discussion. This work was funded by Canadian Institutes of Health Research grants to A.K.A. J.C. was supported by the Japan Society for the Promotion of Science Postdoctoral Fellowships for Research Abroad (H23).

Author information


Contributions

J.C. and A.K.A. designed the experiments. J.C. and D.H.L. built the experimental apparatus and performed the experiments. J.C. analyzed the data. J.C., D.H.L., N.K. and A.K.A. wrote the paper. N.K. and A.K.A. supervised the study.

Corresponding authors

Correspondence to Junichi Chikazoe or Adam K Anderson.

Ethics declarations

Competing interests

The authors declare no competing financial interests.

Integrated supplementary information

Supplementary Figure 1 Distribution of properties.

(a) Distribution of individual low-level visual features (saliency, luminance, hue, number of edges and contrast). Data were sorted by a standardized metric (z score). (b) Distributions of visual features, animacy and valence across 13 bins.

Supplementary Figure 2 Multidimensional scaling (MDS) plots of early visual cortex (EVC), ventral temporal cortex (VTC), and orbitofrontal cortex (OFC) with optimally fit property scales.

Color denotes scores of visual features (top), animacy (middle) and valence (bottom). The correlation between projections on the best-fitting axis (line in each MDS plot) and property values is shown below each MDS plot. P values for the visual-feature correlations in EVC, VTC and OFC are 0.00001, 0.45 and 0.02, respectively; for animacy, 0.16, 7.7 × 10^−23 and 0.0000002; and for valence, 0.46, 0.001 and 0.00000005. n = 128 trials.
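The "best-fitting axis" step above can be sketched as a least-squares fit: regressing the property onto the (centered) MDS coordinates yields the direction whose projections correlate maximally with that property. This is a minimal illustration of the idea, not the authors' code; the function name and 2-D coordinate assumption are ours.

```python
import numpy as np

def best_fitting_axis(coords, prop):
    """Find the direction in MDS space whose projections best track a property.

    The least-squares fit of the centered property onto the centered
    coordinates maximizes, among all linear axes, the correlation between
    projections and property values. Returns (unit axis, correlation).
    """
    X = coords - coords.mean(axis=0)                 # center coordinates
    y = np.asarray(prop, dtype=float) - np.mean(prop)  # center property values
    w, *_ = np.linalg.lstsq(X, y, rcond=None)        # regression weights
    axis = w / np.linalg.norm(w)                     # normalize to a unit axis
    proj = X @ axis                                  # project items onto the axis
    return axis, np.corrcoef(proj, y)[0, 1]
```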

Supplementary Figure 3 GLM decomposition of multivoxel activity patterns.

The relationship between distances in the visual, animacy and valence predictors and activation-pattern similarity was examined by GLM regression, with rank-ordered correlations as the dependent variable. The resulting GLM coefficients, multiplied by −1, provided a distance correspondence index (DCI) between activity patterns and property types, which was subjected to one-sample t-tests across participants.
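The DCI computation can be sketched as an ordinary least-squares regression of pattern correlations on pairwise property distances, with each fitted coefficient negated. This is a schematic sketch under our own assumptions about the data layout (one row per trial pair), not the authors' implementation.

```python
import numpy as np

def distance_correspondence_index(pattern_corr, predictors):
    """Compute a DCI per property from pairwise data.

    pattern_corr: (n_pairs,) rank-ordered correlations between trial patterns.
    predictors:   (n_pairs, n_props) property distances for each trial pair.
    Fits a GLM with an intercept and returns each beta multiplied by -1, so a
    positive DCI means that more similar property values go with more similar
    activity patterns.
    """
    X = np.column_stack([predictors, np.ones(len(pattern_corr))])  # add intercept
    betas, *_ = np.linalg.lstsq(X, pattern_corr, rcond=None)
    return -betas[:-1]  # drop the intercept, negate the distance coefficients
```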

Supplementary Figure 4 Construction of representational similarity matrices (RSM) for each property.

Representational similarity matrices were constructed to visualize how each distinct variable (visual features, animacy and valence) was associated with multivoxel activation patterns. Illustrated are the steps taken to create a valence RSM. First, the other properties (visual features and animacy) were regressed out of the rank-ordered correlations, along with regressors of no interest. The residual correlations, now predicted only by valence, were then sorted based on valence scores (13 × 13). The ideal valence RSM is illustrated at the bottom, where the main diagonal (top left to bottom right) indicates greater activity-pattern correlations with valence similarity and the off-diagonal indicates lower activity-pattern correlations with valence dissimilarity.
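The two steps above (regress out nuisance properties, then bin the residual pair correlations by valence) can be sketched as follows. This is an illustrative reconstruction assuming one row per trial pair; the function and variable names are ours.

```python
import numpy as np

def valence_rsm(pair_corr, pair_bins, nuisance, n_bins=13):
    """Build a binned valence RSM from pairwise pattern correlations.

    pair_corr: (n_pairs,) correlations between trial activity patterns.
    pair_bins: (n_pairs, 2) valence-bin index (0..n_bins-1) of each trial.
    nuisance:  (n_pairs, k) predictors to regress out (e.g. visual, animacy).
    Step 1: regress the nuisance predictors (plus intercept) out of the
    correlations. Step 2: average the residuals into bin x bin cells.
    """
    X = np.column_stack([nuisance, np.ones(len(pair_corr))])
    beta, *_ = np.linalg.lstsq(X, pair_corr, rcond=None)
    resid = pair_corr - X @ beta                      # residual correlations
    sums = np.zeros((n_bins, n_bins))
    counts = np.zeros((n_bins, n_bins))
    for r, (i, j) in zip(resid, pair_bins):           # symmetrize as we go
        sums[i, j] += r; counts[i, j] += 1
        sums[j, i] += r; counts[j, i] += 1
    return np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)
```

In a region carrying a valence code, the resulting matrix should show higher values near the main diagonal (similar valence) than off it.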

Supplementary Figure 5 Representational similarity matrices by region and feature type.

Averaged activity pattern correlations in the EVC, VTC and OFC were rank-ordered with respect to visual features (top), animacy (middle), and valence (bottom).

Supplementary Figure 6 Multivariate searchlight results for individual visual features (saliency, luminance, hue, number of edges, and contrast).

Visual features were primarily coded by occipital, temporal, and parahippocampal cortices.

Supplementary Figure 7 Cross-participant classification of picture item representations in the VTC and OFC.

Each participant's trial × trial RSM of r coefficients was compared against a trial × trial RSM of r coefficients estimated from all other participants' RSMs, in a leave-one-out procedure. Each target picture was represented by 127 values relating it to all other picture trials (analogous to each picture being represented as coordinates in a 127-dimensional space). We then tested whether the target picture representation was more similar to its cross-participant estimate than to all other picture representations, with similarity computed as the correlation of the r coefficients. Classification performance was calculated as the percentage of successful pairwise comparisons (50% chance).
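The pairwise procedure above can be sketched directly from two RSMs: the held-out participant's and the group average. Each item is its RSM row (diagonal and the two compared items excluded), and item j beats item k if j's row correlates better with its own group-level row than with k's. A minimal sketch with illustrative names, not the authors' code:

```python
import numpy as np
from itertools import combinations

def cross_participant_item_classification(target_rsm, group_rsm):
    """Leave-one-out pairwise item classification from RSM rows.

    target_rsm: (n, n) RSM of the held-out participant.
    group_rsm:  (n, n) RSM estimated from all other participants.
    Item j is classified correctly against item k when j's row in the target
    RSM correlates more strongly with j's row than with k's row in the group
    RSM. Returns the fraction of correct pairwise comparisons (chance = 0.5).
    """
    n = target_rsm.shape[0]
    correct, total = 0, 0
    for j, k in combinations(range(n), 2):
        mask = np.ones(n, dtype=bool)
        mask[[j, k]] = False                      # compare over shared entries only
        tj = target_rsm[j][mask]
        sim_match = np.corrcoef(tj, group_rsm[j][mask])[0, 1]
        sim_other = np.corrcoef(tj, group_rsm[k][mask])[0, 1]
        correct += sim_match > sim_other
        total += 1
    return correct / total
```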

Supplementary Figure 8 Cross-participant classification of valence.

We used a leave-one-out procedure similar to that shown in Supplementary Fig. 7, this time comparing each target picture's valence representation to its cross-participant prediction, estimated by computing all other participants' positivity and negativity correlation matrices, which were combined across items into separate positive (7 × 7) and negative (7 × 7) similarity matrices. As an example, consider a target picture j, rated positive = 5, negative = 1 by one participant. It has 127 scores relating it to all other pictures, each rated with its own valence. The first of picture j's 127 scores, r(j,1), relates it to picture 1, which was rated positive = 2, negative = 2. These ratings (positive 5, 2; negative 1, 2) were looked up in the estimated positive and negative similarity matrices and the results averaged. This process was repeated for all of picture j's 127 scores. If the correlation of these scores was higher for picture j's valence than for another picture k's valence representation, the valence classification was counted as successful. Classification performance was calculated as the percentage of successful pairwise comparisons (50% chance).
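The lookup step in the worked example can be sketched as follows: for each other picture i, the predicted similarity to picture j is the average of the group-level positivity-matrix entry for the two positive ratings and the negativity-matrix entry for the two negative ratings. This is an illustrative sketch assuming integer ratings 1–7; the function name is ours.

```python
import numpy as np

def predicted_similarity(pos, neg, pos_sim, neg_sim, j):
    """Predict picture j's similarity to every other picture.

    pos, neg:         per-picture positivity and negativity ratings (1-7).
    pos_sim, neg_sim: (7, 7) group-level similarity matrices estimated from
                      all other participants.
    For each other picture i, look up pos_sim at (pos[j], pos[i]) and neg_sim
    at (neg[j], neg[i]) and average the two predictions.
    """
    others = [i for i in range(len(pos)) if i != j]
    return np.array([(pos_sim[pos[j] - 1, pos[i] - 1] +
                      neg_sim[neg[j] - 1, neg[i] - 1]) / 2 for i in others])
```

Correlating these predicted scores with picture j's 127 observed r coefficients, and comparing against the same correlation computed for a foil picture k, then yields the pairwise classification described above.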

Supplementary information

Supplementary Text and Figures

Supplementary Figures 1–8 and Supplementary Tables 1–5 (PDF 2659 kb)

Supplementary Methods Checklist (PDF 499 kb)

About this article

Cite this article

Chikazoe, J., Lee, D., Kriegeskorte, N. et al. Population coding of affect across stimuli, modalities and individuals. Nat Neurosci 17, 1114–1122 (2014). https://doi.org/10.1038/nn.3749

