Emotions are one of the most intriguing products of brain function, yet we still lack a mechanistic understanding of how emotions arise in the brain [1]. One of the major hindrances to studying the neuronal mechanisms of emotion is the difficulty of linking observable behavior with internal states and ongoing neural activity [1].

In a recent study, we employed machine-vision and machine-learning approaches to investigate facial expressions of emotions and their neuronal correlates in mice [2]. Our work built on prior evidence that mice use their orofacial musculature to react to sensory events such as tastants [3] or pain [4]. To assess facial expressions objectively and to detect previously uncharacterized changes, we recorded close-up videos of head-fixed mice and extracted features from video frames using “histograms of oriented gradients” (HOG), a machine-vision technique that represents the statistics of local image features. This allowed us to quantitatively compare, in an unsupervised manner, the facial expressions of mice reacting to emotionally salient events using hierarchical clustering. We next asked whether we could separate facial expressions exhibited upon diverse emotionally salient events into distinct categories by reducing the dimensionality of our data via principal component analysis (PCA) followed by t-distributed stochastic neighbor embedding (t-SNE). This approach separated facial expressions into discrete emotion categories.
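A minimal sketch of this kind of pipeline is given below, assuming Python with scikit-image, SciPy, and scikit-learn; the function names and all parameter settings (e.g., `pixels_per_cell`, the number of principal components) are illustrative and not the settings used in the study.

```python
import numpy as np
from skimage.feature import hog
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

def frame_to_hog(gray_frame):
    """Describe one grayscale video frame by a HOG vector
    (statistics of local gradient orientations)."""
    return hog(
        gray_frame,
        orientations=8,            # illustrative parameter choices,
        pixels_per_cell=(32, 32),  # not the study's settings
        cells_per_block=(1, 1),
        feature_vector=True,
    )

def embed_expressions(frames, n_clusters=5):
    """`frames`: array of grayscale face images, shape (n_frames, height, width)."""
    hogs = np.stack([frame_to_hog(f) for f in frames])

    # Unsupervised comparison: hierarchical (Ward) clustering of HOG vectors.
    labels = fcluster(linkage(hogs, method="ward"),
                      t=n_clusters, criterion="maxclust")

    # Dimensionality reduction: PCA first, then t-SNE for a 2-D embedding
    # in which frames from distinct emotion events can separate.
    pcs = PCA(n_components=30).fit_transform(hogs)
    embedding = TSNE(n_components=2, perplexity=30,
                     random_state=0).fit_transform(pcs)
    return hogs, labels, embedding
```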

Further, we were able to train a random forest classifier, a supervised machine-learning algorithm, to distinguish different facial expressions across mice with >90% accuracy. We next created “prototypical faces”, which enabled us to resolve the intensity of each emotion at millisecond timescales and to confirm that facial expressions corresponded to emotional states rather than reflex-like reactions. Indeed, exploiting the quantitative nature of our machine-vision approach allowed us to demonstrate that facial expressions revealed core properties of emotion such as intensity, persistence, flexibility, and valence [1]. Furthermore, the same facial expressions that resulted from external sensory triggers were also evoked by optogenetic manipulations in emotion-relevant brain circuits. Finally, we aligned facial tracking with neural activity recordings via two-photon calcium imaging and identified single neurons in the insular cortex whose activity closely correlated with specific emotional facial expressions.
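As a rough illustration of the supervised step and the “prototypical face” idea, the sketch below (again Python/scikit-learn) trains a random forest on labelled HOG vectors and tracks, frame by frame, how similar each face is to an emotion prototype; the helper names and the use of Pearson correlation as the similarity measure are assumptions made for illustration, not a description of the study's exact procedure.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def classify_expressions(hogs, labels):
    """Supervised classification of facial expressions from HOG vectors.

    `labels` could, for example, encode the stimulus identity of each frame."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, hogs, labels, cv=5)  # held-out accuracy
    return clf.fit(hogs, labels), scores.mean()

def prototype_similarity(hogs, prototype_hog):
    """Frame-by-frame similarity to a 'prototypical face'.

    The prototype is an average HOG descriptor for one emotion; Pearson
    correlation per frame yields a time-resolved, intensity-like trace
    (an illustrative choice of similarity measure)."""
    p = prototype_hog - prototype_hog.mean()
    x = hogs - hogs.mean(axis=1, keepdims=True)
    return (x @ p) / (np.linalg.norm(x, axis=1) * np.linalg.norm(p))
```

Such a similarity trace can then be aligned with simultaneously recorded neural activity (e.g., calcium-imaging traces) to ask which neurons track a given facial expression over time.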

Our results establish a framework for quantitative and objective assessments of distinct emotion features and categories in mice, one of the most prevalent model organisms in neuroscience. Machine-learning approaches such as those used in our study hold promise for unraveling the neuronal underpinnings of emotion processing and for better defining emotion states, since they can identify and describe previously uncharacterized emotion-related changes at high temporal resolution. Such quantitative assessments of affective states further open new doors to psychopharmacological research investigating how substances affect emotional states. Insights into the nature and mechanisms of emotion processing are of utmost importance for the clinic, since many psychiatric disorders involve emotional dysfunction.

While facial expressions constitute only one aspect of affective states, the current surge of supervised and unsupervised machine-learning approaches [5, 6] presents an unprecedented opportunity to create multi-dimensional models of affect by combining accurately tracked changes in animal posture, behavioral patterns, and physiology with large-scale neural recordings. Although the relationships between these distinct parameters are complex, artificial-intelligence approaches hold promise for reducing this high dimensionality and thus for overcoming current limitations in creating mechanistic hypotheses of how the brain generates emotion.

Funding and disclosure

This work was supported by the Max Planck Society and the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation program (ERC2017-STG, grant agreement no. 758448 to N.G.). Unfortunately, we are unable to comprehensively cite all relevant literature due to space limitations and would thus like to thank colleagues across the fields of ethology, computational biology, artificial intelligence and affective neuroscience for inspiration. The authors declare no competing interests.