Abstract
In recent years, numerous methods have emerged to automate the quantification of animal behavior at a resolution not previously imaginable. This has opened up a new field of computational ethology and will, in the near future, make it possible to quantify nearly everything an animal does as it navigates its environment. The importance of improving the techniques with which we characterize behavior is reflected in the growing recognition that understanding behavior is an essential (or even prerequisite) step in pursuing neuroscience questions. The use of these methods, however, is not limited to studying behavior in the wild or in strictly ethological settings. Modern tools for behavioral quantification can be applied to the full gamut of approaches that have historically been used to link brain to behavior, from psychophysics to cognitive tasks, augmenting those measurements with rich descriptions of how animals navigate those tasks. Here we review recent technical advances in quantifying behavior, particularly methods for tracking animal motion and for characterizing the structure of those dynamics. We discuss open challenges that remain for behavioral quantification and highlight promising future directions, with a strong emphasis on emerging approaches in deep learning, the core technology that has enabled this field's markedly rapid pace of progress. We then discuss how quantitative descriptions of behavior can be leveraged to connect brain activity to animal movements, with the ultimate goal of resolving the relationship among neural circuits, cognitive processes and behavior.
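The pipeline sketched in the abstract, from tracked animal motion to a description of its dynamical structure, can be illustrated with a deliberately minimal example. The sketch below is not any specific published method; it assumes only that a pose-estimation tool has already produced per-frame (x, y) coordinates for one keypoint, and the function names and the speed threshold are purely illustrative.

```python
# Toy sketch of a behavioral-quantification step: given per-frame
# keypoint positions (the typical output of a pose-estimation tool),
# derive a speed feature and merge frames into coarse behavioral bouts.
# All names and the threshold are illustrative assumptions.
from math import hypot

def speeds(track):
    """Frame-to-frame speed of a single keypoint, given (x, y) per frame."""
    return [hypot(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(track, track[1:])]

def segment_bouts(track, thresh=1.0):
    """Label each frame transition 'move' or 'still', then merge
    consecutive identical labels into (label, start, end) bouts."""
    labels = ['move' if s > thresh else 'still' for s in speeds(track)]
    bouts, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            bouts.append((labels[start], start, i))
            start = i
    return bouts

# A short synthetic trajectory: jitter, a rapid displacement, then rest.
track = [(0, 0), (0, 0.1), (0, 0.2), (3, 0), (6, 0), (6, 0)]
print(segment_bouts(track))
# → [('still', 0, 2), ('move', 2, 4), ('still', 4, 5)]
```

Real pipelines replace the hand-set threshold with unsupervised clustering or state-space models over many keypoints, but the shape of the computation, continuous tracks in, discrete behavioral labels out, is the same.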
Acknowledgements
T.D.P. is supported by NSF GRFP (DGE-1148900) and the Princeton Porter Ogden Jacobus Fellowship. M.M. and J.W.S. are supported by an NIH BRAIN Initiative R01 (R01 NS104899) and an NSF Physics Frontier Center grant (NSF PHY-1734030). M.M. is also supported by an HHMI Faculty Scholar award and an NIH NINDS R35 research program award. We thank B. Cowley and J. Pillow for very helpful comments and suggestions.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Peer review information Nature Neuroscience thanks Ann Kennedy, Pavan Ramdya, and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Pereira, T.D., Shaevitz, J.W. & Murthy, M. Quantifying behavior to understand the brain. Nat Neurosci 23, 1537–1549 (2020). https://doi.org/10.1038/s41593-020-00734-z