
Review Article

Quantifying behavior to understand the brain

Abstract

Over the past several years, numerous methods have emerged to automate the quantification of animal behavior at a resolution not previously imaginable. This has opened up a new field of computational ethology and will, in the near future, make it possible to quantify, in near completeness, what an animal is doing as it navigates its environment. The importance of improving the techniques with which we characterize behavior is reflected in the emerging recognition that understanding behavior is an essential (or even prerequisite) step in pursuing neuroscience questions. The use of these methods, however, is not limited to studying behavior in the wild or in strictly ethological settings. Modern tools for behavioral quantification can be applied to the full gamut of approaches that have historically been used to link brain to behavior, from psychophysics to cognitive tasks, augmenting those measurements with rich descriptions of how animals navigate the tasks at hand. Here we review recent technical advances in quantifying behavior, particularly methods for tracking animal motion and characterizing the structure of the resulting dynamics. We discuss the open challenges that remain for behavioral quantification and highlight promising future directions, with a strong emphasis on emerging approaches in deep learning, the core technology that has enabled this field's markedly rapid pace of progress. We then discuss how quantitative descriptions of behavior can be leveraged to connect brain activity with animal movements, with the ultimate goal of resolving the relationship between neural circuits, cognitive processes and behavior.
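As a concrete, if highly simplified, illustration of the pose-estimation methods this review surveys: most deep learning trackers reduce keypoint localization to peak-finding on per-body-part confidence maps produced by a convolutional network. The sketch below shows only that generic decoding step; the array shapes, stride value and synthetic input are assumptions made for illustration and do not reproduce the implementation of any specific tool discussed in the review.

```python
# Minimal sketch of the decoding step shared by heatmap-based pose
# estimators: each body part gets a 2D confidence map, and its image
# coordinate is read out as the location of the map's peak.
# All shapes and parameters here are illustrative assumptions.
import numpy as np

def decode_keypoints(confidence_maps, stride=4):
    """confidence_maps: (n_parts, H, W) per-part confidence maps,
    typically emitted by the network at reduced spatial resolution."""
    n_parts, h, w = confidence_maps.shape
    flat = confidence_maps.reshape(n_parts, -1)
    peaks = flat.argmax(axis=1)                    # index of each map's maximum
    rows, cols = np.unravel_index(peaks, (h, w))   # peak (row, col) per part
    scores = flat.max(axis=1)                      # peak confidence per part
    xy = np.stack([cols, rows], axis=1) * stride   # map back to image pixels
    return xy, scores

maps = np.random.rand(8, 64, 64)   # stand-in for a network's output
coords, conf = decode_keypoints(maps)
print(coords.shape, conf.shape)    # (8, 2) (8,)
```

Real systems refine this readout (for example, with sub-pixel offsets or part-grouping for multiple animals), but the map-to-coordinate step above is the common core.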


Fig. 1: Tracking, from coarse to fine.
Fig. 2: Anatomy of pose estimation systems.
Fig. 3: Quantifying behavioral dynamics.
Fig. 4: Approaches to linking brain to behavior.
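Complementing the topic of Fig. 3, the sketch below illustrates, in miniature, the generic logic of unsupervised behavioral quantification: tracked poses are converted into egocentric per-frame features, which are then discretized into putative behavioral states. The synthetic pose array, the feature choices and the cluster count are all illustrative assumptions, not the method of any specific work reviewed here.

```python
# Minimal sketch of an unsupervised behavioral-quantification pipeline:
# pose keypoints -> egocentric features -> clustered "behavioral states".
# Inputs and parameters are illustrative assumptions only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_frames, n_parts = 5000, 8
# Stand-in for tracker output: (frames, body parts, x/y coordinates).
poses = rng.normal(size=(n_frames, n_parts, 2))

# Egocentric alignment: subtract the body centroid per frame so features
# reflect posture rather than the animal's location in the arena.
centered = poses - poses.mean(axis=1, keepdims=True)

# Per-frame feature vector: flattened posture plus overall movement speed.
speed = np.linalg.norm(np.diff(poses.mean(axis=1), axis=0), axis=1)
speed = np.concatenate([[0.0], speed])  # pad so lengths match
features = np.hstack([centered.reshape(n_frames, -1), speed[:, None]])

# Discretize the dynamics into a small set of putative behavioral states.
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))  # number of frames assigned to each state
```

Published methods differ in each stage (wavelet or autoregressive features, nonlinear embeddings, state-space models rather than k-means), but this feature-then-cluster skeleton captures the shared structure.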



Acknowledgements

T.D.P. is supported by NSF GRFP (DGE-1148900) and the Princeton Porter Ogden Jacobus Fellowship. M.M. and J.W.S. are supported by an NIH BRAIN Initiative R01 (R01 NS104899) and an NSF Physics Frontier Center grant (NSF PHY-1734030). M.M. is also supported by an HHMI Faculty Scholar award and an NIH NINDS R35 research program award. We thank B. Cowley and J. Pillow for very helpful comments and suggestions.

Author information

Corresponding author

Correspondence to Mala Murthy.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information: Nature Neuroscience thanks Ann Kennedy, Pavan Ramdya, and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Publisher's note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Pereira, T.D., Shaevitz, J.W. & Murthy, M. Quantifying behavior to understand the brain. Nat Neurosci 23, 1537–1549 (2020). https://doi.org/10.1038/s41593-020-00734-z

