Abstract
Accurate tracking and analysis of animal behavior is crucial for modern systems neuroscience. However, following freely moving animals in naturalistic, three-dimensional (3D) or nocturnal environments remains a major challenge. Here, we present EthoLoop, a framework for studying the neuroethology of freely roaming animals. Combining real-time optical tracking and behavioral analysis with remote-controlled stimulus-reward boxes, this system allows direct interactions with animals in their habitat. EthoLoop continuously provides close-up views of the tracked individuals and thus allows high-resolution behavioral analysis using deep-learning methods. The behaviors detected on the fly can be automatically reinforced either by classical conditioning or by optogenetic stimulation via wirelessly controlled portable devices. Finally, by combining 3D tracking with wireless neurophysiology we demonstrate the existence of place-cell-like activity in the hippocampus of freely moving primates. Taken together, we show that the EthoLoop framework enables interactive, well-controlled and reproducible neuroethological studies in large-field naturalistic settings.
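The closed-loop principle the abstract describes — track the animal, classify its behavior on the fly, and reinforce a target behavior — can be sketched as a minimal event loop. The function and callback names below are hypothetical stand-ins, not the authors' implementation:

```python
def run_closed_loop(frames, classify, reinforce, target="rearing"):
    """One pass over tracked frames: classify each close-up image and
    reinforce whenever the target behavior is detected.

    `frames` yields (position, image) pairs from the tracker;
    `classify` and `reinforce` stand in for the real pose classifier
    and the RECO-box / optogenetic trigger. Returns the number of
    reinforcements delivered."""
    count = 0
    for position, image in frames:
        if classify(image) == target:
            reinforce(position)  # e.g. activate the nearest reward box
            count += 1
    return count
```

In the real system the classification step must stay within a tight latency budget so that the reinforcement remains contingent on the ongoing behavior; this sketch only captures the control flow.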
Data availability
The MRI atlas for the mouse lemur is available at https://www.nitrc.org/projects/mouselemuratlas. Source data for Figs. 1 and 3–6 as well as source data for Extended Data Figs. 1–3, 6, 7 and 10 are provided. The original raw data containing all 3D trajectories, feedback signals and electrophysiology data are available only upon request owing to the large file sizes. Source data are provided with this paper.
Code availability
All code for tracking (GPU units and host machine), RECO boxes and real-time labeling of body parts is available at www.huberlab.org/EthoLoop/software. R code used for analysis is available upon request.
Acknowledgements
We express our gratitude to the members of the Huber laboratory for their support and discussions. We thank M. Perret and the animal caretakers at the Brunoy facility for their help with mouse lemur breeding, care and handling; H. Clamouze for constructing the maze; and M. Scharter for advice on implementing DeepLabCut. We thank P. Yger for advice on spike sorting, and L. Slimani and N. Liaudet for their help with the reconstruction of the electrode position. We thank Varta Microbattery for supplying the CP 1254 A3 coin batteries. We thank M. Prsa, G. Galiñanes and M. Long for their comments on the manuscript. This work was supported by the Vahabzadeh Foundation (A.N.), the Human Frontiers Science Program (D.H. and F.P., grant no. RGP0024/2016), the New York Stem Cell Foundation (D.H.) and the Swiss National Science Foundation/Deutsche Forschungsgemeinschaft (D.H., grant no. 310030E_190060). D.H. is a New York Stem Cell Foundation-Robertson Investigator.
Author information
Authors and Affiliations
Contributions
A.N. and D.H. conceptualized the EthoLoop system. A.N. and D.H. designed the tracking and close-up hardware. A.N. wrote all software for acquisition, tracking, closed-loop control and RECO boxes. A.N. ran all experiments and analyzed all the data. D.H. oversaw data analysis. A.N., D.H. and S.P. designed the RECO box. R.Z. carried out the surgeries and ran the electrophysiological recording experiments. A.H. reconstructed the electrode trajectories. C.P.S. provided the mice and advice for the optogenetic experiments. Y.O. participated in the limb tracking. F.P. and F.A. provided expertise, guidance and resources related to the mouse lemurs. A.H. provided the 3D lattice maze and advice regarding the computer tomography scan. G.R. provided expertise for the tracking software. D.H. and A.N. wrote the manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Peer review information Nina Vogt was the primary editor on this article and managed its editorial process and peer review in collaboration with the rest of the editorial team.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Extended data
Extended Data Fig. 1 Foraging mouse lemur in the lattice maze with 5 RECO-boxes.
a, Five RECO-boxes (colored cubes) were spatially distributed at different locations of the lattice maze and activated in different sequences. b,c, Ten minutes of the 3D trajectory of a mouse lemur with the RECO-boxes activated in circular (b) or non-circular (c) order. The trajectory is colored according to the currently activated RECO-box. e,f, Speed of locomotion (color-coded) under dim light (e) and in complete darkness (f). g,h, Both the number of jumps (g) (12 time intervals for 1 mouse lemur; two-sided Wilcoxon signed-rank test, *P = 0.0207) and the overall speed (h) were reduced in the absence of light (blue) compared with the dimly lit condition (red; two-sided two-sample Kolmogorov-Smirnov test, ***P < 2.2 × 10−16). The box plot in (g) represents the median and the upper and lower quartiles.
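The two comparisons in this caption — a paired Wilcoxon signed-rank test on jump counts and a two-sample Kolmogorov-Smirnov test on speed distributions — can be reproduced in outline with SciPy. The numbers below are synthetic stand-ins, not the paper's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Paired jump counts for 12 time intervals: dim light vs. darkness
# (synthetic values standing in for the single-animal data)
jumps_light = rng.poisson(8, size=12)
jumps_dark = jumps_light - rng.integers(1, 4, size=12)  # fewer jumps in the dark
w_stat, p_jumps = stats.wilcoxon(jumps_light, jumps_dark, alternative="two-sided")

# Overall speed distributions compared with a two-sided two-sample
# Kolmogorov-Smirnov test (synthetic speed samples, in m/s)
speed_light = rng.normal(0.50, 0.15, size=2000)
speed_dark = rng.normal(0.35, 0.15, size=2000)
ks_stat, p_speed = stats.ks_2samp(speed_light, speed_dark)
```

The signed-rank test is the appropriate choice here because the same animal contributes both conditions in each interval, whereas the KS test compares the full shape of the two unpaired speed distributions.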
Extended Data Fig. 2 Mouse place-conditioning in naturalistic conditions.
a, The behavioral arena was filled with tree branches and included a RECO-box (yellow cube). Mice were conditioned to visit one of three unmarked locations (colored spheres). b, Number of entries to conditioned places 1 and 2 during each session for all mice (N = 4). Error bars represent the standard error of the mean. c–f, 3D trajectories from the beginning, middle and end of all sessions for all four mice that underwent 3D place conditioning. The Day 3 trajectories for Mouse N3 shown in Fig. 3b are repeated here. The three conditioned locations are shown as green, red and purple spheres. The time to reach the criterion (50th entry to the conditioned place) is noted on the right.
Extended Data Fig. 3 Mouse place-conditioning in an open-field arena.
a, A flat open-field arena for freely moving mice containing a single RECO-box and differently shaped objects placed at the four corners. The mice were trained to visit one of the four corners of the arena; a different corner was reinforced in each session. b, Number of entries to the conditioned place for all four days of training. On all days the mice entered the conditioned place more frequently (N = 6 mice; one-way repeated-measures ANOVA; pairwise two-sample t-tests; Bonferroni correction for multiple comparisons; beginning versus end; Day 1: **P = 0.009; Day 2: **P = 0.008; Day 3: ***P = 2.9 × 10−7; Day 4: **P = 0.006). Error bars represent SEM. c–h, Original tracking traces from the beginning, middle and end of all sessions (four corners, illustrated with colored circles) for all six mice that underwent place conditioning. The time to reach the criterion (50th entry) is noted on the right.
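The pairwise-comparison step in this kind of analysis (paired t-tests with Bonferroni correction over three session phases) can be sketched as follows, using hand-made entry counts rather than the paper's data:

```python
import numpy as np
from scipy import stats

# Hypothetical entry counts for N = 6 mice at the beginning, middle
# and end of a session (illustrative values, not the paper's data)
beginning = np.array([2, 4, 1, 3, 5, 2])
middle = np.array([6, 9, 5, 8, 10, 7])
end = np.array([14, 18, 11, 16, 20, 13])

# Pairwise paired t-tests; the Bonferroni correction multiplies each
# p-value by the number of comparisons, capped at 1
pairs = {"beginning_vs_middle": (beginning, middle),
         "beginning_vs_end": (beginning, end),
         "middle_vs_end": (middle, end)}
p_adj = {name: min(1.0, stats.ttest_rel(a, b).pvalue * len(pairs))
         for name, (a, b) in pairs.items()}
```

`ttest_rel` is the paired (repeated-measures) t-test, matching the within-animal design: each mouse contributes one count per session phase.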
Extended Data Fig. 4 Trajectory of a mouse lemur during a place-conditioning experiment in the lattice maze.
a, Traces from a second conditioning experiment of a mouse lemur in the 3D lattice maze at the beginning, middle and end of a behavioral session. The blue sphere indicates the conditioned location. b, The number of entries to the conditioned place across the behavioral session. c, Entries to the conditioned place across the session.
Extended Data Fig. 5 Mouse operant-conditioning of automatically detected behavior in naturalistic environments.
a–c, Additional 3D trajectories of three mice during conditioning of automatically detected postures. The time to reach the criterion (50th rearing) is noted on the right.
Extended Data Fig. 6 Mouse operant-conditioning of automatically detected behavior in an open-field arena.
a, Open-field arena for mice with an elevated platform and a RECO-box. The color code indicates the height of the tracked marker on the neck of the mouse. b, Summary data across sessions (N = 3 mice). c–e, Trajectories of three mice during conditioning sessions in which the number of rearings increased. The time to reach the criterion (50th rearing) is noted on the right.
Extended Data Fig. 7 Place-conditioning using wireless and tethered optogenetic stimulation of VTA.
a, The red and green spheres indicate the conditioning locations in a naturalistic arena for mice using wireless optogenetics. b, Summary data from first-day place-conditioning sessions using wireless (3D, 2 mice) and tethered (2D, 2 mice) optogenetic VTA stimulation. The data from the wireless and tethered optogenetic experiments were grouped together based on the day on which the place-conditioning experiments were carried out; mice were naive on the first day. The number of entries to the conditioned place increased within a session (N = 4 mice; one-way repeated-measures ANOVA; F(2,6) = 10.916; main effect P = 0.01; pairwise two-sample t-tests; Bonferroni correction for multiple comparisons; beginning versus end, *P = 0.025; beginning versus middle, P = 0.53; n.s., not significant). Error bars represent SEM. c,d, Original trajectories of two place-conditioning sessions in the 3D environment using wireless optogenetics for two mice (the Day 2 trajectories in (d) are the same as those shown in Fig. 5c). The colored spheres (red and green) show the conditioned places. e,f, Original trajectories of two place-conditioning sessions for two mice in the open-field arena using tethered optogenetics. The colored circles (red and green) show the conditioned locations. The time to reach the criterion (50th entry to the conditioned place) is noted on the right.
Extended Data Fig. 8 Automatically reinforcing postures using wireless and tethered optogenetic VTA stimulation.
a, An arena covered with branches for mice, with an experimenter-defined area where rearing was conditioned (red sphere). 3D trajectories of an additional mouse during a session with optogenetic reinforcement of an automatically detected behavioral syllable (rearing, red arrows). b, Schematic representation of the information flow during optogenetic stimulation of VTA DA neurons in the 2D open-field arena. The steps are identical to Fig. 6a, except that the detection of a behavioral event triggers tethered optogenetic stimulation of VTA DA neurons. c, 3D trajectories of three mice during sessions in which an automatically detected behavioral syllable (rearing) was reinforced using tethered optogenetics in the 2D environment. The color code indicates the height of the mouse. The time to reach the criterion (100th rearing) is noted on the right.
Extended Data Fig. 9 Additional CA1 neurons with spatially restricted activity.
a, CA1 neuron showing increased activity preferentially on a single branch. b, CA1 neuron from a different session (but the same animal) showing increased activity near a feeder (gray square). c, CA1 neuron from the same session as (b) showing increased activity at two different locations.
Extended Data Fig. 10 Detailed analysis of a directional cell (Fig. 6).
This neuron showed the highest activity when moving on a thin branch toward the north of the maze. Each dot represents the average spike rate during a 13-cm segment. Kruskal-Wallis tests; main effect P = 1.1 × 10−14 for thin (red) and P = 0.007 for thick (blue) branches; Wilcoxon rank-sum tests for multiple comparisons; Holm adjustment; red: N = 32, N = 35 and N = 46 for North, South and Rest; North versus South, ***P < 10−8; North versus Rest, ***P < 10−8; South versus Rest, **P = 0.0018; blue: N = 27, N = 23 and N = 84 for North, South and Rest; North versus South, **P = 0.006; North versus Rest, *P = 0.03; South versus Rest, P = 0.14; n.s., not significant. Error bars represent the standard error of the mean.
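The analysis pattern in this caption — a Kruskal-Wallis omnibus test followed by pairwise Wilcoxon rank-sum tests with Holm step-down adjustment — can be sketched with SciPy. The firing rates below are synthetic; only the thin-branch group sizes match the caption:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic per-segment firing rates (Hz) grouped by travel direction,
# with group sizes matching the thin-branch counts in the caption
north = rng.normal(12.0, 2.0, size=32)
south = rng.normal(6.0, 2.0, size=35)
rest = rng.normal(7.0, 2.0, size=46)

# Omnibus Kruskal-Wallis test across the three direction groups
h_stat, p_main = stats.kruskal(north, south, rest)

# Pairwise Wilcoxon rank-sum tests with Holm step-down adjustment:
# sort raw p-values, multiply the i-th smallest by (m - i), and
# enforce monotonicity of the adjusted values
raw = sorted([("north_vs_south", stats.ranksums(north, south).pvalue),
              ("north_vs_rest", stats.ranksums(north, rest).pvalue),
              ("south_vs_rest", stats.ranksums(south, rest).pvalue)],
             key=lambda kv: kv[1])
adjusted, running = {}, 0.0
for i, (name, p) in enumerate(raw):
    running = max(running, min(1.0, p * (len(raw) - i)))
    adjusted[name] = running  # adjusted p-values stay monotone
```

Holm's procedure is uniformly more powerful than Bonferroni while controlling the same family-wise error rate, which is why it suits the six pairwise comparisons here.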
Supplementary information
Supplementary Information
Supplementary Figs. 1–7 and Tables 1–4
Supplementary Video 1
Tracking a single mouse in 3D. Position in 3D (left) and infrared close-up view (right) of a mouse being tracked during foraging in a naturalistic environment covered with branches, using the EthoLoop system with a passive retro-reflective marker.
Supplementary Video 2
Tracking multiple mouse lemurs in naturalistic conditions. Overview of the naturalistic maze with tree branches (left) and 3D positions of two mouse lemurs during social interactions (center; red, blue), tracked using the EthoLoop system with infrared LEDs of different wavelengths as markers. Close-up view (right) of one of the mouse lemurs (blue) being tracked.
Supplementary Video 3
Close-up video of a mouse lemur during a jump. Close-up view of a mouse lemur while performing multiple jumps (slowed down 10×). This highlights the ability of the close-up system to follow the position of the tracked animal while keeping it at the center of the image.
Supplementary Video 4
Tracking a flying drone in 3D. Overview of a drone flying in a large-scale 100-m3 arena (left), position in 3D (middle) and infrared close-up view (right) using the EthoLoop system (passive retro-reflective marker). The overview shows the back of the gimbal-mounted autofocus close-up camera and a parallel close-up illumination source. The bright light of the ring-lights mounted on one of the four tracking cameras is visible at the top left corner of the arena.
Supplementary Video 5
Mouse lemur foraging in a lattice maze. Position (left) and close-up view of a mouse lemur foraging in a 3D lattice maze. The currently activated RECO-box is blinking, and the path toward the next box is shown in the color of the currently active RECO-box. The mouse lemur carries an infrared LED and a miniature battery pack around its neck.
Supplementary Video 6
Tracking of a mouse lemur in two different light conditions. First half: close-up images of a mouse lemur foraging at high speed in the 3D lattice maze under dim light. Second half: close-up images of a mouse lemur foraging in complete darkness, where the overall speed is significantly slower and the number of jumps is reduced.
Supplementary Video 7
Behavioral conditioning of a mouse lemur with the EthoLoop system. Overview (left), 3D position (center) and close-up view (right) of a mouse lemur during behavioral conditioning using the EthoLoop system. The dim blue sphere in the center view delineates the area where rearing on the hind limbs is reinforced by a sound followed by reward availability at a distantly located RECO-box. Rearing is detected on the fly in the close-up view by analyzing the geometrical arrangement of the automatically annotated limb positions.
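A geometric posture rule of the kind described here can be sketched from 2D keypoints. The body-part names and thresholds below are hypothetical illustrations, not the authors' classifier:

```python
import numpy as np

def is_rearing(keypoints, min_elongation=1.8, min_paw_raise=0.15):
    """Classify a rearing posture from 2D keypoints (hypothetical rule).

    `keypoints` maps body-part names to (x, y) image coordinates with
    y increasing downward. A frame counts as rearing when the body
    axis is mostly vertical in the image and the forepaws sit high
    relative to the hind paws."""
    kp = {k: np.asarray(v, dtype=float) for k, v in keypoints.items()}
    body = kp["nose"] - kp["tail_base"]
    # vertical elongation of the body axis (|dy| / |dx|)
    elongation = abs(body[1]) / (abs(body[0]) + 1e-9)
    # forepaws above hind paws (smaller y = higher in the image)
    forepaw_y = (kp["left_forepaw"][1] + kp["right_forepaw"][1]) / 2
    hindpaw_y = (kp["left_hindpaw"][1] + kp["right_hindpaw"][1]) / 2
    paws_raised = (hindpaw_y - forepaw_y) > min_paw_raise * abs(body[1])
    return bool(elongation > min_elongation and paws_raised)
```

In a closed-loop setting such a rule would run on every annotated close-up frame, with the True/False output gating the reward trigger.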
Supplementary Video 8
Behavioral conditioning of a mouse (clicker training). 3D position (left) and close-up view (right) of a mouse during behavioral conditioning in a naturalistic 3D environment using the EthoLoop system. Rearing on the hind limbs on an unmarked branch is reinforced by a sound followed by reward availability at a distantly located RECO-box.
Supplementary Video 9
Behavioral conditioning of a mouse using wireless optogenetic stimulation in a naturalistic environment. 3D position (left) and close-up view (right) of a mouse during behavioral conditioning in the naturalistic environment using wireless optogenetic stimulation. Rearing on the hind limbs is followed by brief wireless stimulation of VTA DA neurons through a portable battery-powered stimulator. The rearing posture is detected on the fly in the close-up view by analyzing the geometrical arrangement of the automatically annotated limb positions.
Source data
Source Data Fig. 1
Data to produce b and c
Source Data Fig. 3
Data to produce d and h
Source Data Fig. 4
Data to produce e and i
Source Data Fig. 5
Data to produce e and i
Source Data Fig. 6
Data to produce b and d
Source Data Extended Data Fig. 1
Data to produce g and h
Source Data Extended Data Fig. 2
Data to produce b
Source Data Extended Data Fig. 3
Data to produce b
Source Data Extended Data Fig. 6
Data to produce b
Source Data Extended Data Fig. 7
Data to produce b
Source Data Extended Data Fig. 10
Data to produce the figure
About this article
Cite this article
Nourizonoz, A., Zimmermann, R., Ho, C.L.A. et al. EthoLoop: automated closed-loop neuroethology in naturalistic environments. Nat Methods 17, 1052–1059 (2020). https://doi.org/10.1038/s41592-020-0961-2
This article is cited by
- Spontaneous behaviour is structured by reinforcement without explicit reward. Nature (2023)
- A paradigm shift in translational psychiatry through rodent neuroethology. Molecular Psychiatry (2023)
- Neural cognitive signals during spontaneous movements in the macaque. Nature Neuroscience (2023)
- Neural Correlates of Spatial Navigation in Primate Hippocampus. Neuroscience Bulletin (2023)
- Driving singing behaviour in songbirds using a multi-modal, multi-agent virtual environment. Scientific Reports (2022)