Context-dependent representations of objects and space in the primate hippocampus during virtual navigation

Abstract

The hippocampus is implicated in associative memory and spatial navigation. To investigate how these functions are mixed in the hippocampus, we recorded from single hippocampal neurons in macaque monkeys navigating a virtual maze during a foraging task and a context–object associative memory task. During both tasks, single neurons encoded information about spatial position; a linear classifier also decoded position. However, the population code for space did not generalize across tasks, particularly where stimuli relevant to the associative memory task appeared. Single-neuron and population-level analyses revealed that cross-task changes were due to selectivity for nonspatial features of the associative memory task when they were visually available (perceptual coding) and following their disappearance (mnemonic coding). Our results show that neurons in the primate hippocampus nonlinearly mix information about space and nonspatial elements of the environment in a task-dependent manner; this efficient code flexibly represents unique perceptual experiences and their corresponding memories.
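The population analysis summarized above — train a linear decoder of maze position on one task, then test whether it transfers to the other — can be illustrated with a toy sketch. This is not the authors' analysis code (the study used regularized linear classifiers on recorded ensembles); it is a minimal pure-Python nearest-centroid decoder on synthetic firing rates, in which the neuron count, tuning model, and cross-task remapping are all assumptions made purely for illustration:

```python
import random

random.seed(0)

# Hypothetical sketch: decoding maze position from population
# firing-rate vectors with a nearest-centroid read-out. Synthetic
# data stand in for the recorded hippocampal ensembles.

N_NEURONS = 21
N_TRIALS = 30
POSITIONS = range(3)  # three maze areas

def simulate(subset_map):
    """Return (rate_vector, position) pairs. Each position drives one
    third of the neurons; subset_map decides which third, so a remapped
    map mimics a spatial code that does not generalize across tasks."""
    data = []
    for pos in POSITIONS:
        for _ in range(N_TRIALS):
            rates = [random.gauss(0.0, 0.5) for _ in range(N_NEURONS)]
            for n in range(subset_map[pos], N_NEURONS, 3):
                rates[n] += 2.0  # position-driven firing
            data.append((rates, pos))
    return data

def fit_centroids(train):
    """Mean population vector per position (a simple linear read-out)."""
    cents = {}
    for pos in POSITIONS:
        vecs = [r for r, p in train if p == pos]
        cents[pos] = [sum(col) / len(vecs) for col in zip(*vecs)]
    return cents

def accuracy(cents, test):
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    hits = sum(1 for r, p in test
               if min(cents, key=lambda c: sqdist(r, cents[c])) == p)
    return hits / len(test)

foraging = simulate({0: 0, 1: 1, 2: 2})  # task A tuning
memory = simulate({0: 0, 1: 2, 2: 1})    # task B: remapped tuning

cents = fit_centroids(foraging)
within = accuracy(cents, foraging)  # high: the code is informative
cross = accuracy(cents, memory)     # near chance (1/3): no transfer
print(f"within-task {within:.2f}, cross-task {cross:.2f}")
```

Training on the "foraging" data and testing on the "memory-task" data yields near-chance accuracy despite high within-task accuracy, mirroring the non-generalizing spatial code the abstract describes.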


Fig. 1: Behavioral tasks and individual neuron SIC.
Fig. 2: Single-neuron spatial response fields in each task.
Fig. 3: Hippocampal ensemble prediction of spatial position with a linear classifier.
Fig. 4: Nonspatial feature selectivity in the associative memory task.
Fig. 5: Perceptual versus memory encoding of trial features.
Fig. 6: Trial type decoding in the associative memory task.

Data availability

Data can be downloaded at https://robertogulli.com/data. Further information and requests for resources and protocols should be directed to and will be fulfilled by the lead contact, R.A.G.

Code availability

The code used in the study is available upon request from R.A.G.


Acknowledgements

We thank J. Jackson, M. Leavitt and R. Nogueira for critical editing, input and discussion, and all members of the JMT laboratory for support. We thank B. Bally, K. Barker, J. Blonde, S. Chisling, J. Diedrichsen, S. Frey, S. Nuara and W. Kucharski for technical assistance. R.A.G. was supported by a Natural Sciences and Engineering Research Council of Canada (NSERC) Postgraduate Scholarship-Doctoral Fellowship and a McGill David G. Guthrie Fellowship. This work was further supported by Canadian Institutes of Health Research (CIHR) and NSERC grants to J.M.-T., funding from NeuroNex (no. DBI-1707398) to S.F. and funding from Healthy Brains for Healthy Lives and CIHR to S.W.

Author information

R.A.G. designed the experiments and virtual environments, collected and analyzed the data, and wrote the manuscript. L.R.D. contributed to the data analysis. B.W.C. contributed to data collection and data analysis. G.D. contributed to virtual environment design and data analysis. S.W. contributed to the experimental design. S.F. contributed to data analysis and manuscript writing. J.M.-T. contributed to the experimental design and manuscript writing.

Correspondence to Roberto A. Gulli.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Nature Neuroscience thanks Arne Ekstrom and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 Hippocampal recordings: planning, mapping and verification.

Schematic representation of the major steps in planning, mapping, and verification of electrode trajectories and recording sites. In all cases, scale bars represent 10 mm. 1. Prior to any surgical procedures, a 3T MRI was taken of the naive monkey. 2. Using Brainsight (Rogue Research, Montreal, Canada), the chamber trajectory was planned. The skull was then 3D-printed, and a mock surgical procedure was done to recreate the chamber trajectory. A custom-made footed chamber was then formed to the skull at its intended placement with the intended trajectory. 3. A titanium (monkey R) or silex (monkey W) chamber was implanted along the planned trajectory. Subsequently, a post-implant computed tomography scan was taken with the recording grid and electrodes in place in order to visualize the electrode trajectories. 4. The computed tomography scan was co-registered to the naive MRI. 5. The updated trajectory of each grid hole was mapped. At each grid hole used for recording, the expected depths of the cortical surface, grey matter/white matter transitions, and ultimately the hippocampal region of interest were mapped prior to recording. 6. During each recording session, the previously mapped values were monitored during electrode guidance towards the hippocampal region of interest. 7. In monkey R, electrode tracts were visible in a post-experimental 7T MRI acquisition. This procedure was not possible for monkey W.

Extended Data Fig. 2 Individual neuron characteristics and example neurons.

A) Burst fraction, spike width, and firing rate of all recorded neurons (n = 183). Light grey circle, example neuron W0325.A1M0.2; dark grey circle, example neuron R0910.Hc7.3. B) Example neuron W0325.A1M0.2 inter-spike-interval distribution and average waveform. Shaded area, SEM. Below, spike raster as a function of time in the experimental recording session. C) Example neuron R0910.Hc7.3 inter-spike-interval distribution and average waveform. Shaded area, SEM. Below, spike raster as a function of time in the experimental recording session.

Extended Data Fig. 3 Example associative memory task reward hierarchy.

A) Example of the reversed two-context, three-object reward value hierarchy for recording session W0325. B) Two example trials of the associative memory task from the recording session. Subject trajectories through the maze are colored according to the time from trial start (color bar). White arrow indicates the object of higher reward value. C) Representative first-person view of the monkey during each trial at position b. White arrow indicates the object of higher reward value. D) Estimated learning state averaged for the high-low value context-dependent association across all sessions, and 95% confidence interval of this estimate (n=37 sessions).

Extended Data Fig. 4 Example neurons, smoothed firing rate maps.

Smoothed firing rate maps of the six example neurons seen in Fig. 1d. Pixel-wise firing rates were smoothed with a 3-bin Gaussian kernel. Color maps are consistent within neuron and across tasks, with the maximum and minimum firing rates denoted separately for each neuron.

Extended Data Fig. 5 Spatial response fields when computed with larger pixel sizes.

Conventions are the same as in Fig. 2. However, all visualizations and statistics have been done with pixels that are 4 times larger. A) Spatial histogram showing the number of neurons with statistically elevated firing rate in each pixel in both tasks (top). The summarized histogram (bottom) shows the number of neurons with at least one significant pixel in each maze area. *significantly different proportion across tasks; McNemar’s test of equal proportions, p<0.05, Bonferroni-corrected. B) Locations of coincident place fields for all neurons with more than one place field in each task. C) Locations of coincident place fields for all neurons with at least one place field in each task.

Extended Data Fig. 6 Cross-task decoding accuracy in each area of the X-Maze.

A) Cross-task decoding accuracy (orange) and decoding accuracy when the maze area labels were shuffled (also seen in Fig. 3b). Grey bars, mean. B) Confusion matrix derived from the cross-task decoding analysis (also seen in Fig. 3c). White numbers indicate the mean decoding accuracy within each maze area. C) Cross-task decoding accuracy in each maze area (colored lines) alongside the chance decoding accuracy distribution (shuffled control, grey).

Extended Data Fig. 7 Neuronal activity across trial epochs of the associative memory task.

A) Overhead view of the X-Maze and the subject’s trajectory through the maze on two consecutive trials. Each trial contains five distinct trial epochs. During the Post-reward and Pre-context epochs, all maze walls are grey and no rewarded objects are visible. Once the subject enters the central corridor, the context is cued using a wood or steel material applied to some of the maze walls. Once the subject leaves the corridor for the branched area of the maze, an object is made visible simultaneously in each arm of the maze. Subjects learn a reversed context-object reward value hierarchy by trial and error. B) Spike locations and firing rate by trial epoch for six example neurons during the associative memory task. Left: trajectories through the X-Maze (translucent grey) and spike locations (translucent red). Right: Box plot showing firing rate by trial epoch in the associative memory task. Dots indicate median value; lines indicate the 25th to 75th percentile; outliers are plotted individually. *, p<0.05 compared to the trial epoch with the lowest firing rate; Kruskal-Wallis, Bonferroni-corrected.

Extended Data Fig. 8 Rewarded-aligned spike rasters.

A) Rewarded locations in session R0910 during the Associative memory task (left) and Foraging task (right). B) Reward-aligned rasters for example neuron R0910.Hc7.3 in each task. Black ticks mark the times of action potentials on each trial. The red lines mark the reward delivery for each trial. C) Rewarded locations in session W0325 during the Associative memory task (left) and Foraging task (right). D) Reward-aligned rasters for example neuron W0325.A1M0.2 in each task.

Extended Data Fig. 9 Decoding trial type from an equal number of perceptual and mnemonic trial epochs.

Distribution of classification accuracies from decoding analysis of trial type (trial context and object pair) from perceptual (object appearance, object approach) or memory (post-reward, pre-context) trial epochs in the associative memory task. *p<0.05, two-sided Wilcoxon rank-sum, n=50 per distribution. Grey bars, mean.

Supplementary information

Reporting Summary

Supplementary Video 1

Four example trials of the Foraging task from session W0325. Top, LFP trace (white streaming line) and single unit action potentials (blue ticks, sound) recorded from unit W0325.A1M0.2. Circle and dot, monkey’s eye position in the virtual environment. Bottom right, time from trial start. Bottom left, name of every object that falls within 3 degrees of the foveated position.

Supplementary Video 2

Three example trials of the Foraging task from session W0325. Top, LFP trace (white streaming line) and single unit action potentials (blue ticks, sound) recorded from unit W0325.A1M0.2. Circle and dot, monkey’s eye position in the virtual environment. Bottom right, time from trial start. Bottom left, name of every object that falls within 3 degrees of the foveated position.


About this article


Cite this article

Gulli, R.A., Duong, L.R., Corrigan, B.W. et al. Context-dependent representations of objects and space in the primate hippocampus during virtual navigation. Nat Neurosci 23, 103–112 (2020). https://doi.org/10.1038/s41593-019-0548-3
