Abstract
As we move around, relevant information that disappears from sight can still be held in working memory to serve upcoming behaviour. How we maintain and select visual information as we move through the environment remains poorly understood because most laboratory tasks of working memory rely on removing visual material while participants remain still. We used virtual reality to study visual working memory following self-movement in immersive environments. Directional biases in gaze revealed the recruitment of more than one spatial frame for maintaining and selecting memoranda following self-movement. The findings bring the important realization that multiple spatial frames support working memory in natural behaviour. The results also illustrate how virtual reality can be a critical experimental tool to characterize this core memory system.
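To make the gaze-bias measure concrete, the following minimal Python sketch illustrates one way a directional gaze bias ("towardness") toward a memorized item location could be quantified from eye-tracking samples. It is an illustration only, not the authors' analysis pipeline; the function name, data layout, units and simulated values are assumptions.

import numpy as np

def gaze_towardness(gaze_x, item_side):
    """Signed gaze position per sample; positive values indicate gaze biased toward the cued item.

    gaze_x    -- horizontal gaze positions relative to fixation (e.g. in degrees), shape (n_samples,)
    item_side -- -1 if the memorized item was left of fixation, +1 if it was right
    """
    return np.asarray(gaze_x) * item_side  # flip sign so "toward the item" is always positive

# Simulated example: 100 trials x 500 gaze samples, with gaze drifting slightly toward the cued side.
rng = np.random.default_rng(0)
sides = rng.choice([-1, 1], size=100)                                   # cued item side per trial
gaze = rng.normal(loc=0.2, scale=1.0, size=(100, 500)) * sides[:, None]
towardness = np.mean([gaze_towardness(g, s) for g, s in zip(gaze, sides)], axis=0)
# towardness is a time course; values above zero indicate a gaze bias toward the memorized location.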
Data availability
All data are publicly available at https://osf.io/cj97y/.
Code availability
Code is available at https://osf.io/cj97y/.
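For readers who wish to retrieve the files programmatically, the sketch below shows one possible approach using the third-party osfclient Python package (pip install osfclient); this package and the local folder name are assumptions for illustration, not part of the published pipeline.

import os
from osfclient import OSF

# Public OSF project, so no authentication is needed.
# Shell equivalent with the same package: `osf -p cj97y clone osf_data/`
project = OSF().project("cj97y")
for storage in project.storages:
    for file_ in storage.files:
        local_path = os.path.join("osf_data", file_.path.lstrip("/"))
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        with open(local_path, "wb") as fp:
            file_.write_to(fp)  # stream the remote file to disk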
Acknowledgements
This research was funded by a Wellcome Trust Senior Investigator Award (104571/Z/14/Z) and a James S. McDonnell Foundation Understanding Human Cognition Collaborative Award (220020448) to A.C.N., a Marie Skłodowska-Curie Fellowship from the European Commission (ACCESS2WM) and an ERC Starting Grant from the European Research Council (MEMTICIPATION, 850636) to F.v.E., and by the National Institutes of Health Research Oxford Health Biomedical Research Centre. The Wellcome Centre for Integrative Neuroimaging is supported by core funding from the Wellcome Trust (203139/Z/16/Z). The funders had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript. For the purpose of open access, the author has applied a CC BY public copyright license to any Author Accepted Manuscript version arising from this submission.
Author information
Contributions
D.D., A.C.N. and F.v.E. conceived and designed the experiments. D.D. programmed the experiments and acquired the data. D.D. and F.v.E. analysed the data. D.D., A.C.N. and F.v.E. interpreted the data. D.D., A.C.N. and F.v.E. drafted and revised the manuscript.
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Peer review information
Nature Human Behaviour thanks Joy Geng, Andrey Nikolaev and the other, anonymous, reviewer(s) for their contribution to the peer review of this work. Peer reviewer reports are available.
Additional information
Publisher's note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary Information
Supplementary Table 1 and Figs. 1–4.
About this article
Cite this article
Draschkow, D., Nobre, A.C. & van Ede, F. Multiple spatial frames for immersive working memory. Nat Hum Behav 6, 536–544 (2022). https://doi.org/10.1038/s41562-021-01245-y
This article is cited by
- Active visual search in naturalistic environments reflects individual differences in classic visual search performance. Scientific Reports (2023)
- No obligatory trade-off between the use of space and time for working memory. Communications Psychology (2023)
- Congruence-based contextual plausibility modulates cortical activity during vibrotactile perception in virtual multisensory environments. Communications Biology (2022)
- Spatial coding for action across spatial scales. Nature Reviews Psychology (2022)