Article

Concurrent visual and motor selection during visual working memory guided action

Nature Neuroscience (2019)

Abstract

Visual working memory enables us to hold onto past sensations in anticipation that these may become relevant for guiding future actions. Yet laboratory tasks have treated visual working memories in isolation from their prospective actions and have focused on the mechanisms of memory retention rather than utilization. To understand how visual memories become used for action, we linked individual memory items to particular actions and independently tracked the neural dynamics of visual and motor selection when memories became used for action. This revealed concurrent visual-motor selection, engaging appropriate visual and motor brain areas at the same time. Thus we show that items in visual working memory can invoke multiple, item-specific, action plans that can be accessed together with the visual representations that guide them, affording fast and precise memory-guided behavior.


Code availability

Code will be made available by the authors upon reasonable request.

Data availability

All data are publicly available through the Dryad Digital Repository at: https://doi.org/10.5061/dryad.sk8rb66.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  1. Baddeley, A. Working memory. Science 255, 556–559 (1992).

  2. D’Esposito, M. & Postle, B. R. The cognitive neuroscience of working memory. Annu. Rev. Psychol. 66, 115–142 (2015).

  3. Pasternak, T. & Greenlee, M. W. Working memory in primate sensory systems. Nat. Rev. Neurosci. 6, 97–107 (2005).

  4. Harrison, S. A. & Tong, F. Decoding reveals the contents of visual working memory in early visual areas. Nature 458, 632–635 (2009).

  5. Luck, S. J. & Vogel, E. K. The capacity of visual working memory for features and conjunctions. Nature 390, 279–281 (1997).

  6. Schneegans, S. & Bays, P. M. Neural architecture for feature binding in visual working memory. J. Neurosci. 37, 3913–3925 (2017).

  7. Snyder, L. H., Batista, A. P. & Andersen, R. A. Coding of intention in the posterior parietal cortex. Nature 386, 167–170 (1997).

  8. Cisek, P. & Kalaska, J. F. Neural correlates of reaching decisions in dorsal premotor cortex: specification of multiple direction choices and final selection of action. Neuron 45, 801–814 (2005).

  9. Li, N., Daie, K., Svoboda, K. & Druckmann, S. Robust neuronal dynamics in premotor cortex during motor planning. Nature 532, 459–464 (2016).

  10. Rowe, J. B., Toni, I., Josephs, O., Frackowiak, R. S. J. & Passingham, R. E. The prefrontal cortex: response selection or maintenance within working memory? Science 288, 1656–1660 (2000).

  11. Chatham, C. H., Frank, M. J. & Badre, D. Corticostriatal output gating during selection from working memory. Neuron 81, 930–942 (2014).

  12. Worden, M. S., Foxe, J. J., Wang, N. & Simpson, G. V. Anticipatory biasing of visuospatial attention indexed by retinotopically specific alpha-band electroencephalography increases over occipital cortex. J. Neurosci. 20, RC63 (2000).

  13. van Ede, F. Mnemonic and attentional roles for states of attenuated alpha oscillations in perceptual working memory: a review. Eur. J. Neurosci. 48, 2509–2515 (2018).

  14. Waldhauser, G. T., Braun, V. & Hanslmayr, S. Episodic memory retrieval functionally relies on very rapid reactivation of sensory information. J. Neurosci. 36, 251–260 (2016).

  15. Kuo, B.-C., Rao, A., Lepsien, J. & Nobre, A. C. Searching for targets within the spatial layout of visual short-term memory. J. Neurosci. 29, 8032–8038 (2009).

  16. van Ede, F., Niklaus, M. & Nobre, A. C. Temporal expectations guide dynamic prioritization in visual working memory through attenuated α oscillations. J. Neurosci. 37, 437–445 (2017).

  17. Foster, J. J., Bsales, E. M., Jaffe, R. J. & Awh, E. Alpha-band activity reveals spontaneous representations of spatial position in visual working memory. Curr. Biol. 27, 3216–3223.e6 (2017).

  18. Salmelin, R. & Hari, R. Spatiotemporal characteristics of sensorimotor neuromagnetic rhythms related to thumb movement. Neuroscience 60, 537–550 (1994).

  19. Neuper, C., Wörtz, M. & Pfurtscheller, G. ERD/ERS patterns reflecting sensorimotor activation and deactivation. Prog. Brain Res. 159, 211–222 (2006).

  20. Jenkinson, N. & Brown, P. New insights into the relationship between dopamine, beta oscillations and motor function. Trends Neurosci. 34, 611–618 (2011).

  21. Wolff, M. J., Jochim, J., Akyürek, E. G. & Stokes, M. G. Dynamic hidden states underlying working-memory-guided behavior. Nat. Neurosci. 20, 864–871 (2017).

  22. van Ede, F., Chekroud, S. R., Stokes, M. G. & Nobre, A. C. Decoding the influence of anticipatory states on visual perception in the presence of temporal distractors. Nat. Commun. 9, 1449 (2018).

  23. Stokes, M. G., Wolff, M. J. & Spaak, E. Decoding rich spatial information with high temporal resolution. Trends Cogn. Sci. 19, 636–638 (2015).

  24. Kriegeskorte, N., Goebel, R. & Bandettini, P. Information-based functional brain mapping. Proc. Natl Acad. Sci. USA 103, 3863–3868 (2006).

  25. Miller, J., Patterson, T. & Ulrich, R. Jackknife-based method for measuring LRP onset latency differences. Psychophysiology 35, 99–115 (1998).

  26. Chun, M. M. & Jiang, Y. Contextual cueing: implicit learning and memory of visual context guides spatial attention. Cognit. Psychol. 36, 28–71 (1998).

  27. Summerfield, J. J., Lepsien, J., Gitelman, D. R., Mesulam, M. M. & Nobre, A. C. Orienting attention based on long-term memory experience. Neuron 49, 905–916 (2006).

  28. de Vries, I. E. J., van Driel, J. & Olivers, C. N. L. Posterior α EEG dynamics dissociate current from future goals in working memory-guided visual search. J. Neurosci. 37, 1591–1603 (2017).

  29. Rainer, G., Rao, S. C. & Miller, E. K. Prospective coding for objects in primate prefrontal cortex. J. Neurosci. 19, 5493–5505 (1999).

  30. Myers, N. E., Stokes, M. G. & Nobre, A. C. Prioritizing information during working memory: beyond sustained internal attention. Trends Cogn. Sci. 21, 449–461 (2017).

  31. Svoboda, K. & Li, N. Neural mechanisms of movement planning: motor cortex and beyond. Curr. Opin. Neurobiol. 49, 33–41 (2018).

  32. Gallivan, J. P. et al. One to four, and nothing more: nonconscious parallel individuation of objects during action planning. Psychol. Sci. 22, 803–811 (2011).

  33. Gallivan, J. P., Logan, L., Wolpert, D. M. & Flanagan, J. R. Parallel specification of competing sensorimotor control policies for alternative action options. Nat. Neurosci. 19, 320–326 (2016).

  34. Gallivan, J. P., Barton, K. S., Chapman, C. S., Wolpert, D. M. & Flanagan, J. R. Action plan co-optimization reveals the parallel encoding of competing reach movements. Nat. Commun. 6, 7428 (2015).

  35. Gallivan, J. P., Bowman, N. A. R., Chapman, C. S., Wolpert, D. M. & Flanagan, J. R. The sequential encoding of competing action goals involves dynamic restructuring of motor plans in working memory. J. Neurophysiol. 115, 3113–3122 (2016).

  36. Cisek, P. Cortical mechanisms of action selection: the affordance competition hypothesis. Philos. Trans. R. Soc. B Biol. Sci. 362, 1585–1599 (2007).

  37. Postle, B. R., Idzikowski, C., Sala, S. D., Logie, R. H. & Baddeley, A. D. The selective disruption of spatial working memory by eye movements. Q. J. Exp. Psychol. 59, 100–120 (2006).

  38. Theeuwes, J., Belopolsky, A. & Olivers, C. N. L. Interactions between working memory, attention and eye movements. Acta Psychol. 132, 106–114 (2009).

  39. Jerde, T. A., Merriam, E. P., Riggall, A. C., Hedges, J. H. & Curtis, C. E. Prioritized maps of space in human frontoparietal cortex. J. Neurosci. 32, 17382–17390 (2012).

  40. Merrikhi, Y. et al. Spatial working memory alters the efficacy of input to visual cortex. Nat. Commun. 8, 15041 (2017).

  41. Deubel, H. & Schneider, W. X. Saccade target selection and object recognition: evidence for a common attentional mechanism. Vision Res. 36, 1827–1837 (1996).

  42. Nobre, A. C. et al. Functional localization of the system for visuospatial attention using positron emission tomography. Brain 120, 515–533 (1997).

  43. Moore, T., Armstrong, K. M. & Fallah, M. Visuomotor origins of covert spatial attention. Neuron 40, 671–683 (2003).

  44. Oostenveld, R., Fries, P., Maris, E. & Schoffelen, J.-M. FieldTrip: open source software for advanced analysis of MEG, EEG and invasive electrophysiological data. Comput. Intell. Neurosci. https://doi.org/10.1155/2011/156869 (2011).

  45. Perrin, F., Pernier, J., Bertrand, O. & Echallier, J. F. Spherical splines for scalp potential and current density mapping. Electroencephalogr. Clin. Neurophysiol. 72, 184–187 (1989).

  46. Oostendorp, T. F. & van Oosterom, A. Source parameter estimation in inhomogeneous volume conductors of arbitrary shape. IEEE Trans. Biomed. Eng. 36, 382–391 (1989).

  47. Gross, J. et al. Dynamic imaging of coherent sources: studying neural interactions in the human brain. Proc. Natl Acad. Sci. USA 98, 694–699 (2001).

  48. Van Veen, B. D. & Buckley, K. M. Beamforming: a versatile approach to spatial filtering. IEEE ASSP Mag. 5, 4–24 (1988).

  49. van Ede, F. & Maris, E. Physiological plausibility can increase reproducibility in cognitive neuroscience. Trends Cogn. Sci. 20, 567–569 (2016).

  50. Maris, E. & Oostenveld, R. Nonparametric statistical testing of EEG- and MEG-data. J. Neurosci. Methods 164, 177–190 (2007).


Acknowledgements

This research was funded by a Marie Skłodowska-Curie Fellowship from the European Commission (ACCESS2WM) to F.v.E., a Wellcome Trust Senior Investigator Award (104571/Z/14/Z) and a James S. McDonnell Foundation Understanding Human Cognition Collaborative Award (220020448) to A.C.N., a Medical Research Council Career Development Award (MR/J009024/1) and a James S. McDonnell Foundation Scholar Award (220020405) to M.G.S., and by the NIHR Oxford Health Biomedical Research Centre. The Wellcome Centre for Integrative Neuroimaging is supported by core funding from the Wellcome Trust (203139/Z/16/Z).

Author information

Affiliations

  1. Oxford Centre for Human Brain Activity, Wellcome Centre for Integrative Neuroimaging, Department of Psychiatry, University of Oxford, Oxford, UK

    • Freek van Ede
    • , Sammi R. Chekroud
    •  & Anna C. Nobre
  2. Department of Experimental Psychology, University of Oxford, Oxford, UK

    • Sammi R. Chekroud
    • , Mark G. Stokes
    •  & Anna C. Nobre

Authors

  • Freek van Ede
  • Sammi R. Chekroud
  • Mark G. Stokes
  • Anna C. Nobre

Contributions

F.v.E. designed and programmed the experiment, acquired, analyzed and interpreted the data and drafted and revised the manuscript. S.R.C. acquired and interpreted the data. M.G.S. interpreted the data and drafted and revised the manuscript. A.C.N. designed the experiment, interpreted the data, and drafted and revised the manuscript.

Competing interests

The authors declare no competing interests.

Corresponding author

Correspondence to Freek van Ede.

Integrated supplementary information

  1. Supplementary Figure 1 Specificity of spectral signatures of visual and motor selection to visual and motor electrodes.

    a–d) Time-frequency maps of spectral lateralization relative to item location (top row) or response hand (bottom row) in selected visual (left) and motor (right) electrode clusters. Panels a and d are identical to Fig. 2a and Fig. 2d. Results in all panels depict the average across all participants (n = 25).

  2. Supplementary Figure 2 Spectral signatures of visual and motor selection commence before response initiation at highly similar times.

    a–d) Time-frequency maps of spectral lateralization relative to item location (top row) or response hand (bottom row), aligned to the time of the memory probe (left) or the time of response initiation (right). Visual selection is calculated from selected visual electrodes and motor selection is calculated using selected motor electrodes. Panels a and c are identical to Fig. 2a and Fig. 2d. Results in all panels depict the average across all participants (n = 25).

  3. Supplementary Figure 3 Spectral signatures of visual and motor selection also show parallel alignment to response initiation.

    a–d) Left column shows spectral data aligned to the time of the memory probe (identical to left column of Fig. 4), while the right column shows the same data aligned to the time of response initiation. Conventions as in Fig. 4. Statistics were based on n = 25 participants.

  4. Supplementary Figure 4 Decoding results for three complementary decoding metrics.

    The data on the left are identical to Fig. 3a. The middle panel shows single-trial decoding accuracy. To obtain decoding accuracy, we ‘binarised’ the single-trial Mahalanobis distance data—marking a 1 where the test trial was more similar to the remaining training trials that belonged to the same class, and 0 otherwise (yielding a chance level of 0.5). The right panel shows the across-trial independent-samples t-statistic, calculated using the Mahalanobis distances relative to the same vs. the opposite training class (opposite minus same, to yield positive decoding values). T-values were calculated separately for each participant and the graph shows the group average of these participant-specific t-values. Horizontal bars indicate significant clusters (two-sided cluster-based permutation tests; all cluster-P < 0.0001; 10,000 permutations; n = 25). Shading represents ± 1 s.e.m., calculated across participants (n = 25).
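    The binarisation step described in this legend can be sketched as follows. This is a minimal illustration, not the authors' analysis code; the function name and the example distances are hypothetical, and the only assumption is that each test trial comes with a Mahalanobis distance to its own class and to the opposite class.

    ```python
    import numpy as np

    def binarise_decoding(d_same, d_opp):
        """Convert per-trial Mahalanobis distances into decoding accuracy.

        A trial is scored 1 when it lies closer to the training trials of
        its own class than to those of the opposite class, 0 otherwise,
        so guessing yields a chance level of 0.5.
        """
        d_same = np.asarray(d_same, dtype=float)
        d_opp = np.asarray(d_opp, dtype=float)
        correct = (d_same < d_opp).astype(float)  # 1 = closer to own class
        return correct.mean()

    # Hypothetical distances for four test trials: three are closer to
    # their own class, one (trial 2) is closer to the opposite class.
    acc = binarise_decoding([1.0, 2.5, 0.8, 3.0], [2.0, 1.5, 1.9, 3.4])
    ```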

  5. Supplementary Figure 5 Decoding cross-generalizes to trials with opposite sides in the complementary visuomotor dimension.

    Left panel shows non-cross-generalization decoding, where we trained to distinguish one dimension (for example left vs. right visual location) while keeping the value in the other dimension (response hand) constant between training and testing sets. Results show the average of both permutations (for example visual location decoding once for the left response hand trials, and once for the right response hand trials). The right panel shows the results from the more interesting cross-generalization case, where we trained on one dimension (left vs. right visual location) while varying the other dimension (response hand) between test and training sets. This revealed that the ‘left vs. right visual code’ is largely independent of the actual response hand (magenta), and that the ‘left vs. right motor code’ is largely independent of the actual item location (cyan). Horizontal bars indicate significant clusters (two-sided cluster-based permutation tests; all cluster-P < 0.0001; 10,000 permutations; n = 25). Shading represents ± 1 s.e.m., calculated across participants (n = 25).
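    The cross-generalization train/test split described here can be illustrated with a short sketch. This is an assumption-laden toy, not the published pipeline: trial labels and the function name are hypothetical, and the point is only that the training and testing sets come from opposite values of the complementary dimension (here, response hand).

    ```python
    import numpy as np

    def cross_generalization_splits(response_hand):
        """Yield (train_idx, test_idx) index pairs for cross-generalization.

        The decoder for one dimension (e.g. left vs. right visual location)
        is trained on trials from one response hand and tested on trials
        from the other hand; both permutations are yielded and would be
        averaged, as in the legend above.
        """
        response_hand = np.asarray(response_hand)
        for train_hand, test_hand in (('L', 'R'), ('R', 'L')):
            train_idx = np.where(response_hand == train_hand)[0]
            test_idx = np.where(response_hand == test_hand)[0]
            yield train_idx, test_idx

    # Hypothetical response-hand labels for four trials:
    splits = list(cross_generalization_splits(['L', 'L', 'R', 'R']))
    ```

    Because the training and testing trials never share a response hand, any above-chance decoding of visual location in this scheme must rely on a code that is independent of the hand.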

  6. Supplementary Figure 6 Replication of visual selection signatures in a prior dataset in which responses were always made with the same (dominant) hand.

    a) Spectral lateralization (contralateral minus ipsilateral) in selected visual electrodes relative to the memorized location of the probed item. The black outline indicates the significant time-frequency cluster (two-sided cluster-based permutation test; cluster-P = 6.9e−4). b) Corresponding alpha topography (left vs. right). c) Time course of the decoding of the memorized location of the probed item. Horizontal bar indicates significant cluster (two-sided cluster-based permutation test; cluster-P < 0.0001; 10,000 permutations; n = 24). d) Corresponding decoding topography. All analyses are analogous to those reported in our Methods section. Conventions as in Figs. 2, 3. The basic elements of the task from which these data came were similar to the current task, with the exceptions that the starting orientation of the central response dial could not be predicted and that orientation dial-up was always performed with participants’ dominant hand, using the mouse. For further details please refer to ref. 16. Results in all panels depict the average across all participants (n = 24). Shading in panel c represents ± 1 s.e.m., calculated across participants (n = 24).

  7. Supplementary Figure 7 Visual and motor selection as a function of response onset time.

    a) Visual and motor selection signatures for data sorted into five bins (0–20, 20–40, 40–60, 60–80, 80–100%) of increasing response onset times. Average response onset time in each bin is indicated above each panel and by the colored vertical lines. Top row shows spectral signatures (cf. Fig. 2), while bottom row shows decoding signatures (cf. Fig. 3). Solid lines show visual selection signatures (alpha lateralization in selected visual channels / decoding of item location) while dashed lines show motor selection signatures (beta lateralization in selected motor channels / decoding of response hand). Decoding was always evaluated against the same training set that included all trials—to increase comparability and to ensure sufficient trial numbers. b) To quantify whether both visual and motor signatures scaled similarly with response onset time, we calculated temporal correlations (Pearson’s r) between visual and motor selection time courses in each bin and compared the average correlation (averaged across bins) to the average correlation observed when reversing the order of the bins between the visual and motor selection signatures (that is, correlating the visual selection signature of bin 1 with the motor selection signature of bin 5, and so on). Temporal correlations were significantly higher when preserving bin order (denoted ‘within rt-bins’) vs. when reversing the order (denoted ‘between rt-bins’): spectral data: t(24) = 4.56, P = 1.237e-4, Cohen’s d = 0.912; decoding data: t(24) = 2.795, P = 0.01, Cohen’s d = 0.559 (paired-samples t-tests; two-sided; n = 25). Grey lines show individual participant data. * P < 0.05; *** P < 0.001. Error bars represent ± 1 s.e.m., calculated across participants (n = 25), and are centered on the group means.
    c) In addition, panel a revealed spectral lateralization differences between bins that already started in the pre-probe baseline—in line with the notion that ‘spontaneous’ priority states before the probe may determine the subsequent response to this probe. To zoom in on this effect, panel c overlays the lateralization in visual and motor channels (relative to, respectively, the visual location and the associated response hand of the item that would subsequently be probed) for the different response-time bins. This revealed that the fastest trials (blue) were not only associated with lower alpha power in visual sites contra- vs. ipsilateral to the location of the to-be-probed item (left panel) but also with lower beta power in motor sites contra- vs. ipsilateral to the response hand associated with the to-be-probed item (right panel). Horizontal bars indicate significant clusters, with the black horizontal line denoting the parametric effect across the five bins (two-sided cluster-based permutation tests; all cluster-P < 0.05; 10,000 permutations; n = 25). Shading represents ± 1 s.e.m., calculated across participants (n = 25). Results in all panels depict the average of all participants (n = 25).
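    The within- vs. between-bin correlation comparison in panel b can be sketched in a few lines. This is a simplified illustration under stated assumptions, not the authors' code: each RT bin contributes one visual and one motor selection time course, the function name is hypothetical, and the toy time courses below merely have distinct shapes so that matched bins correlate more strongly than mismatched ones.

    ```python
    import numpy as np

    def mean_bin_correlation(visual_bins, motor_bins, reverse=False):
        """Average Pearson r between visual and motor time courses per bin.

        visual_bins, motor_bins: sequences of 1-D arrays, one per RT bin.
        With reverse=True the motor bin order is flipped (bin 1 paired
        with bin 5, etc.), giving the 'between rt-bins' control from the
        legend above.
        """
        motor = list(motor_bins)[::-1] if reverse else list(motor_bins)
        rs = [np.corrcoef(v, m)[0, 1] for v, m in zip(visual_bins, motor)]
        return float(np.mean(rs))

    # Toy time courses with a peak that shifts across three bins:
    bins = [np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([0., 0., 1.])]
    within = mean_bin_correlation(bins, bins)                 # matched bin order
    between = mean_bin_correlation(bins, bins, reverse=True)  # reversed order
    ```

    With matched bins the toy visual and motor time courses are identical, so `within` is 1; reversing the order pairs mismatched peaks and lowers the average correlation, mirroring the within- vs. between-rt-bins contrast tested in the legend.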


About this article

Publication history


DOI

https://doi.org/10.1038/s41593-018-0335-6