
  • Article
  • Published:

A dynamic sequence of visual processing initiated by gaze shifts

Abstract

Animals move their head and eyes as they explore the visual scene. Neural correlates of these movements have been found in rodent primary visual cortex (V1), but their sources and computational roles are unclear. We addressed this by combining head and eye movement measurements with neural recordings in freely moving mice. V1 neurons responded primarily to gaze shifts, in which head movements are accompanied by saccadic eye movements, rather than to head movements in which compensatory eye movements stabilize gaze. A variety of activity patterns followed gaze shifts, and together these formed a temporal sequence that was absent in darkness. Gaze-shift responses resembled those evoked by sequentially flashed stimuli, suggesting that a large component corresponds to the onset of new visual input. Notably, neurons responded in a sequence that matches their spatial frequency bias, consistent with coarse-to-fine processing. Recordings in freely gazing marmosets revealed a similar sequence following saccades, also aligned to spatial frequency preference. Our results demonstrate that active vision in both mice and marmosets consists of a dynamic temporal sequence of neural activity associated with visual sampling.


Fig. 1: V1 neurons preferentially respond to gaze-shifting eye/head movements.
Fig. 2: Diversity and temporal sequence of gaze-shift responses.
Fig. 3: Temporal dynamics of gaze-shift responses depend on visual input.
Fig. 4: Head-fixed flashed stimulus responses resemble freely moving gaze-shift responses.
Fig. 5: SF tuning demonstrates coarse-to-fine processing around gaze shifts.
Fig. 6: A similar temporal sequence of V1 saccade responses in freely gazing marmosets.
Fig. 7: A coarse-to-fine temporal sequence in freely gazing marmosets.

Data availability

Data are available on Dryad: https://doi.org/10.5061/dryad.kd51c5bck. These are the minimum data required to reproduce the figures in this publication using the ‘analysis and figure generation’ code linked in the ‘Code availability’ section below.

Code availability

All original code is publicly available. Mouse data preprocessing: https://github.com/nielllab/FreelyMovingEphys; analysis and figure generation: https://github.com/nielllab/freely-moving-saccades; marmoset stimulus analysis: https://github.com/jcbyts/MarmoV5.

References

  1. Boi, M., Poletti, M., Victor, J. D. & Rucci, M. Consequences of the oculomotor cycle for the dynamics of perception. Curr. Biol. 27, 1268–1277 (2017).

  2. Ahissar, E. & Arieli, A. Figuring space by time. Neuron 32, 185–201 (2001).

  3. Schroeder, C. E., Wilson, D. A., Radman, T., Scharfman, H. & Lakatos, P. Dynamics of active sensing and perceptual selection. Curr. Opin. Neurobiol. 20, 172–176 (2010).

  4. Gibson, J. J. The Ecological Approach to Visual Perception: Classic Edition (Psychology Press, 1979).

  5. Parker, P. R. L., Brown, M. A., Smear, M. C. & Niell, C. M. Movement-related signals in sensory areas: roles in natural behavior. Trends Neurosci. 43, 581–595 (2020).

  6. Burchfiel, J. L. & Duffy, F. H. Corticofugal influence upon cat thalamic ventrobasal complex. Brain Res. 70, 395–411 (1974).

  7. Leopold, D. A. & Logothetis, N. K. Microsaccades differentially modulate neural activity in the striate and extrastriate visual cortex. Exp. Brain Res. 123, 341–345 (1998).

  8. Nishimoto, S., Huth, A. G., Bilenko, N. Y. & Gallant, J. L. Eye movement-invariant representations in the human visual system. J. Vis. 17, 11 (2017).

  9. Noda, H. & Adey, W. R. Retinal ganglion cells of the cat transfer information on saccadic eye movement and quick target motion. Brain Res. 70, 340–346 (1974).

  10. Sommer, M. A. & Wurtz, R. H. Brain circuits for the internal monitoring of movements. Annu. Rev. Neurosci. 31, 317–338 (2008).

  11. Miura, S. K. & Scanziani, M. Distinguishing externally from saccade-induced motion in visual cortex. Nature 610, 135–142 (2022).

  12. Niell, C. M. & Stryker, M. P. Modulation of visual responses by behavioral state in mouse visual cortex. Neuron 65, 472–479 (2010).

  13. Stringer, C. et al. Spontaneous behaviors drive multidimensional, brainwide activity. Science 364, 255 (2019).

  14. Musall, S., Kaufman, M. T., Juavinett, A. L., Gluf, S. & Churchland, A. K. Single-trial neural dynamics are dominated by richly varied movements. Nat. Neurosci. 22, 1677–1686 (2019).

  15. Vélez-Fort, M. et al. A circuit for integration of head- and visual-motion signals in layer 6 of mouse primary visual cortex. Neuron 98, 179–191.e6 (2018).

  16. Meyer, A. F., Poort, J., O’Keefe, J., Sahani, M. & Linden, J. F. A head-mounted camera system integrates detailed behavioral monitoring with multichannel electrophysiology in freely moving mice. Neuron 100, 46–60.e7 (2018).

  17. Bouvier, G., Senzai, Y. & Scanziani, M. Head movements control the activity of primary visual cortex in a luminance-dependent manner. Neuron 108, 500–511.e5 (2020).

  18. Guitchounts, G., Masís, J., Wolff, S. B. E. & Cox, D. Encoding of 3D head orienting movements in the primary visual cortex. Neuron 108, 512–525.e4 (2020).

  19. Liska, J. P. et al. Running modulates primate and rodent visual cortex differently. eLife 12, RP87736 (2023).

  20. Hegdé, J. Time course of visual perception: coarse-to-fine processing and beyond. Prog. Neurobiol. 84, 405–439 (2008).

  21. Michaiel, A. M., Abe, E. T. & Niell, C. M. Dynamics of gaze control during prey capture in freely moving mice. eLife 9, e57458 (2020).

  22. Parker, P. R., Abe, E. T., Leonard, E. S., Martins, D. M. & Niell, C. M. Joint coding of visual input and eye/head position in V1 of freely moving mice. Neuron 110, 3897–3906 (2022).

  23. Land, M. Eye movements in man and other animals. Vis. Res. 162, 1–7 (2019).

  24. Meyer, A. F., O’Keefe, J. & Poort, J. Two distinct types of eye-head coupling in freely moving mice. Curr. Biol. 30, 2116–2130.e6 (2020).

  25. King, C. W., Ledochowitsch, P., Buice, M. A. & de Vries, S. E. Saccade-responsive visual cortical neurons do not exhibit distinct visual response properties. eNeuro 10, ENEURO.0051-23.2023 (2023).

  26. Ibbotson, M. & Krekelberg, B. Visual perception and saccadic eye movements. Curr. Opin. Neurobiol. 21, 553–558 (2011).

  27. Seabrook, T. A., Burbridge, T. J., Crair, M. C. & Huberman, A. D. Architecture, function, and assembly of the mouse visual system. Annu. Rev. Neurosci. 40, 499–538 (2017).

  28. Huberman, A. D. & Niell, C. M. What can mice tell us about how vision works? Trends Neurosci. 34, 464–473 (2011).

  29. Ringach, D. L., Sapiro, G. & Shapley, R. A subspace reverse-correlation technique for the study of visual neurons. Vis. Res. 37, 2455–2464 (1997).

  30. Bleckert, A., Schwartz, G. W., Turner, M. H., Rieke, F. & Wong, R. O. L. Visual space is represented by nonmatching topographies of distinct mouse retinal ganglion cell types. Curr. Biol. 24, 310–315 (2014).

  31. van Beest, E. H. et al. Mouse visual cortex contains a region of enhanced spatial resolution. Nat. Commun. 12, 4029 (2021).

  32. Zahler, S. H., Taylor, D. E., Wong, J. Y., Adams, J. M. & Feinberg, E. H. Superior colliculus drives stimulus-evoked directionally biased saccades and attempted head movements in head-fixed mice. eLife 10, e73081 (2021).

  33. Holmgren, C. D. et al. Visual pursuit behavior in mice maintains the pursued prey on the retinal region with least optic flow. eLife 10, e70838 (2021).

  34. Gallant, J. L., Connor, C. E. & Van Essen, D. C. Neural activity in areas V1, V2 and V4 during free viewing of natural scenes compared to controlled viewing. NeuroReport 9, 1673–1678 (1998).

  35. Angelaki, D. E. & Cullen, K. E. Vestibular system: the many facets of a multimodal sense. Annu. Rev. Neurosci. 31, 125–150 (2008).

  36. Medrea, I. & Cullen, K. E. Multisensory integration in early vestibular processing in mice: the encoding of passive vs. active motion. J. Neurophysiol. 110, 2704–2717 (2013).

  37. Duffy, F. H. & Burchfiel, J. L. Eye movement-related inhibition of primate visual neurons. Brain Res. 89, 121–132 (1975).

  38. Adey, W. R. & Noda, H. Influence of eye movements on geniculo-striate excitability in the cat. J. Physiol. 235, 805–821 (1973).

  39. Toyama, K., Kimura, M. & Komatsu, Y. Activity of the striate cortex cells during saccadic eye movements of the alert cat. Neurosci. Res. 1, 207–222 (1984).

  40. Navon, D. Forest before trees: the precedence of global features in visual perception. Cogn. Psychol. 9, 353–383 (1977).

  41. Oliva, A. & Schyns, P. G. Coarse blobs or fine edges? Evidence that information diagnosticity changes the perception of complex visual stimuli. Cogn. Psychol. 34, 72–107 (1997).

  42. Petras, K., Ten Oever, S., Jacobs, C. & Goffaux, V. Coarse-to-fine information integration in human vision. Neuroimage 186, 103–112 (2019).

  43. Bredfeldt, C. E. & Ringach, D. L. Dynamics of spatial frequency tuning in macaque V1. J. Neurosci. 22, 1976–1984 (2002).

  44. Skyberg, R., Tanabe, S., Chen, H. & Cang, J. Coarse-to-fine processing drives the efficient coding of natural scenes in mouse visual cortex. Cell Rep. 38, 110606 (2022).

  45. Mazer, J. A., Vinje, W. E., McDermott, J., Schiller, P. H. & Gallant, J. L. Spatial frequency and orientation tuning dynamics in area V1. Proc. Natl Acad. Sci. USA 99, 1645–1650 (2002).

  46. Purushothaman, G., Chen, X., Yampolsky, D. & Casagrande, V. A. Neural mechanisms of coarse-to-fine discrimination in the visual cortex. J. Neurophysiol. 112, 2822–2833 (2014).

  47. Vreysen, S., Zhang, B., Chino, Y. M., Arckens, L. & Van den Bergh, G. Dynamics of spatial frequency tuning in mouse visual cortex. J. Neurophysiol. 107, 2937–2949 (2012).

  48. Marr, D. Vision: A Computational Investigation into the Human Representation and Processing of Visual Information (MIT Press, 2010).

  49. Rucci, M., Iovin, R., Poletti, M. & Santini, F. Miniature eye movements enhance fine spatial detail. Nature 447, 851–854 (2007).

  50. Chen, Y. & Qian, N. A coarse-to-fine disparity energy model with both phase-shift and position-shift receptive field mechanisms. Neural Comput. 16, 1545–1577 (2004).

  51. Allen, E. A. & Freeman, R. D. Dynamic spatial processing originates in early visual pathways. J. Neurosci. 26, 11763–11774 (2006).

  52. Piscopo, D. M., El-Danaf, R. N., Huberman, A. D. & Niell, C. M. Diverse visual features encoded in mouse lateral geniculate nucleus. J. Neurosci. 33, 4642–4656 (2013).

  53. Gao, E., DeAngelis, G. C. & Burkhalter, A. Parallel input channels to mouse primary visual cortex. J. Neurosci. 30, 5912–5926 (2010).

  54. Wekselblatt, J. B., Flister, E. D., Piscopo, D. M. & Niell, C. M. Large-scale imaging of cortical dynamics during sensory perception and behavior. J. Neurophysiol. 115, 2852–2866 (2016).

  55. Lopes, G. et al. Bonsai: an event-based framework for processing and controlling data streams. Front. Neuroinform. 9, 7 (2015).

  56. Kleiner, M., Brainard, D. & Pelli, D. What’s new in Psychtoolbox-3? Perception 36 ECVP Abstract Supplement (2007).

  57. Niell, C. M. & Stryker, M. P. Highly selective receptive fields in mouse visual cortex. J. Neurosci. 28, 7520–7536 (2008).

  58. Senzai, Y., Fernandez-Ruiz, A. & Buzsáki, G. Layer-specific physiological features and interlaminar interactions in the primary visual cortex of the mouse. Neuron 101, 500–513 (2019).

  59. Mathis, A. et al. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 21, 1281–1289 (2018).

  60. Dayan, P. & Abbott, L. F. Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems (MIT Press, 2005).

  61. Mitchell, J. F., Reynolds, J. H. & Miller, C. T. Active vision in marmosets: a model system for visual neuroscience. J. Neurosci. 34, 1183–1194 (2014).

  62. Nummela, S. U. et al. Psychophysical measurement of marmoset acuity and myopia. Dev. Neurobiol. 77, 300–313 (2017).

  63. Spitler, K. M. & Gothard, K. M. A removable silicone elastomer seal reduces granulation tissue growth and maintains the sterility of recording chambers for primate neurophysiology. J. Neurosci. Methods 169, 23–26 (2008).

  64. Yates, J. L. et al. Detailed characterization of neural selectivity in free viewing primates. Nat. Commun. 14, 3656 (2023).

  65. Cloherty, S. L., Yates, J. L., Graf, D., DeAngelis, G. C. & Mitchell, J. F. Motion perception in the common marmoset. Cereb. Cortex 30, 2658–2672 (2020).

  66. Eastman, K. M. & Huk, A. C. PLDAPS: a hardware architecture and software toolbox for neurophysiology requiring complex visual stimuli and online behavioral control. Front. Neuroinform. 6, 1 (2012).

  67. Senzai, Y. & Scanziani, M. A cognitive process occurring during sleep is revealed by rapid eye movements. Science 377, 999–1004 (2022).

Acknowledgements

We thank A. Huk, C. Miller, D. Leopold, K. Bieszczad, M. Goard and members of the Niell and Mitchell laboratory for conversations and feedback on the manuscript. This work was supported by National Institutes of Health grants no. UF1NS116377 (C.M.N. and J.F.M.), no. R01NS121919-01 (C.M.N.), no. 4R00EY032179-03 (J.L.Y.) and no. R01EY030998-02 (J.F.M.).

Author information

Contributions

P.R.L.P. and C.M.N. conceived the project. P.R.L.P. and E.S.P.L. led mouse experiments. D.M.M. and C.M.N. led data analysis. N.M.C. and S.L.S. contributed to mouse experiments. E.T.T.A. contributed to data analysis. M.C.S. generated the audio track from mouse neural activity. J.L.Y. and J.F.M. performed marmoset experiments. J.L.Y., J.F.M. and D.M.M. performed marmoset data analysis. All authors contributed to writing and editing the manuscript.

Corresponding authors

Correspondence to Jude F. Mitchell or Cristopher M. Niell.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Nature Neuroscience thanks Aman Saleem and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 Characterization of free movement.

a. Mean head pitch and roll during free motion for one example recording. Pitch mean = −15.1 ± 0.02 deg; roll mean = 9.2 ± 0.01 deg. b. Mean head pitch and roll, indicating the center point during free movement. Each point is a mouse (n = 9 mice). Black bars indicate mean and standard error (pitch: −20.8 ± 3.1 deg; roll: 4.5 ± 3.3 deg). The recording from a is shown in orange. c. Rate of gaze-shifting (left; run median = 106 ± 6 saccades/min; still median = 50 ± 3 saccades/min) and compensatory (right; run median = 362 ± 11 saccades/min; still median = 229 ± 15 saccades/min) movements during periods of locomotion faster than 2 cm/s measured from the top camera (‘run’), compared to periods of slower locomotion, fine motion and/or stationary periods below 2 cm/s (‘still’). Each point is a mouse (n = 9 mice). Black bars indicate median and standard error. d. Amplitude of position change for eye (left), head (middle) and gaze (right; defined as eye + head) during gaze-shifting and compensatory eye/head movements at the onset of the movement, for the example recording used in a. e. Scatter plot of eye and head velocities subsampled (25×) from the example recording used in a, showing compensatory, gaze-shifting and intermediate movements, the last of which are excluded from the analysis in the main text. f. Amplitude of gaze changes at the onset of movement for the example recording used in a. g. Median ± SEM amplitude of gaze change for all recordings. Each recording is a point (n = 9 mice). The recording in f is shown in orange. Compensatory: 0.73 ± 0.02 deg; intermediate: 2.63 ± 0.01 deg; gaze-shifting: 8.76 ± 0.29 deg. h. PETHs for example cells from Fig. 1g, including the PETH for responses to intermediate saccades in black. i. Normalized PETHs of gaze-shifting (left), intermediate (middle) and compensatory (right) eye/head movements for 100 example units with a baseline firing rate >2 Hz, with the median of all cells (n = 716) overlaid.
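The separation of compensatory, intermediate and gaze-shifting movements in panels e–g can be sketched as a threshold on gaze velocity (eye + head). This is a minimal illustration, not the paper's implementation; the function name and the threshold values are hypothetical placeholders.

```python
import numpy as np

def classify_movements(eye_vel, head_vel, comp_max=60.0, shift_min=120.0):
    """Label eye/head movements by absolute gaze velocity (eye + head, deg/s).

    Compensatory eye movements cancel head motion, leaving gaze velocity near
    zero; gaze shifts add eye and head motion, producing high gaze velocity.
    The comp_max/shift_min thresholds here are illustrative, not the values
    used in the study.
    """
    gaze_vel = np.abs(np.asarray(eye_vel) + np.asarray(head_vel))
    return np.where(gaze_vel < comp_max, "compensatory",
                    np.where(gaze_vel > shift_min, "gaze_shift", "intermediate"))
```

Movements falling between the two thresholds would be tagged intermediate and excluded, as in the main-text analysis.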

Extended Data Fig. 2 Additional characterization of gaze shift response types.

a. PCA of gaze shift PETHs. Only the two PCs with the highest explained variance are shown. Cells in the scatter plot are colored by the cluster they were assigned by k-means clustering of the PCs. b. Fraction of units in each gaze shift response cluster. c. Latencies of peak responses were significantly different for all comparisons between clusters (p < 0.05, with no effect of experimental session, p = 0.220, linear mixed effects model; early vs. late p = 7.10e-42, early vs. biphasic p = 3.86e-116, early vs. negative p = 3.83e-162, late vs. biphasic p = 2.32e-25, late vs. negative p = 2.30e-65, biphasic vs. negative p = 6.67e-22). d. Fraction of putative cell types in each gaze shift response cluster. Excitatory and inhibitory groups were identified by k-means clustering on spike waveforms (waveforms shown above). e. Median ± SEM baseline firing rate of units during freely moving (left) and head-fixed (right) recordings (n = 9 mice, n = 716 cells). Freely moving baseline was calculated from the pre-saccadic period before gaze shifts. Head-fixed baseline was calculated as the firing rate during presentation of a gray screen during head fixation. f. Scatter plot of head-fixed and freely moving baseline firing rates. Each point is a cell. Linear regression shown as dashed black line (early: r = 0.87, p = 1.01e-26, m = 1.94; late: r = 0.70, p = 7.06e-21, m = 1.33; biphasic: r = 0.70, p = 3.01e-26, m = 1.01; negative: r = 0.69, p = 1.45e-10, m = 0.97). g. Gaze shift left/right direction selectivity index by cluster. h. Laminar depth of all cells, determined using the local field potential from multi-unit activity power along each shank of the probe. Black outline shows the distribution of depths for all cells. Dashed line (0 μm) is the estimated depth of cortical layer 5, to which depths were aligned. i. Normalized horizontal angular velocity tuning for all cells, separated by response cluster. Positive values for angular velocity represent each unit’s preferred horizontal direction of gaze shift. j. PETH for compensatory eye/head movements for cells responsive to compensatory movements (n = 48/716). Only the preferred direction is shown. Responsiveness was defined as 10% modulation and modulation by at least 1 sp/s. k. Percent of each gaze shift response cluster that is responsive to compensatory movements (total=48/716, early=4/82, late=5/135, biphasic=17/170, negative=15/66, unresponsive=7/263). l. Same as j, grouped by gaze shift response cluster.
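The PETH clustering in panels a–b (PCA followed by k-means on the PC scores) can be sketched as below. This is a simplified, numpy-only version with a deterministic initialization along PC1; the function name and defaults are our own, and a production analysis would typically use a library implementation of k-means.

```python
import numpy as np

def cluster_peths(peths, n_pcs=2, n_clusters=4, n_iter=50):
    """Project PETHs (cells x time bins) onto the top PCs, then k-means.

    Centers are initialized spread along PC1 for determinism; the real
    pipeline may use random restarts instead.
    """
    X = peths - peths.mean(axis=0)              # center each time bin
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    pcs = X @ Vt[:n_pcs].T                      # PC scores per cell
    order = np.argsort(pcs[:, 0])
    init = order[np.linspace(0, len(order) - 1, n_clusters).astype(int)]
    centers = pcs[init].astype(float).copy()
    for _ in range(n_iter):                     # Lloyd's algorithm
        dists = np.linalg.norm(pcs[:, None, :] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = pcs[labels == k].mean(axis=0)
    return pcs, labels
```

Each cell's cluster label then determines its color in the PC scatter plot of panel a.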

Extended Data Fig. 3 Cross validation of response latencies.

a. Cross-validation for mouse gaze shift PETHs of all responsive cells. Gaze shift times were randomly divided into two sets used to calculate PETHs in the train (left) and test (right) sets. The test set was sorted by the latency of the positive peak in the train set. b. Latency of gaze shift response for train versus test sets (r = 0.870, p = 2.51e-140). c. Same as a for marmoset saccades. d. Same as b for marmoset saccades (r = 0.875, p = 1.44e-106).
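The split-half procedure above can be sketched as follows, assuming event-aligned spike counts have already been binned into a trials × time array; the function and variable names are hypothetical.

```python
import numpy as np

def split_half_latency(aligned_counts, bin_times, rng):
    """Randomly split trials in half; return the positive-peak latency of
    each half's mean PETH, for train/test comparison."""
    idx = rng.permutation(len(aligned_counts))
    half = len(idx) // 2
    train, test = idx[:half], idx[half:]
    latency = lambda peth: bin_times[np.argmax(peth)]
    return (latency(aligned_counts[train].mean(axis=0)),
            latency(aligned_counts[test].mean(axis=0)))
```

Sorting the test-set PETHs by the train-set latencies, as in panel a, then shows whether the temporal sequence is reproducible rather than an artifact of sorting noise.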

Extended Data Fig. 4 Additional characterization of responses in the dark.

a. Fraction of cells responsive in the dark condition (responsive=9/269, unresponsive=260/269). b. Dark condition PETHs for cells that responded to gaze shifts in freely moving dark conditions. Units are colored by clustering from responses in light condition. n = 9/269 (early=5, late=1, biphasic=0, negative=1, unresponsive=2). c. Same as a for the light condition (responsive=191/269, unresponsive=78/269). d. Responses of units in b for the light condition.

Extended Data Fig. 5 Additional characterization of drifting gratings responses.

a. Head-fixed drifting gratings PETHs for gaze shift response clusters, with the mean response overlaid. The stimulus was presented for 1 s with a gray ISI between stimuli. n = 9 mice; n = 384/716 cells responsive to gratings (early=71, late=96, biphasic=98, negative=29, unresponsive=90). Cells below the firing rate threshold are not shown. b. Mean ± SEM normalized gratings PETHs clustered by gaze shift response for the full stimulus presentation (top) and highlighting stimulus onset (bottom). c. Fraction of cells in each cluster with a ≥2:1 preference for one presented spatial frequency compared to the sum of responses for the two other spatial frequencies. d. Mean ± SEM temporal frequency tuning curve by cluster (multivariate two-way ANOVA, TF × cluster, F = 21.45, p = 3.45e-13). e. Temporal frequency preference for gratings-responsive cells in each gaze shift response cluster, calculated as a weighted mean of responses (n = 9 mice, n = 384 cells). Median and standard error are shown for each cluster. Bars above indicate statistical significance at p < 0.05 (linear mixed effects model, n = 9 mice, n = 384 cells; early vs. late p = 3.64e-7, early vs. biphasic p = 2.24e-21, early vs. negative p = 4.42e-9, late vs. biphasic p = 2.32e-6, late vs. negative p = 5.69e-2, biphasic vs. negative p = 9.37e-2). f. Weighted temporal frequency preference versus gaze shift response latency for all cells responsive to gratings. Running median ± SEM for all cells is overlaid. The color of each point indicates the cluster from gaze shift responses (r = −0.468, p = 2.12e-16). g. Same as c for temporal frequency.
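The weighted-mean preference used in panels e–f can be computed as below. Whether the averaging is done on a linear or logarithmic frequency axis is not stated here, so this linear version is an assumption, and the function name is ours.

```python
import numpy as np

def weighted_preference(freqs, responses):
    """Response-weighted mean of the presented frequencies (e.g. TF in Hz).

    Responses are rectified so that suppressed conditions do not flip the
    sign of the weights.
    """
    r = np.clip(np.asarray(responses, dtype=float), 0.0, None)
    return float((np.asarray(freqs, dtype=float) * r).sum() / r.sum())
```

For example, a cell responding twice as strongly at 16 Hz as at 1 and 4 Hz would have a preference of (1 + 4 + 2·16)/4 = 9.25 Hz.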

Extended Data Fig. 6 Temporal tuning of neurons can explain diverse responses to gaze shifts.

a. Responses to flashed sparse noise stimulus presented with an inter-stimulus interval (ISI; n = 3 mice; early=71, late=33, biphasic=9, negative=7). Left: mean ± SEM for each cluster; right: individual neuron responses overlaid with mean. b. Same as a for a continuously flashed sparse noise stimulus. c. Schematic of modeling approach. A scalar stimulus, presented continuously or with an inter-stimulus interval (ISI), is passed through a variable temporal kernel of either high, intermediate, or low temporal frequency (TF), and a non-linearity is used to generate a spiking output. d. High TF kernel. e. Intermediate TF kernel. f. Low TF kernel. g. Resulting response of model using high TF kernel to visual stimuli presented with an ISI. h. Same as g for intermediate TF kernel. i. Same as g for low TF kernel. j. Resulting response of model using high TF kernel to visual stimuli presented continuously. k. Same as j for intermediate TF kernel. l. Same as j for low TF kernel.
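The model in panels c–l is a linear–nonlinear cascade: stimulus, temporal kernel, then a rectifying nonlinearity. A minimal sketch, with kernels and threshold chosen for illustration rather than taken from the paper:

```python
import numpy as np

def ln_response(stimulus, kernel, threshold=0.0):
    """Convolve a scalar stimulus with a temporal kernel, then rectify."""
    drive = np.convolve(stimulus, kernel)[: len(stimulus)]
    return np.clip(drive - threshold, 0.0, None)

# Illustrative kernels: a fast biphasic (high-TF) filter and a slow
# monophasic (low-TF) filter.
high_tf_kernel = np.array([1.0, -1.0])
low_tf_kernel = np.ones(5) / 5.0
```

For a stimulus held on continuously, the biphasic kernel yields only a transient at onset while the slow kernel yields a sustained response, mirroring how the same temporal tuning can produce the different gaze-shift response types.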

Extended Data Fig. 7 Additional characterization of marmoset saccade response types.

a. PCA of marmoset gaze shift PETHs for the 2 PCs with highest explained variance, colored by k-means clusters. b. Fraction of units in each saccade response cluster. c. Median ± SEM baseline firing rate of units in each cluster (n = 2 marmosets, n = 238 cells). d. Fraction of cells with maximal response to each presented spatial frequency.

Supplementary information

Supplementary Information

Supplementary Text.

Reporting Summary

Supplementary Video 1

A temporal sequence across the population following gaze shifts. Corresponds to Fig. 2d. Experimental data from a 3-s period of freely moving activity. Top left, eye camera video. Top right, estimated visual input, based on the world camera video corrected for eye position (ref. 22). Bottom, spike rasters for simultaneously recorded gaze shift-responsive units (n = 99), color-coded by gaze-shift cluster and ordered from short to long gaze-shift response latency along the y axis. Black arrows above the spike rasters indicate the times of gaze shifts. In the audio channel, individual spikes from 35 units are represented by notes mapped to pitch based on the temporal sorting, from short latency as low pitch to long latency as high pitch. Saccade times are represented as percussive notes.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Parker, P.R.L., Martins, D.M., Leonard, E.S.P. et al. A dynamic sequence of visual processing initiated by gaze shifts. Nat Neurosci 26, 2192–2202 (2023). https://doi.org/10.1038/s41593-023-01481-7

