Survey of spiking in the mouse visual system reveals functional hierarchy


The anatomy of the mammalian visual system, from the retina to the neocortex, is organized hierarchically1. However, direct observation of cellular-level functional interactions across this hierarchy is lacking due to the challenge of simultaneously recording activity across numerous regions. Here we describe a large, open dataset—part of the Allen Brain Observatory2—that surveys spiking from tens of thousands of units in six cortical and two thalamic regions in the brains of mice responding to a battery of visual stimuli. Using cross-correlation analysis, we reveal that the organization of inter-area functional connectivity during visual stimulation mirrors the anatomical hierarchy from the Allen Mouse Brain Connectivity Atlas3. We find that four classical hierarchical measures—response latency, receptive-field size, phase-locking to drifting gratings and response decay timescale—are all correlated with the hierarchy. Moreover, recordings obtained during a visual task reveal that the correlation between neural activity and behavioural choice also increases along the hierarchy. Our study provides a foundation for understanding coding and signal propagation across hierarchically organized cortical and thalamic visual areas.

Fig. 1: A standardized pipeline for electrophysiology in the mouse visual system.
Fig. 2: Functional connectivity recapitulates the anatomical hierarchy.
Fig. 3: Four measures of hierarchical processing applied to the mouse visual system.
Fig. 4: Higher-order areas signal behaviourally relevant changes in image identity more strongly than lower-order areas.

Data availability

The data from all 58 passive viewing experiments used to generate main text Figs. 1–3 are available for download in Neurodata Without Borders (NWB) format via the AllenSDK. Example Jupyter Notebooks for accessing the data can be found at

The Neurodata Without Borders files are also available on the DANDI Archive and as an AWS public dataset.

The metrics table used to generate Fig. 4e–h (active behaviour experiments) is available in the GitHub repository for this manuscript.

Code availability

Code for the following purposes is available from these repositories: generating manuscript figures; data pre-processing and unit metrics; spike-sorting; OPT post-processing; calculating stimulus metrics; and data acquisition.

The following open-source software was used: NumPy81, SciPy82, IPython83, Matplotlib84, Pandas85, xarray86, scikit-learn87, VTK88, DeepLabCut79,89, statsmodels90, allenCCF70, tifffile, Jupyter and pynwb.


  1. Felleman, D. J. & Van Essen, D. C. Distributed hierarchical processing in the primate cerebral cortex. Cereb. Cortex 1, 1–47 (1991).

  2. de Vries, S. E. J. et al. A large-scale standardized physiological survey reveals functional organization of the mouse visual cortex. Nat. Neurosci. 23, 138–151 (2020).

  3. Harris, J. A. et al. Hierarchical organization of cortical and thalamic connectivity. Nature 575, 195–202 (2019).

  4. Carandini, M. et al. Do we know what the early visual system does? J. Neurosci. 25, 10577–10597 (2005).

  5. Olshausen, B. A. & Field, D. J. How close are we to understanding V1? Neural Comput. 17, 1665–1699 (2005).

  6. Jun, J. J. et al. Fully integrated silicon probes for high-density recording of neural activity. Nature 551, 232–236 (2017).

  7. Hubel, D. H. & Wiesel, T. N. Receptive fields, binocular interaction and functional architecture in the cat’s visual cortex. J. Physiol. (Lond.) 160, 106–154 (1962).

  8. Fukushima, K. Neocognitron: a self organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biol. Cybern. 36, 193–202 (1980).

  9. Krizhevsky, A., Sutskever, I. & Hinton, G. E. ImageNet classification with deep convolutional neural networks. in Proc. 25th International Conference on Neural Information Processing Systems (eds Pereira, F. et al.) 1097–1105 (NeurIPS, 2012).

  10. Riesenhuber, M. & Poggio, T. Hierarchical models of object recognition in cortex. Nat. Neurosci. 2, 1019–1025 (1999).

  11. Bullier, J. Integrated model of visual processing. Brain Res. Rev. 36, 96–107 (2001).

  12. Chaudhuri, R., Knoblauch, K., Gariel, M.-A., Kennedy, H. & Wang, X.-J. A large-scale circuit mechanism for hierarchical dynamical processing in the primate cortex. Neuron 88, 419–431 (2015).

  13. Murray, J. D. et al. A hierarchy of intrinsic timescales across primate cortex. Nat. Neurosci. 17, 1661–1663 (2014).

  14. Rockland, K. S. & Pandya, D. N. Laminar origins and terminations of cortical connections of the occipital lobe in the rhesus monkey. Brain Res. 179, 3–20 (1979).

  15. Schmolesky, M. T. et al. Signal timing across the macaque visual system. J. Neurophysiol. 79, 3272–3278 (1998).

  16. Yamins, D. L. K. & DiCarlo, J. J. Using goal-driven deep learning models to understand sensory cortex. Nat. Neurosci. 19, 356–365 (2016).

  17. Gămănuţ, R. et al. The mouse cortical connectome, characterized by an ultra-dense cortical graph, maintains specificity by distinct connectivity profiles. Neuron 97, 698–715.e10 (2018).

  18. Glickfeld, L. L. & Olsen, S. R. Higher-order areas of the mouse visual cortex. Annu. Rev. Vis. Sci. 3, 251–273 (2017).

  19. Wang, Q., Sporns, O. & Burkhalter, A. Network analysis of corticocortical connections reveals ventral and dorsal processing streams in mouse visual cortex. J. Neurosci. 32, 4386–4399 (2012).

  20. Wang, Q. & Burkhalter, A. Area map of mouse visual cortex. J. Comp. Neurol. 502, 339–357 (2007).

  21. Han, Y. et al. The logic of single-cell projections from visual cortex. Nature 556, 51–56 (2018).

  22. Allen, W. E. et al. Thirst regulates motivated behavior through modulation of brainwide neural population dynamics. Science 364, 253 (2019).

  23. Steinmetz, N. A., Zatka-Haas, P., Carandini, M. & Harris, K. D. Distributed coding of choice, action and engagement across the mouse brain. Nature 576, 266–273 (2019).

  24. Stringer, C. et al. Spontaneous behaviors drive multidimensional, brainwide activity. Science 364, 255 (2019).

  25. Siegle, J. H. et al. Reconciling functional differences in populations of neurons recorded with two-photon imaging and electrophysiology. Preprint at (2020).

  26. Pachitariu, M., Steinmetz, N. A., Kadir, S. N., Carandini, M. & Harris, K. D. Fast and accurate spike sorting of high-channel count probes with KiloSort. In Advances in Neural Information Processing Systems 29 (eds Lee, D. et al.) 4448–4456 (NeurIPS, 2016).

  27. Wang, Q. et al. The Allen Mouse Brain Common Coordinate Framework: a 3D reference atlas. Cell 181, 936–953.e20 (2020).

  28. Jia, X., Tanabe, S. & Kohn, A. γ and the coordination of spiking activity in early visual cortex. Neuron 77, 762–774 (2013).

  29. Smith, M. A. & Kohn, A. Spatial and temporal scales of neuronal correlation in primary visual cortex. J. Neurosci. 28, 12591–12603 (2008).

  30. Zandvakili, A. & Kohn, A. Coordinated neuronal activity enhances corticocortical communication. Neuron 87, 827–839 (2015).

  31. Freeman, J., Ziemba, C. M., Heeger, D. J., Simoncelli, E. P. & Movshon, J. A. A functional and perceptual signature of the second visual area in primates. Nat. Neurosci. 16, 974–981 (2013).

  32. Hubel, D. Eye, Brain, and Vision Vol. 22 (Scientific American Press, 1988).

  33. Lennie, P. Single units and visual cortical organization. Perception 27, 889–935 (1998).

  34. Matteucci, G., Bellacosa Marotti, R., Riggi, M., Rosselli, F. B. & Zoccolan, D. Nonlinear processing of shape information in rat lateral extrastriate cortex. J. Neurosci. 39, 1649–1670 (2019).

  35. Wypych, M. et al. Standardized F1: a consistent measure of strength of modulation of visual responses to sine-wave drifting gratings. Vision Res. 72, 14–33 (2012).

  36. Runyan, C. A., Piasini, E., Panzeri, S. & Harvey, C. D. Distinct timescales of population coding across cortex. Nature 548, 92–96 (2017).

  37. Garrett, M. et al. Experience shapes activity dynamics and stimulus coding of VIP inhibitory cells. eLife 9, e50340 (2020).

  38. Groblewski, P. A. et al. Characterization of learning, motivation, and visual perception in five transgenic mouse lines expressing GCaMP in distinct cell populations. Front. Behav. Neurosci. 14, 104 (2020).

  39. Grimm, S., Escera, C., Slabu, L. & Costa-Faidella, J. Electrophysiological evidence for the hierarchical organization of auditory change detection in the human brain. Psychophysiology 48, 377–384 (2011).

  40. Dürschmid, S. et al. Hierarchy of prediction errors for auditory events in human temporal and frontal cortex. Proc. Natl Acad. Sci. USA 113, 6755–6760 (2016).

  41. Vinken, K., Vogels, R. & Op de Beeck, H. Recent visual experience shapes visual processing in rats through stimulus-specific adaptation and response enhancement. Curr. Biol. 27, 914–919 (2017).

  42. Koch, C. & Reid, R. C. Observatories of the mind. Nature 483, 397–398 (2012).

  43. Issa, E. B., Cadieu, C. F. & DiCarlo, J. J. Neural dynamics at successive stages of the ventral visual stream are consistent with hierarchical error signals. eLife 7, e42870 (2018).

  44. Keller, G. B. & Mrsic-Flogel, T. D. Predictive processing: a canonical cortical computation. Neuron 100, 424–435 (2018).

  45. Zhuang, J. et al. An extended retinotopic map of mouse cortex. eLife 6, e18372 (2017).

  46. Maunsell, J. H. R. Functional visual streams. Curr. Opin. Neurobiol. 2, 506–510 (1992).

  47. Ungerleider, L. & Mishkin, M. in Analysis of Visual Behavior (eds Ingle, D. J., Goodale, M. A. & Mansfield, R. J. W.) 549–586 (MIT Press, 1982).

  48. D’Souza, R. D. et al. Canonical and noncanonical features of the mouse visual cortical hierarchy. Preprint at (2020).

  49. Murakami, T., Matsui, T. & Ohki, K. Functional segregation and development of mouse higher visual areas. J. Neurosci. 37, 9424–9437 (2017).

  50. Smith, I. T., Townsend, L. B., Huh, R., Zhu, H. & Smith, S. L. Stream-dependent development of higher visual cortical areas. Nat. Neurosci. 20, 200–208 (2017).

  51. van Hateren, J. H. & van der Schaaf, A. Independent component filters of natural images compared with simple cells in primary visual cortex. Proc. R. Soc. Lond. B 265, 359–366 (1998).

  52. Olmos, A. & Kingdom, F. A. A. A biologically inspired algorithm for the recovery of shading and reflectance images. Perception 33, 1463–1473 (2004).

  53. Lima, S. Q., Hromádka, T., Znamenskiy, P. & Zador, A. M. PINP: a new method of tagging neuronal populations for identification during in vivo electrophysiological recording. PLoS ONE 4, e6099 (2009).

  54. Madisen, L. et al. A toolbox of Cre-dependent optogenetic transgenic mice for light-induced activation and silencing. Nat. Neurosci. 15, 793–802 (2012).

  55. Zhang, F., Wang, L.-P., Boyden, E. S. & Deisseroth, K. Channelrhodopsin-2 and optical control of excitable cells. Nat. Methods 3, 785–792 (2006).

  56. Goldey, G. J. et al. Removable cranial windows for long-term imaging in awake mice. Nat. Protoc. 9, 2515–2538 (2014).

  57. Juavinett, A. L., Nauhaus, I., Garrett, M. E., Zhuang, J. & Callaway, E. M. Automated identification of mouse visual areas with intrinsic signal imaging. Nat. Protoc. 12, 32–43 (2017).

  58. Kalatsky, V. A. & Stryker, M. P. New paradigm for optical imaging: temporally encoded maps of intrinsic signal. Neuron 38, 529–545 (2003).

  59. Garrett, M. E., Nauhaus, I., Marshel, J. H. & Callaway, E. M. Topography and areal organization of mouse visual cortex. J. Neurosci. 34, 12587–12600 (2014).

  60. Fiáth, R. et al. Slow insertion of silicon probes improves the quality of acute neuronal recordings. Sci. Rep. 9, 111 (2019).

  61. Siegle, J. H. et al. Open Ephys: an open-source, plugin-based platform for multichannel electrophysiology. J. Neural Eng. 14, 045003 (2017).

  62. Peirce, J. W. PsychoPy—Psychophysics software in Python. J. Neurosci. Methods 162, 8–13 (2007).

  63. Martin, D., Fowlkes, C., Tal, D. & Malik, J. A database of human segmented natural images and its application to evaluating segmentation algorithms and measuring ecological statistics. In Proc. Eighth IEEE International Conference on Computer Vision 416–423 (IEEE, 2001).

  64. Welles, O. Touch of Evil (Universal-International, 1958).

  65. Renier, N. et al. iDISCO: a simple, rapid method to immunolabel large tissue samples for volume imaging. Cell 159, 896–910 (2014).

  66. Nguyen, D. et al. Optical projection tomography for rapid whole mouse brain imaging. Biomed. Opt. Express 8, 5637–5650 (2017).

  67. Sharpe, J. et al. Optical projection tomography as a tool for 3D microscopy and gene expression studies. Science 296, 541–545 (2002).

  68. Wong, M. D., Dazai, J., Walls, J. R., Gale, N. W. & Henkelman, R. M. Design and implementation of a custom built optical projection tomography system. PLoS ONE 8, e73491 (2013).

  69. Edelstein, A. D. et al. Advanced methods of microscope control using μManager software. J. Biol. Methods 1, 10 (2014).

  70. Shamash, P., Carandini, M., Harris, K. D. & Steinmetz, N. A. A tool for analyzing electrode tracks from slice histology. Preprint at (2018).

  71. Jia, X. et al. High-density extracellular probes reveal dendritic backpropagation and facilitate neuron classification. J. Neurophysiol. 121, 1831–1847 (2019).

  72. Hill, D. N., Mehta, S. B. & Kleinfeld, D. Quality metrics to accompany spike sorting of extracellular signals. J. Neurosci. 31, 8699–8705 (2011).

  73. Suner, S., Fellows, M. R., Vargas-Irwin, C., Nakata, G. K. & Donoghue, J. P. Reliability of signals from a chronically implanted, silicon-based electrode array in non-human primate primary motor cortex. IEEE Trans. Neural Syst. Rehabil. Eng. 13, 524–541 (2005).

  74. Schmitzer-Torbert, N., Jackson, J., Henze, D., Harris, K. & Redish, A. D. Quantitative measures of cluster quality for use in extracellular recordings. Neuroscience 131, 1–11 (2005).

  75. Chung, J. E. et al. A fully automated approach to spike sorting. Neuron 95, 1381–1394.e6 (2017).

  76. Gerstein, G. L. & Perkel, D. H. Mutual temporal relationships among neural spike trains. Biophys. J. 12, 453–473 (1972).

  77. Harrison, M. T. & Geman, S. A rate and history-preserving resampling algorithm for neural spike trains. Neural Comput. 21, 1244–1258 (2009).

  78. Matteucci, G., Bellacosa Marotti, R., Riggi, M., Rosselli, F. B. & Zoccolan, D. Nonlinear processing of shape information in rat lateral extrastriate cortex. J. Neurosci. 39, 1649–1670 (2019).

  79. Mathis, A. et al. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 21, 1281–1289 (2018).

  80. Halir, R. & Flusser, J. Numerically stable direct least squares fitting of ellipses. In Proc. Sixth International Conference in Central Europe on Computer Graphics and Visualization (WSCG, 1998).

  81. van der Walt, S., Colbert, S. C. & Varoquaux, G. The NumPy array: a structure for efficient numerical computation. Comput. Sci. Eng. 13, 22–30 (2011).

  82. Virtanen, P. et al. SciPy 1.0: fundamental algorithms for scientific computing in Python. Nat. Methods 17, 261–272 (2020).

  83. Pérez, F. & Granger, B. E. IPython: a system for interactive scientific computing. Comput. Sci. Eng. 9, 21–29 (2007).

  84. Hunter, J. D. Matplotlib: a 2D graphics environment. Comput. Sci. Eng. 9, 90–95 (2007).

  85. McKinney, W. Data structures for statistical computing in Python. In Proc. 9th Python in Science Conference (eds van der Walt, S. & Millman, J.) 51–56 (SciPy, 2010).

  86. Hoyer, S. & Hamman, J. xarray: N–D labeled arrays and datasets in Python. J. Open Res. Softw. 5, 10 (2017).

  87. Pedregosa, F. et al. Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825–2830 (2011).

  88. Schroeder, W., Martin, K. & Lorensen, B. The Visualization Toolkit 4th edn (Kitware, 2006).

  89. Nath, T. et al. Using DeepLabCut for 3D markerless pose estimation across species and behaviors. Nat. Protoc. 14, 2152–2176 (2019).

  90. Seabold, S. & Perktold, J. Statsmodels: Econometric and statistical modeling with Python. In Proc. 9th Python in Science Conference (eds van der Walt, S. & Millman, J.) 92–96 (SciPy, 2010).

  91. Zeraati, R., Engel, T. A. & Levina, A. Estimation of autocorrelation timescales with approximate Bayesian computations. Preprint at (2020).

  92. Morgenstern, N. A., Bourg, J. & Petreanu, L. Multilaminar networks of cortical neurons integrate common inputs from sensory thalamus. Nat. Neurosci. 19, 1034–1040 (2016).



Acknowledgements

We thank the Allen Institute founder, Paul G. Allen, for his vision, encouragement and support. Primary funding for this project was provided by the Allen Institute. We thank the Falconwood Foundation and the Tiny Blue Dot Foundation for additional funding. We thank the Mindscope Scientific Advisory Committee for feedback on this project. We thank J. Zhuang and Q. Wang for feedback on the manuscript and A. Zandvakili for discussions.

Author information

Authors and Affiliations



Contributions

C.K., S.R.O., J.H.S., X.J., S.G., C.B., S.M., D.J.D., S.E.J.d.V., M.A.B. and R.C.R. developed the concepts of the project. C.K., S.R.O., J.H.S., P.A.G., R.C.R., C.F., S.M., H.Z. and S.D. supervised the project. J.H.S., X.J., S.D., G.H., T.K.R., S.G., C.B., S.R.O., J.L., N.G., A.A., A.B., Y.N.B., M.A.B., L.C., N.C., S.C., A.C., T.C.C., S.E.J.d.V., D.J.D., R.D., D.F., E.C.G., R. Howard, B.H., R.I., I.K., J.K., S.L., J.A.L., P.L., J.H.L., A.L., Y.L., F.L., K.M., L.N., T.N., P.R.N., G.K.O., M.O., J.P., M. Reding, D.R., M. Robertson, S.S., C.N., C.S., D.M., T.M., K.T., M.S., D.S., J.S., D.W., A.W., R.A., D.B., M.C., E.L., K.R., K. North, B.S., E.J., K.J., J.M., K. Ngo, M.G., D.O., J.A.H. and J.D.W. undertook investigation and validation, developed the methodology and performed formal analyses. J.H.S., X.J., N.G., K.D., S.G., C.B., S.E.J.d.V., M.P., D.O., J.K., N.C., H.C., D.R., D.W., J.G., M.A.B., P.L. and G.H. developed software. J.H.S., N.G., X.J., K.D., D.F., J.G., R. Hythen, W.W. and R.Y. curated the data. C.T., S.N., L.C., L.E., N.H. and J.W.P. were involved in project administration. J.H.S., X.J., S.G., C.B., S.R.O., H.C., S.D., P.A.G. and D.S. performed data visualization. C.K., S.R.O., J.H.S. and X.J. wrote the original draft of the manuscript, with input and editing from S.M., H.C., C.B. and S.G. All authors reviewed the manuscript.

Corresponding authors

Correspondence to Joshua H. Siegle, Xiaoxuan Jia or Shawn R. Olsen.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Nature thanks Tatiana Engel and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data figures and tables

Extended Data Fig. 1 Pipeline procedures.

a–f, Summary of procedures involved in each step of the pipeline. g, Rig for parallel recording from six Neuropixels probes. Scale bar, 10 cm. h, Example retinotopic map used for targeting probes to six cortical visual areas. Scale bar, 1 mm. i, Image of Neuropixels probes during an experiment, with area boundaries from h overlaid in orange. Probe tips are marked with white dots. Scale bar, 1 mm. j, Box plot of the number of units recorded per area per experiment, after filtering based on ISI violations (<0.5), amplitude cutoff (<0.1) and presence ratio (>0.95) (see Methods and Extended Data Fig. 4 for quality metric definitions and distributions). Box plot edges represent upper and lower quartiles; centre line represents the median; whiskers represent 5th to 95th percentile range; open circles represent any data points beyond the edge of the whiskers. k, Histogram of the number of simultaneously recorded cortical and thalamic visual areas per experiment (n = 58 experiments).
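The unit-filtering step described in j can be sketched as a simple table filter. This is an illustrative sketch rather than the AllenSDK implementation; the column names ('isi_violations', 'amplitude_cutoff', 'presence_ratio') follow AllenSDK conventions but are assumptions here:

```python
import pandas as pd

def filter_units(metrics: pd.DataFrame) -> pd.DataFrame:
    """Keep units passing the thresholds quoted in the legend above:
    ISI violations < 0.5, amplitude cutoff < 0.1, presence ratio > 0.95."""
    mask = (
        (metrics["isi_violations"] < 0.5)
        & (metrics["amplitude_cutoff"] < 0.1)
        & (metrics["presence_ratio"] > 0.95)
    )
    return metrics[mask]

# Toy metrics table: the first unit passes, the second fails on ISI violations.
units = pd.DataFrame({
    "isi_violations": [0.1, 0.7],
    "amplitude_cutoff": [0.05, 0.05],
    "presence_ratio": [0.99, 0.99],
})
passing = filter_units(units)
```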

Extended Data Fig. 2 Pipeline quality control.

a–f, Major quality control metrics for each pipeline step, with examples of passing and failing experiments. The number of mice failing quality control at each stage is shown on the right.

Extended Data Fig. 3 Data processing steps.

a, Data from the Neuropixels probe are split at the hardware level into two separate streams for each electrode: spike band and LFP band. b, The spike band passes through offset subtraction, median subtraction and whitening steps before sorting. The resulting data can be viewed as an image, with dimensions of time and channels, and colours corresponding to voltage levels. c, The LFP data are downsampled to 1.25 kHz and 40 μm channel spacing before packaging. d, We use Kilosort2 to match spike templates to the raw data. The output of this algorithm can be used to reconstruct the original data using information about template shape, times and amplitudes. e, The spike and LFP data are packaged into Neurodata Without Borders (NWB) 2.0 files. f, The outputs of Kilosort2 are passed through a semi-automated quality control procedure to remove units with artefactual waveforms. Only units with obvious spike-like characteristics are used for further analysis.
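As a rough illustration of the spike-band preprocessing in b, the sketch below applies offset subtraction, common-median referencing and ZCA whitening to a toy (samples × channels) array. It stands in for, and is not, the actual pipeline code; the regularization constant and toy data are assumptions:

```python
import numpy as np

def preprocess_spike_band(data: np.ndarray) -> np.ndarray:
    """Offset subtraction, median subtraction and whitening, as described
    in panel b; data has shape (samples, channels)."""
    # Offset subtraction: remove each channel's DC offset.
    centred = data - data.mean(axis=0)
    # Median subtraction: remove the across-channel median at each sample
    # to suppress artefacts shared across channels.
    referenced = centred - np.median(centred, axis=1, keepdims=True)
    # Whitening: decorrelate channels via the inverse matrix square root
    # of the channel covariance (ZCA whitening; 1e-6 regularizer assumed).
    cov = np.cov(referenced, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    w = evecs @ np.diag(1.0 / np.sqrt(evals + 1e-6)) @ evecs.T
    return referenced @ w

rng = np.random.default_rng(0)
raw = rng.normal(size=(1000, 8)) + 5.0   # toy data with a DC offset
clean = preprocess_spike_band(raw)
```

After whitening, the channel covariance of `clean` is approximately the identity matrix.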

Extended Data Fig. 4 Unit quality metrics.

a, Density functions for twelve quality control metrics, plotted for units in cortex, hippocampus, thalamus and midbrain, aggregated across experiments. Default AllenSDK thresholds are shown as dotted lines. b, Unit selection flowchart for generating manuscript figures. Note that we do not use the default AllenSDK filters in this work, but instead use a receptive field P value of 0.01 as the primary metric for selecting units for analysis. CCFv3 structure labels used for region identification are as follows: cortex (VISp, VISl, VISrl, VISam, VISpm, VISal, VISmma, VISmmp, VISli, VIS), thalamus (LGd, LD, LP, VPM, TH, MGm, MGv, MGd, PO, LGv, VL, VPL, POL, Eth, PoT, PP, PIL, IntG, IGL, SGN, VPL, PF, RT), hippocampal formation (CA1, CA2, CA3, DG, SUB, POST, PRE, ProS, HPF), midbrain (MB, SCig, SCiw, SCsg, SCzo, SCop, PPT, APN, NOT, MRN, OP, LT, RPF), other/nonregistered (CP, ZI, grey).

Extended Data Fig. 5 Aligning units with the Common Coordinate Framework (CCFv3).

a, After each experiment, the brain is removed and cleared using a variant of the iDISCO method. b, The cleared brain is imaged at 400 rotational angles using a custom-built optical projection tomography microscope. c, We generated an isotropic 3D volume from rotational images using a computational tomography algorithm. d, Key points from the CCFv3 template brain are manually identified in each individual brain. e, Points along each fluorescently labelled probe track are manually identified in the volume. Using the key points from d, we define a warping function to translate points along the probe axis into the Common Coordinate Framework. f, We then align the regional boundaries to boundaries in the physiological data, primarily the decrease in unit density at the border between the cortex and hippocampus, and between the hippocampus and thalamus. The shaded area represents unit density on each recording site, and pink dots represent low-frequency LFP power (<10 Hz) along the probe axis. g, Finally, units in the database are mapped to a 3D location in the CCFv3 and are assigned a structure label. Units in cortex are also assigned a relative depth (0, surface; 1, white matter) and a layer label (L1, L2/3, L4, L5 or L6), on the basis of the annotation of the CCFv3 template volume (10-μm resolution).

Extended Data Fig. 6 Details of the visual stimulus set and receptive field mapping procedure.

a, Example frames from each type of stimulus. Green arrows indicate direction of motion. The natural scene image is shown for illustrative purposes. The natural scene images shown to the mice are from refs. 51 and 52. b, Timing diagram for visual stimulus set #1, known as ‘Brain Observatory 1.1’. c, Timing diagram for visual stimulus set #2, known as ‘Functional Connectivity’. d, Receptive field mapping used 20° diameter drifting gratings flashed for 250 ms in each of 81 randomized locations on the screen. A spike raster for one unit shows the timing of spikes on each of 45 trials with the stimulus at a particular location. Collapsing over trials yields a peristimulus time histogram for each location. Collapsing over time yields a spike count for each spatial bin. A matrix of spike counts represents the receptive field for this unit. e, To calculate receptive field properties, the receptive field is first smoothed with a Gaussian filter, and all pixels above a threshold value are selected. The centre of mass of the above-threshold pixels indicates the receptive field location, while the total number of above-threshold pixels indicates the area. These processing steps are shown for 25 receptive fields randomly chosen from one experiment.
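The processing steps in e can be sketched with scipy.ndimage. The smoothing sigma and threshold fraction below are illustrative placeholders, not the values used in the pipeline; the 9 × 9 grid corresponds to the 81 stimulus locations:

```python
import numpy as np
from scipy import ndimage

def rf_location_and_area(spike_counts, sigma=1.0, threshold_frac=0.5):
    """Smooth the spike-count matrix with a Gaussian filter, threshold it,
    then take the centre of mass (RF location) and the count of
    above-threshold pixels (RF area, in bins), as described in panel e."""
    smoothed = ndimage.gaussian_filter(spike_counts.astype(float), sigma)
    above = smoothed > threshold_frac * smoothed.max()
    location = ndimage.center_of_mass(above)
    area_bins = int(above.sum())
    return location, area_bins

# Toy receptive field: a single hot bin at the centre of the 9 x 9 grid.
rf = np.zeros((9, 9))
rf[4, 4] = 10.0
loc, area = rf_location_and_area(rf)
```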

Extended Data Fig. 7 Functional connections between visual cortical areas.

a, Peak offset distributions aggregated across 25 mice for each area combination, during drifting gratings presentation. The total number of pairs (n) is labelled in each sub-panel. Dashed black line indicates zero time lag. Dashed red line indicates the median of the distribution. b, Fraction of within- and between-area unit pairs exhibiting sharp peaks, out of all simultaneously recorded pairs. c, Combined median of peak offsets across mice (averaged across mice; n = 25 mice in total) for each pair of cortical areas. d, Correlation between the median peak offset and the difference in hierarchy scores among 21 pairs (lower triangle and diagonal of the matrix). e, Relationship between average 3D Euclidean distance between units simultaneously recorded in each pair of areas (following registration to the CCFv3) and their hierarchy score difference. rP, Pearson correlation coefficient; rS, Spearman’s rank correlation coefficient. f, Average number of sharp peak connections per mouse for jitter-corrected CCGs calculated during spontaneous activity (30 min grey screen period). Pixels masked with grey indicate no sharp peaks were detected. g, Average directionality score across mice during spontaneous activity.
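The jitter-corrected cross-correlograms (CCGs) underlying these peak-offset distributions can be sketched as follows. The bin size (1 ms), jitter window, surrogate count and toy firing rates are illustrative assumptions; the analysis used in the paper (see its Methods) differs in detail:

```python
import numpy as np

def ccg(a, b, max_lag):
    """Cross-correlogram between two binned spike trains: entry k counts
    coincidences at lag k - max_lag (bins)."""
    n = len(a)
    return np.array([
        np.dot(a[max(0, -lag):n - max(0, lag)],
               b[max(0, lag):n - max(0, -lag)])
        for lag in range(-max_lag, max_lag + 1)
    ], dtype=float)

def jitter_corrected_ccg(a, b, max_lag=13, window=25, n_surr=50, seed=0):
    """Subtract the mean CCG of surrogates whose spikes are shuffled within
    short windows; this removes slow co-modulation while preserving firing
    rates. Returns the corrected CCG and the lag of its peak."""
    rng = np.random.default_rng(seed)

    def jitter(x):
        out = x.copy()
        for s in range(0, len(x), window):
            out[s:s + window] = rng.permutation(out[s:s + window])
        return out

    raw = ccg(a, b, max_lag)
    surr = np.mean([ccg(jitter(a), jitter(b), max_lag)
                    for _ in range(n_surr)], axis=0)
    corrected = raw - surr
    lags = np.arange(-max_lag, max_lag + 1)
    return corrected, lags[np.argmax(corrected)]

# Toy example: unit b fires 3 ms after unit a, so the peak offset is +3 ms.
rng = np.random.default_rng(1)
a = (rng.random(5000) < 0.05).astype(float)
b = np.roll(a, 3)
_, offset = jitter_corrected_ccg(a, b)
```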

Extended Data Fig. 8 Simulation of functional connectivity profiles for different network structures.

a, Directionality score (DS) and total hierarchy score calculated from actual data. Left, an example distribution of peak offsets between V1 (source) and LM (target); middle, DS matrix for all area combinations; right, mean DS for each source area to all target areas, which gradually decreases along the hierarchy. The maximum difference of the mean DS across areas represents the total hierarchy score for the real network. b–f, Simulations based on different hypothetical network structures. Because the standard deviation of the peak offset distribution in our measured CCG time lag distribution is 3.7 ± 0.2 ms and the median CCG time lag of neighbouring areas is 1.1 ± 0.4 ms, we simulated Gaussian distributions of the model peak offsets with σ = 4 and μ = 1 for neighbouring hierarchical levels (μ = L_i − L_j between hierarchical levels i and j). See Methods for additional details of this simulation. b, A fully recurrent network in which all nodes (areas) are at the same hierarchical level and have unbiased reciprocal connections (μ = 0). c, A two-level, one-to-all network that models parallel feedforward projections from V1, with all other areas recurrently connected with one another in an unbiased way. d, A three-level network, assuming V1 at the lowest level; RL, LM and AL at the second level; and AM and PM at the top level. e, A six-level hierarchical network with each area at a distinct hierarchical level. Network parameters were constrained by real data (σ = 4 and μ = 1 for neighbouring hierarchical levels, and μ = L_i − L_j between any hierarchical levels i and j). f, A six-level hierarchical network with a narrow distribution of peak offsets (σ = 1) that simulates a paucity of feedback connections.
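A minimal version of this simulation is sketched below, assuming a simple definition of the directionality score as the difference between the fractions of positive and negative peak offsets (the paper's exact definition is given in its Methods):

```python
import numpy as np

def directionality_score(offsets):
    """Assumed definition for this sketch: fraction of positive minus
    fraction of negative peak offsets."""
    return np.mean(offsets > 0) - np.mean(offsets < 0)

def simulate_network(levels, sigma=4.0, n_pairs=2000, seed=0):
    """Simulate Gaussian peak-offset distributions with mu = L_j - L_i
    between hierarchical levels (as in the six-level model above) and
    return the DS matrix across all source/target area pairs."""
    rng = np.random.default_rng(seed)
    n = len(levels)
    ds = np.zeros((n, n))
    for i in range(n):            # source area
        for j in range(n):        # target area
            offsets = rng.normal(levels[j] - levels[i], sigma, n_pairs)
            ds[i, j] = directionality_score(offsets)
    return ds

# Six-level hierarchy: each area at its own level, as in panel e.
ds = simulate_network(levels=[0, 1, 2, 3, 4, 5])
mean_ds = ds.mean(axis=1)                       # mean DS per source area
hierarchy_score = mean_ds.max() - mean_ds.min() # total hierarchy score
```

As in the real data, the mean DS decreases monotonically from the lowest to the highest simulated area.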

Extended Data Fig. 9 Statistics and additional analysis of hierarchy measures.

a, P values for pairwise comparisons of time to first spike between areas (two-sided Wilcoxon rank-sum test with Benjamini-Hochberg false discovery rate correction). b, Comparison between time-to-first-spike measured in response to the onset of the flash stimulus (‘flash’) versus during the inter-stimulus interval, which corresponds to spontaneous firing (‘spontaneous’). The colour scheme is the same as in Fig. 4; error bars represent mean ± 95% confidence intervals; n = 15,713 units from 58 mice. c, Relationship between time-to-first-spike and mean firing rate for a given area, either in response to the flash stimulus or during the inter-trial interval (‘spontaneous’). d, P values for pairwise comparisons of receptive field size between areas. Colour scale is the same as in a. e, P values for pairwise comparisons of modulation index between areas. Colour scale is the same as in a. f, Distribution of intrinsic timescale across units in each of 8 areas. g, Correlation between mean intrinsic timescale and anatomical hierarchy score. The absence of a significant correlation is inconsistent with the findings from ref. 13, in which it was shown that intrinsic timescale increases with hierarchical level in primates. This discrepancy may stem from differences between mouse and primate neocortex, or from the fact that the areas we recorded do not span the full range of the mouse cortical hierarchy. In addition, it is known that standard exponential fitting procedures produce biased and unreliable timescale estimates, which may account for the null result we observed91. h, P values for pairwise comparisons of response decay timescales between areas. Colour scale is the same as in a. i, Distribution of overall firing rates for all units in each area. j, Correlation between mean firing rate and anatomical hierarchy score. k, Relationship between change modulation index and anatomical hierarchy score, grouped by hit and miss trials. l, Relationship between pre-change response and anatomical hierarchy score, grouped by active and passive trials. m, Relationship between change response and anatomical hierarchy score, grouped by active and passive trials. n, Relationship between baseline firing rate and anatomical hierarchy score, grouped by active and passive trials. o, Decoder accuracy as a function of the number of neurons used for decoding, averaged across all brain regions and behaviour sessions. p, Decoder accuracy for each brain region (mean ± s.e.m., averaged across sessions) is not correlated with the anatomical hierarchy score. rP, Pearson correlation coefficient; rS, Spearman’s rank correlation coefficient.
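The pairwise statistics in panels a, d, e and h follow a common pattern: a two-sided Wilcoxon rank-sum test for every pair of areas, with Benjamini-Hochberg false discovery rate correction across all pairs. The sketch below illustrates this pattern on synthetic per-unit latencies; the area names and data are placeholders, not the actual Allen Brain Observatory measurements.

```python
from itertools import combinations

import numpy as np
from scipy.stats import ranksums


def bh_correct(pvals):
    """Benjamini-Hochberg adjusted p-values (FDR correction)."""
    p = np.asarray(pvals, dtype=float)
    n = len(p)
    order = np.argsort(p)
    # scale each sorted p-value by n / rank ...
    scaled = p[order] * n / (np.arange(n) + 1)
    # ... then enforce monotonicity from the largest rank downwards
    scaled = np.minimum.accumulate(scaled[::-1])[::-1]
    adjusted = np.empty(n)
    adjusted[order] = np.minimum(scaled, 1.0)
    return adjusted


rng = np.random.default_rng(0)
# synthetic per-unit times to first spike (ms) for three hypothetical areas
areas = {
    "LGd": rng.normal(40, 5, 200),
    "V1": rng.normal(50, 5, 200),
    "LM": rng.normal(55, 5, 200),
}

pairs = list(combinations(areas, 2))
raw_p = [ranksums(areas[a], areas[b]).pvalue for a, b in pairs]
adj_p = bh_correct(raw_p)
for (a, b), p in zip(pairs, adj_p):
    print(f"{a} vs {b}: adjusted p = {p:.2e}")
```

In the real analysis the inputs would be per-unit metrics (time to first spike, receptive field size, modulation index, response decay timescale) pooled across mice for each area.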

Extended Data Fig. 10 Layer-wise analysis.

a, Distribution of unit depths by area. 0 = surface, 1 = white matter. Normalized depth is measured along lines normal to the cortical surface (‘cortical streamlines’), rather than as distance along the probe. b, Time-to-first-spike, receptive field area, modulation index, and response decay timescale analysed separately for each cortical layer. Colours are the same as those used in Fig. 4. Error bars represent mean ± 95% bootstrap confidence intervals. On average, in comparison to deep layers (5 and 6), superficial layers (2/3 and 4) had an earlier time to first spike (2.59 ms difference, P = 2.7 × 10−19, two-sided Wilcoxon rank-sum test), smaller receptive fields (109° difference, P = 1.1 × 10−33), higher modulation index (0.09 MI difference, P = 5.0 × 10−23), and faster response decay timescale (6.6 ms difference, P = 3.8 × 10−33). The presence of slightly earlier spikes in L2/3 than L4 of V1 is probably due to the existence of direct connections from LGN to L2/3 of this area92. rP, Pearson correlation coefficient; rS, Spearman’s rank correlation coefficient. c, Average number of sharp peak pairs for each area and layer combination. Units in each area are partitioned into superficial (layers 2–4) and deep (layers 5–6) groups. d, Directionality score (averaged across mice) as an indicator of feedforward and feedback asymmetry. Areas are ordered by hierarchy and layers are arranged from superficial to deep. e, Directionality score based on the average within-layer and between-layer distributions in d; superficial layers tend to drive deep layers within a cortical area.
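The error bars in panel b are bootstrap 95% confidence intervals of the mean: resample units with replacement many times and take the 2.5th and 97.5th percentiles of the resampled means. A minimal sketch, using synthetic latencies in place of the real per-layer metrics:

```python
import numpy as np


def bootstrap_ci(values, n_boot=10_000, ci=95, seed=0):
    """Percentile bootstrap confidence interval for the mean of `values`."""
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    # resample unit indices with replacement, n_boot times
    idx = rng.integers(0, len(values), size=(n_boot, len(values)))
    means = values[idx].mean(axis=1)
    lo, hi = np.percentile(means, [(100 - ci) / 2, 100 - (100 - ci) / 2])
    return lo, hi


rng = np.random.default_rng(1)
latencies = rng.normal(50, 10, 300)  # synthetic per-unit latencies (ms)
lo, hi = bootstrap_ci(latencies)
print(f"mean = {latencies.mean():.1f} ms, 95% CI = [{lo:.1f}, {hi:.1f}]")
```

The same resampling would be applied to each area-by-layer group of units, so that the interval reflects variability across units rather than an assumed parametric distribution.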


Cite this article

Siegle, J.H., Jia, X., Durand, S. et al. Survey of spiking in the mouse visual system reveals functional hierarchy. Nature 592, 86–92 (2021).
