Abstract
To understand how the brain processes sensory information to guide behavior, we must know how stimulus representations are transformed throughout the visual cortex. Here we report an open, large-scale physiological survey of activity in the awake mouse visual cortex: the Allen Brain Observatory Visual Coding dataset. This publicly available dataset includes the cortical activity of nearly 60,000 neurons from six visual areas, four layers, and 12 transgenic mouse lines in a total of 243 adult mice, in response to a systematic set of visual stimuli. We classify neurons on the basis of joint reliabilities to multiple stimuli and validate this functional classification with models of visual responses. While most classes are characterized by responses to specific subsets of the stimuli, the largest class is not reliably responsive to any of the stimuli and becomes progressively larger in higher visual areas. These classes reveal a functional organization wherein putative dorsal areas show specialization for visual motion signals.
Data availability
This is an openly available dataset, accessible via a dedicated web portal (http://observatory.brain-map.org/visualcoding), and a Python-based API, the AllenSDK (http://alleninstitute.github.io/AllenSDK).
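For programmatic access, the sketch below shows one way to pull a single imaging session with the AllenSDK's BrainObservatoryCache; the area, Cre line, and stimulus filters are illustrative choices rather than a prescribed workflow, and the call downloads data on first use.

```python
# Minimal sketch of retrieving one Visual Coding session with the AllenSDK
# (pip install allensdk); filters and the chosen stimulus are illustrative.
from allensdk.core.brain_observatory_cache import BrainObservatoryCache

boc = BrainObservatoryCache(manifest_file='boc/manifest.json')

# Experiment containers for one visual area and Cre line
containers = boc.get_experiment_containers(
    targeted_structures=['VISp'],
    cre_lines=['Cux2-CreERT2'],
)

# One imaging session from the first container that includes drifting gratings
exps = boc.get_ophys_experiments(
    experiment_container_ids=[containers[0]['id']],
    stimuli=['drifting_gratings'],
)
data_set = boc.get_ophys_experiment_data(exps[0]['id'])

time, dff = data_set.get_dff_traces()                          # dF/F traces, (n_cells, n_frames)
stim_table = data_set.get_stimulus_table('drifting_gratings')  # one row per grating trial
print(dff.shape, stim_table.head())
```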
Code availability
Code for analyses presented in this paper is available at https://github.com/alleninstitute/visual_coding_2p_analysis.
Acknowledgements
We thank the Animal Care, Transgenic Colony Management, and Lab Animal Services for mouse husbandry. We thank Z. J. Huang of Cold Spring Harbor Laboratory for use of the Fezf2-CreER line. We thank D. Denman, J. Siegle, Y. Billeh, and A. Arkhipov for critical feedback on the manuscript. This work was supported by the Allen Institute and in part by NSF DMS-1514743 (E.S.B.), the Falconwood Foundation (C.K.), the Center for Brains, Minds & Machines funded by NSF Science and Technology Center Award CCF-1231216 (C.K.), the Natural Sciences and Engineering Research Council of Canada (S.J.), NIH grant DP5OD009145 (D.W.), NSF CAREER Award DMS-1252624 (D.W.), Simons Investigator Award in Mathematical Modeling of Living Systems (D.W.), and NIH grant 1R01EB026908-01 (M.A.B., D.W.). We thank A. Jones for providing the critical environment that enabled our large-scale team effort. We thank the Allen Institute founder, Paul G. Allen, for his vision, encouragement, and support.
Author information
Contributions
S.E.J.d.V., M.A.B., K.R., M.G., T.K., S.M., S.O., J.W., H.Z., C.D., L.N., A.B., J.W.P., R.C.R. and C.K. conceived of and designed the experiment. J.A.L., T.K., P.H., A.L., C.S., D.S. and C.F. built and maintained the hardware. S.E.J.d.V., J.A.L., M.A.B., G.K.O., D.F., N.C., L.K., W.W., D.W., R.V., C.B., B.B., T.D., J.G., T.G., S.J., N.K., C.L., F. Lee, F. Long, J.P., N.S., D.M.W., J.Z. and L.N. developed algorithms and software, including the SDK and website. K.R., N. Berbesque, N. Bowles, S.C., L.C., A.C., S.D.C., M.E., N.G., F.G., R.H., L.H., U.K., J.L., J.D.L., R.L., E.L., L.L., J.L., K.M., T.N., M.R., S.S., C.W. and A.W. collected data. J.A.L., P.A.G., S.E.J.d.V. and M.A.B. supervised the work. S.E.J.d.V., J.A.L., M.A.B., G.K.O., M.O., N.C., P.L., D.M., J.S., E.S.B. and R.V. analyzed data. C.T. and W.W. provided project administration. S.E.J.d.V., J.A.L. and M.A.B. wrote the paper with input from P.A.G., G.K.O., M.O., N.C., P.L., D.M., R.C.R., M.G. and C.K.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Extended data
Extended Data Fig. 1 Spontaneous and evoked event magnitude.
a, Pawplot and box plots summarizing the mean event magnitude for neurons during the 5-minute spontaneous activity (mean-luminance gray) stimulus. For a description of the visualization, see Fig. 3. The box shows the quartiles of the data, and the whiskers extend to 1.5 times the interquartile range; points outside this range are shown as outliers. See Extended Data Fig. 3 for sample sizes. b, Pawplot and box plots summarizing the maximum evoked event magnitude of neurons’ responses to drifting gratings. See Extended Data Fig. 3 for sample sizes.
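As a rough illustration of the two quantities summarized in this figure, the sketch below computes, for each neuron, the mean event magnitude over a spontaneous epoch and the maximum trial-averaged response across grating conditions; the array names and trial windows are placeholders, not the paper's exact pipeline.

```python
import numpy as np

def mean_spontaneous_magnitude(events, spont_frames):
    """Mean event magnitude per neuron over the spontaneous (gray-screen) frames."""
    return events[:, spont_frames].mean(axis=1)

def max_evoked_magnitude(events, grating_trials):
    """Maximum trial-averaged response per neuron across grating conditions.

    events: (n_neurons, n_frames) array of extracted events.
    grating_trials: dict mapping condition -> list of (start, end) frame windows.
    """
    cond_means = []
    for windows in grating_trials.values():
        per_trial = [events[:, s:e].mean(axis=1) for s, e in windows]
        cond_means.append(np.mean(per_trial, axis=0))   # average across trials
    return np.max(cond_means, axis=0)                   # best condition per neuron
```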
Extended Data Fig. 2 Response visualizations.
Conventional tuning curves for drifting grating responses for one neuron. a, Direction tuning plotted at the preferred temporal frequency (4 Hz) (mean ± s.e.m. across 15 trials). Dotted line represents the mean response to the blank sweep. b, Temporal frequency tuning plotted at the preferred grating direction (270°) (mean ± s.e.m.). c, Heatmap of the direction and temporal frequency responses for this cell, showing any interaction between direction and temporal frequency. d, All 15 trials at the preferred direction and temporal frequency; the 2-s grating presentation is indicated by pink shading. The mean event magnitude is represented by the intensity of the dot to the right of each trial. e, All trials are clustered, with the strongest response in the center and weaker responses on the outside. f, Clusters are plotted on a “Star plot”. Arms indicate the direction of grating motion and arcs indicate the temporal frequency of the grating, with the lowest in the center and the highest at the outside. Clusters of red dots are located at the intersections of arms and arcs, representing the trial responses at that condition. Tuning curves for static gratings for one neuron. g, Orientation tuning plotted at the preferred spatial frequency (0.04 cpd) for each of the four phases (mean ± s.e.m. across 50 trials). Dotted line represents the mean response to the blank sweep. h, Spatial frequency tuning plotted at the preferred orientation (90°) for each of the four phases (mean ± s.e.m.). i, Heatmap of the orientation and spatial frequency responses at the preferred phase. j, All trials at the preferred orientation, spatial frequency and phase; the 250-ms grating presentation is indicated by pink shading. The mean event magnitude is represented by the intensity of the dot to the right of each trial. k, All trials are clustered, with the strongest response in the center and weaker responses on the outside. l, Clusters are placed on a “Fan plot”. Arms represent the orientation and arcs represent the spatial frequency of the grating. At each intersection, there are four lobes of clustered dots, one for each phase at that grating condition. Responses to natural scenes for one neuron. m, Responses to each image presented (mean ± s.e.m. across 50 trials). Dotted line represents the mean response to the blank sweep. n, All trials of the image that elicited the largest mean response; the 250-ms image presentation is indicated by pink shading. The mean event magnitude is represented by the intensity of the dot to the right of each trial. o, Trials are sorted and p, plotted on a “Corona plot”. Each ray represents the response to one image, with the strongest response on the inside and weaker responses at the outside. Responses to natural movies for one neuron. q, Responses of one neuron to each of 10 trials of the natural movie. r, Responses are plotted on a “Track plot”. Each red ring represents the activity of the cell on one trial, proceeding clockwise from the top of the track. The outer blue track represents the mean response across all ten trials.
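The conventional tuning curve in panel a is simply the per-trial response mean ± s.e.m. at each grating direction, taken at the preferred temporal frequency, with the blank-sweep mean as a baseline. The sketch below illustrates that computation; all trial responses here are synthetic stand-ins for one neuron's data.

```python
import numpy as np

# Synthetic per-trial responses for one neuron: dict keyed by (direction, temporal frequency)
rng = np.random.default_rng(0)
directions = np.arange(0, 360, 45)
temporal_freqs = [1, 2, 4, 8, 15]
responses = {(d, tf): rng.gamma(2.0, 0.05, size=15)      # 15 trials per condition
             for d in directions for tf in temporal_freqs}
blank = rng.gamma(2.0, 0.02, size=30)                     # blank-sweep trials

def tuning_curve(responses, preferred_tf):
    """Mean and s.e.m. of the response at each direction, at the preferred temporal frequency."""
    dirs = sorted(d for d, tf in responses if tf == preferred_tf)
    mean = np.array([responses[d, preferred_tf].mean() for d in dirs])
    sem = np.array([responses[d, preferred_tf].std(ddof=1) /
                    np.sqrt(len(responses[d, preferred_tf])) for d in dirs])
    return np.array(dirs), mean, sem

dirs, mean, sem = tuning_curve(responses, preferred_tf=4)
baseline = blank.mean()   # dotted line: mean response to the blank sweep
```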
Extended Data Fig. 3 Responsiveness to drifting gratings.
a, Table summarizing the number of experiments (expts) and neurons imaged for each Cre line, layer, and area combination for the drifting grating stimulus, and the number and percentage of neurons that were responsive to it. b, Strip plots of the percentage of neurons responsive to the drifting grating stimulus in each experiment.
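The percentages reported in these tables depend on a responsiveness criterion. The sketch below shows the general shape of such a per-experiment summary using an illustrative threshold-and-fraction rule; the paper's exact criterion is defined in the Methods and may differ.

```python
import numpy as np

def percent_responsive(preferred_trials, response_threshold=0.05, min_trial_frac=0.25):
    """Percentage of neurons deemed responsive in one experiment.

    preferred_trials: (n_neurons, n_trials) responses at each neuron's preferred
    condition. A neuron counts as responsive if more than `min_trial_frac` of its
    trials exceed `response_threshold` (illustrative rule only).
    """
    frac_above = (preferred_trials > response_threshold).mean(axis=1)
    return 100.0 * (frac_above > min_trial_frac).mean()

rng = np.random.default_rng(1)
example = rng.gamma(1.5, 0.03, size=(200, 15))   # 200 neurons x 15 trials, synthetic
print(percent_responsive(example))
```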
Extended Data Fig. 4 Responsiveness to static gratings.
a, Table summarizing the number of experiments and neurons imaged for each Cre line, layer, and area combination for the static grating stimulus, and the number and percentage of neurons that were responsive to it. b, Strip plots of the percentage of neurons responsive to the static grating stimulus in each experiment.
Extended Data Fig. 5 Responsiveness to locally sparse noise.
a, Table summarizing the number of experiments (expts) and neurons imaged for each Cre line, layer, and area combination for the locally sparse noise stimulus, and the number and percentage of neurons that were responsive to it. b, Strip plots of the percentage of neurons responsive to the locally sparse noise stimulus in each experiment.
Extended Data Fig. 6 Responsiveness to natural scenes.
a, Table summarizing the number of experiments (expts) and neurons imaged for each Cre line, layer, and area combination for the natural scenes stimulus, and the number and percentage of neurons that were responsive to it. b, Strip plots of the percentage of neurons responsive to the natural scenes stimulus in each experiment.
Extended Data Fig. 7 Responsiveness to natural movies.
a, Table summarizing the number of experiments (expts) and neurons imaged for each Cre line, layer, and area combination for the natural movie stimuli, and the number and percentage of neurons that were responsive to any of the natural movie stimuli. b, Strip plots of the percentage of neurons responsive to the natural movie stimuli in each experiment.
Extended Data Fig. 8 Populations for running correlation analysis.
Table summarizing the number of experiments and neurons, for each Cre line, layer, and area combination, included in the running correlation analysis. These are from sessions in which the mouse was running between 20% and 80% of the time.
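The sketch below illustrates the inclusion rule described here (20–80% of the session spent running) together with a per-neuron running correlation; the speed threshold and the synthetic traces are placeholders rather than the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
speed = rng.exponential(2.0, size=10_000)             # running speed per frame (cm/s), synthetic
activity = rng.gamma(2.0, 0.05, size=(50, 10_000))    # 50 neurons x frames, synthetic

running_fraction = (speed > 1.0).mean()               # fraction of frames spent running
if 0.2 <= running_fraction <= 0.8:                    # session included: running 20-80% of the time
    # Pearson correlation of each neuron's activity with running speed
    running_corr = np.array([np.corrcoef(a, speed)[0, 1] for a in activity])
    print(running_corr.round(3))
```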
Extended Data Fig. 9 Populations for wavelet model analysis.
Table summarizing the number of experiments and neurons, for each Cre line, layer, and area combination, for which wavelet models were fit. Neurons had to be present in all three imaging sessions to be included.
Supplementary information
Supplementary Information
Supplementary Figs. 1–24.
Supplementary Table 1
Table of transgenic lines. Description of transgenic driver and reporter lines used in this study.
About this article
Cite this article
de Vries, S.E.J., Lecoq, J.A., Buice, M.A. et al. A large-scale standardized physiological survey reveals functional organization of the mouse visual cortex. Nat Neurosci 23, 138–151 (2020). https://doi.org/10.1038/s41593-019-0550-9