Triple dissociation of visual, auditory and motor processing in mouse primary visual cortex

Abstract

Primary sensory cortices respond to crossmodal stimuli—for example, auditory responses are found in primary visual cortex (V1). However, it remains unclear whether these responses reflect sensory inputs or behavioral modulation through sound-evoked body movement. We address this controversy by showing that sound-evoked activity in V1 of awake mice can be dissociated into auditory and behavioral components with distinct spatiotemporal profiles. The auditory component began at approximately 27 ms, was found in superficial and deep layers and originated from auditory cortex. Sound-evoked orofacial movements correlated with V1 neural activity starting at approximately 80–100 ms and explained auditory frequency tuning. Visual, auditory and motor activity were expressed by different laminar profiles and largely segregated subsets of neuronal populations. During simultaneous audiovisual stimulation, visual representations remained dissociable from auditory-related and motor-related activity. This three-fold dissociability of auditory, motor and visual processing is central to understanding how distinct inputs to visual cortex interact to support vision.


Fig. 1: Task-dependent recruitment of sound-evoked activity in visual cortex upon behavioral relevance.
Fig. 2: Frequency-tuned activity in visual cortex correlates with sound and sound-evoked movement.
Fig. 3: Temporal and spatial dissociation of auditory-related and motor-related activity in auditory and visual cortex.
Fig. 4: Stimulus-evoked firing activity results from distinct sensory and motor activity profiles across layers of visual cortex.
Fig. 5: Muscimol in auditory cortex impairs auditory change detection and early evoked activity in visual cortex.
Fig. 6: Preserved orientation coding in V1 during multisensory trials showing auditory behavioral dominance.


Data availability

All behavioral and neural data related to this study are openly available on FigShare (https://doi.org/10.21942/uva.24593031 and https://figshare.com/articles/dataset/oudelohuis-et-al-2024-natneurosci-data_rar/24593031). Source data are provided with this paper.

Code availability

All code related to this study is publicly available on GitHub (https://github.com/matthijsoudelohuis/oudelohuis-et-al-2024-natneurosci) and on Zenodo (https://doi.org/10.5281/zenodo.10160778 and https://zenodo.org/records/10160779).

References

1. Kayser, C. & Logothetis, N. K. Do early sensory cortices integrate cross-modal information? Brain Struct. Funct. 212, 121–132 (2007).
2. Meijer, G. T., Mertens, P. E. C., Pennartz, C. M. A., Olcese, U. & Lansink, C. S. The circuit architecture of cortical multisensory processing: distinct functions jointly operating within a common anatomical network. Prog. Neurobiol. 174, 1–15 (2019).
3. Stringer, C. et al. Spontaneous behaviors drive multidimensional, brainwide activity. Science 364, 255 (2019).
4. Ghazanfar, A. A. & Schroeder, C. E. Is neocortex essentially multisensory? Trends Cogn. Sci. 10, 278–285 (2006).
5. Fetsch, C. R., DeAngelis, G. C. & Angelaki, D. E. Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons. Nat. Rev. Neurosci. 14, 429–442 (2013).
6. Pennartz, C. M. The Brain's Representational Power: On Consciousness and the Integration of Modalities (MIT Press, 2015).
7. Pennartz, C. M. A. Identification and integration of sensory modalities: neural basis and relation to consciousness. Conscious. Cogn. 18, 718–739 (2009).
8. Budinger, E., Heil, P., Hess, A. & Scheich, H. Multisensory processing via early cortical stages: connections of the primary auditory cortical field with other sensory systems. Neuroscience 143, 1065–1083 (2006).
9. Campi, K. L., Bales, K. L., Grunewald, R. & Krubitzer, L. Connections of auditory and visual cortex in the prairie vole (Microtus ochrogaster): evidence for multisensory processing in primary sensory areas. Cereb. Cortex 20, 89–108 (2010).
10. Falchier, A., Clavagnier, S., Barone, P. & Kennedy, H. Anatomical evidence of multimodal integration in primate striate cortex. J. Neurosci. 22, 5749–5759 (2002).
11. Miller, M. W. & Vogt, B. A. Direct connections of rat visual cortex with sensory, motor, and association cortices. J. Comp. Neurol. 226, 184–202 (1984).
12. Paperna, T. & Malach, R. Patterns of sensory intermodality relationships in the cerebral cortex of the rat. J. Comp. Neurol. 308, 432–456 (1991).
13. Ibrahim, L. A. et al. Cross-modality sharpening of visual cortical processing through layer-1-mediated inhibition and disinhibition. Neuron 89, 1031–1045 (2016).
14. Iurilli, G. et al. Sound-driven synaptic inhibition in primary visual cortex. Neuron 73, 814–828 (2012).
15. Mesik, L., Huang, J. J., Zhang, L. I. & Tao, H. W. Sensory- and motor-related responses of layer 1 neurons in the mouse visual cortex. J. Neurosci. 39, 10060–10070 (2019).
16. Rockland, K. S. & Ojima, H. Multisensory convergence in calcarine visual areas in macaque monkey. Int. J. Psychophysiol. 50, 19–26 (2003).
17. Meijer, G. T., Montijn, J. S., Pennartz, C. M. A. & Lansink, C. S. Audio-visual modulation in mouse V1 depends on cross-modal stimulus configuration and congruency. J. Neurosci. 37, 8783–8796 (2017).
18. Fishman, M. C. & Michael, C. R. Integration of auditory information in the cat's visual cortex. Vis. Res. 13, 1415–1419 (1973).
19. Morrell, F. Visual system's view of acoustic space. Nature 238, 44–46 (1972).
20. Knöpfel, T. et al. Audio-visual experience strengthens multisensory assemblies in adult mouse visual cortex. Nat. Commun. 10, 5684 (2019).
21. Fu, Y. et al. A cortical circuit for gain control by behavioral state. Cell 156, 1139–1152 (2014).
22. Niell, C. M. & Stryker, M. P. Modulation of visual responses by behavioral state in mouse visual cortex. Neuron 65, 472–479 (2010).
23. Vinck, M., Batista-Brito, R., Knoblich, U. & Cardin, J. A. Arousal and locomotion make distinct contributions to cortical activity patterns and visual encoding. Neuron 86, 740–754 (2015).
24. Bouvier, G., Senzai, Y. & Scanziani, M. Head movements control the activity of primary visual cortex in a luminance-dependent manner. Neuron 108, 500–511 (2020).
25. Guitchounts, G., Masís, J., Wolff, S. B. E. & Cox, D. Encoding of 3D head orienting movements in the primary visual cortex. Neuron 108, 512–525 (2020).
26. Musall, S., Kaufman, M. T., Juavinett, A. L., Gluf, S. & Churchland, A. K. Single-trial neural dynamics are dominated by richly varied movements. Nat. Neurosci. 22, 1677–1686 (2019).
27. Zagha, E. et al. The importance of accounting for movement when relating neuronal activity to sensory and cognitive processes. J. Neurosci. 42, 1375–1382 (2022).
28. Bimbard, C. et al. Behavioral origin of sound-evoked activity in mouse visual cortex. Nat. Neurosci. 26, 251–258 (2023).
29. Spinelli, D. N., Starr, A. & Barrett, T. W. Auditory specificity in unit recordings from cat's visual cortex. Exp. Neurol. 22, 75–84 (1968).
30. Niell, C. M. & Stryker, M. P. Highly selective receptive fields in mouse visual cortex. J. Neurosci. 28, 7520–7536 (2008).
31. Williams, A. M., Angeloni, C. F. & Geffen, M. N. Sound improves neuronal encoding of visual stimuli in mouse primary visual cortex. J. Neurosci. 43, 2885–2906 (2023).
32. Henschke, J. U., Noesselt, T., Scheich, H. & Budinger, E. Possible anatomical pathways for short-latency multisensory integration processes in primary sensory cortices. Brain Struct. Funct. 220, 955–977 (2015).
33. Schnabel, U. H. et al. Figure-ground perception in the awake mouse and neuronal activity elicited by figure-ground stimuli in primary visual cortex. Sci. Rep. 8, 17800 (2018).
34. Park, I. M., Meister, M. L. R., Huk, A. C. & Pillow, J. W. Encoding and decoding in parietal cortex during sensorimotor decision-making. Nat. Neurosci. 17, 1395–1403 (2014).
35. Steinmetz, N. A., Zatka-Haas, P., Carandini, M. & Harris, K. D. Distributed coding of choice, action and engagement across the mouse brain. Nature 576, 266–273 (2019).
36. Xiong, X. R. et al. Auditory cortex controls sound-driven innate defense behaviour through corticofugal projections to inferior colliculus. Nat. Commun. 6, 7224 (2015).
37. Huffman, R. F. & Henson, O. W. The descending auditory pathway and acousticomotor systems: connections with the inferior colliculus. Brain Res. Rev. 15, 295–323 (1990).
38. Meijer, G. T., Pie, J. L., Dolman, T. L., Pennartz, C. M. A. & Lansink, C. S. Audiovisual integration enhances stimulus detection performance in mice. Front. Behav. Neurosci. 12, 231 (2018).
39. Song, Y.-H. et al. A neural circuit for auditory dominance over visual perception. Neuron 93, 940–954 (2017).
40. Garner, A. R. & Keller, G. B. A cortical circuit for audio-visual predictions. Nat. Neurosci. 25, 98–105 (2022).
41. Sakata, S. & Harris, K. D. Laminar structure of spontaneous and sensory-evoked population activity in auditory cortex. Neuron 64, 404–418 (2009).
42. Koch, M. & Schnitzler, H. U. The acoustic startle response in rats—circuits mediating evocation, inhibition and potentiation. Behav. Brain Res. 89, 35–49 (1997).
43. Miller, E. A. et al. Robust and replicable measurement for prepulse inhibition of the acoustic startle response. Mol. Psychiatry 26, 1909–1927 (2021).
44. Ceballo, S., Piwkowska, Z., Bourg, J., Daret, A. & Bathellier, B. Targeted cortical manipulation of auditory perception. Neuron 104, 1168–1179 (2019).
45. Budinger, E. & Scheich, H. Anatomical connections suitable for the direct processing of neuronal information of different modalities via the rodent primary auditory cortex. Hear. Res. 258, 16–27 (2009).
46. Oh, S. W. et al. A mesoscale connectome of the mouse brain. Nature 508, 207–214 (2014).
47. Chou, X. et al. Contextual and cross-modality modulation of auditory cortical processing through pulvinar mediated suppression. eLife 9, e54157 (2020).
48. Salkoff, D. B., Zagha, E., McCarthy, E. & McCormick, D. A. Movement and performance explain widespread cortical activity in a visual detection task. Cereb. Cortex 30, 421–437 (2020).
49. Leinweber, M., Ward, D. R., Sobczak, J. M., Attinger, A. & Keller, G. B. A sensorimotor circuit in mouse cortex for visual flow predictions. Neuron 95, 1420–1432 (2017).
50. Pennartz, C. M. A., Dora, S., Muckli, L. & Lorteije, J. A. M. Towards a unified view on pathways and functions of neural recurrent processing. Trends Neurosci. 42, 589–603 (2019).
51. Schneider, D. M., Nelson, A. & Mooney, R. A synaptic and circuit basis for corollary discharge in the auditory cortex. Nature 513, 189–194 (2014).
52. Minamimoto, T. & Kimura, M. Participation of the thalamic CM-Pf complex in attentional orienting. J. Neurophysiol. 87, 3090–3101 (2002).
53. Van der Werf, Y. D., Witter, M. P. & Groenewegen, H. J. The intralaminar and midline nuclei of the thalamus. Anatomical and functional evidence for participation in processes of arousal and awareness. Brain Res. Rev. 39, 107–140 (2002).
54. Shimaoka, D., Harris, K. D. & Carandini, M. Effects of arousal on mouse sensory cortex depend on modality. Cell Rep. 22, 3160–3167 (2018).
55. Schneider, D. M. & Mooney, R. How movement modulates hearing. Annu. Rev. Neurosci. 41, 553–572 (2018).
56. Crapse, T. B. & Sommer, M. A. Corollary discharge across the animal kingdom. Nat. Rev. Neurosci. 9, 587–600 (2008).
57. Li, J. et al. Primary auditory cortex is required for anticipatory motor response. Cereb. Cortex 27, 3254–3271 (2017).
58. Coen, P., Sit, T. P. H., Wells, M. J., Carandini, M. & Harris, K. D. Mouse frontal cortex mediates additive multisensory decisions. Neuron 111, 2432–2447.e13 (2023).
59. Montijn, J. S., Meijer, G. T., Lansink, C. S. & Pennartz, C. M. A. Population-level neural codes are robust to single-neuron variability from a multidimensional coding perspective. Cell Rep. 16, 2486–2498 (2016).
60. Bizley, J. K., Nodal, F. R., Bajo, V. M., Nelken, I. & King, A. J. Physiological and anatomical evidence for multisensory interactions in auditory cortex. Cereb. Cortex 17, 2172–2189 (2007).
61. Stein, B. E. & Stanford, T. R. Multisensory integration: current issues from the perspective of the single neuron. Nat. Rev. Neurosci. 9, 255–266 (2008).
62. Nikbakht, N., Tafreshiha, A., Zoccolan, D. & Diamond, M. E. Supralinear and supramodal integration of visual and tactile signals in rats: psychophysics and neuronal mechanisms. Neuron 97, 626–639 (2018).
63. Meijer, G. T. et al. Neural correlates of multisensory detection behavior: comparison of primary and higher-order visual cortex. Cell Rep. 31, 107636 (2020).
64. Oude Lohuis, M. N. et al. Multisensory task demands temporally extend the causal requirement for visual cortex in perception. Nat. Commun. 13, 2864 (2022).
65. Petro, L. S., Paton, A. T. & Muckli, L. Contextual modulation of primary visual cortex by auditory signals. Philos. Trans. R. Soc. B Biol. Sci. 372, 20160104 (2017).
66. Oude Lohuis, M. N., Marchesi, P., Pennartz, C. M. A. & Olcese, U. Functional (ir)relevance of posterior parietal cortex during audiovisual change detection. J. Neurosci. 42, 5229–5245 (2022).
67. Shepard, R. N. Circularity in judgments of relative pitch. J. Acoust. Soc. Am. 36, 2346–2353 (1964).
68. Henry, K. R. & Lepkowski, C. M. Evoked potential correlates of genetic progressive hearing loss: age-related changes from the ear to the inferior colliculus of C57BL/6 and CBA/J mice. Acta Otolaryngol. 86, 366–374 (1978).
69. Spongr, V. P., Flood, D. G., Frisina, R. D. & Salvi, R. J. Quantitative measures of hair cell loss in CBA and C57BL/6 mice throughout their life spans. J. Acoust. Soc. Am. 101, 3546–3553 (1997).
70. Lauer, S. M., Schneeweiß, U., Brecht, M. & Ray, S. Visualization of cortical modules in flattened mammalian cortices. J. Vis. Exp. 131, 56992 (2018).
71. Gămănuţ, R. et al. The mouse cortical connectome characterized by an ultra-dense cortical graph maintains specificity by distinct connectivity profiles. Neuron 97, 698–715.e10 (2018).
72. Mathis, A. et al. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 21, 1281–1289 (2018).
73. Rossant, C. et al. Spike sorting for large, dense electrode arrays. Nat. Neurosci. 19, 634–641 (2016).
74. Schmitzer-Torbert, N., Jackson, J., Henze, D., Harris, K. & Redish, A. D. Quantitative measures of cluster quality for use in extracellular recordings. Neuroscience 131, 1–11 (2005).
75. Vinck, M. et al. Cell-type and state-dependent synchronization among rodent somatosensory, visual, perirhinal cortex, and hippocampus CA1. Front. Syst. Neurosci. 9, 187 (2016).
76. Bos, J. J. et al. Perirhinal firing patterns are sustained across large spatial segments of the task environment. Nat. Commun. 8, 15602 (2017).
77. Montijn, J. S. et al. A parameter-free statistical test for neuronal responsiveness. eLife 10, e71969 (2021).
78. DeWeese, M. R., Wehr, M. & Zador, A. M. Binary spiking in auditory cortex. J. Neurosci. 23, 7940–7949 (2003).
79. Green, D. M. & Swets, J. A. Signal Detection Theory and Psychophysics (John Wiley, 1966).
80. Ringach, D. L., Shapley, R. M. & Hawken, M. J. Orientation selectivity in macaque V1: diversity and laminar dependence. J. Neurosci. 22, 5639–5651 (2002).
81. Vaknin, G., DiScenna, P. G. & Teyler, T. J. A method for calculating current source density (CSD) analysis without resorting to recording sites outside the sampling volume. J. Neurosci. Methods 24, 131–135 (1988).
82. Logothetis, N. K., Kayser, C. & Oeltermann, A. In vivo measurement of cortical impedance spectrum in monkeys: implications for signal propagation. Neuron 55, 809–823 (2007).
83. Senzai, Y., Fernandez-Ruiz, A. & Buzsáki, G. Layer-specific physiological features and interlaminar interactions in the primary visual cortex of the mouse. Neuron 101, 500–513 (2019).
84. McGinley, M. J., David, S. V. & McCormick, D. A. Cortical membrane potential signature of optimal states for sensory signal detection. Neuron 87, 179–192 (2015).
85. Friedman, J., Hastie, T. & Tibshirani, R. Regularization paths for generalized linear models via coordinate descent. J. Stat. Softw. 33, 1–22 (2010).
86. Runyan, C. A., Piasini, E., Panzeri, S. & Harvey, C. D. Distinct timescales of population coding across cortex. Nature 548, 92–96 (2017).
87. Pedregosa, F. et al. Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825–2830 (2011).
88. Aarts, E., Verhage, M., Veenvliet, J. V., Dolan, C. V. & van der Sluis, S. A solution to dependency: using multilevel analysis to accommodate nested data. Nat. Neurosci. 17, 491–496 (2014).
89. Resulaj, A., Ruediger, S., Olsen, S. R. & Scanziani, M. First spikes in visual cortex enable perceptual discrimination. eLife 7, e34044 (2018).
90. Zatka-Haas, P., Steinmetz, N. A., Carandini, M. & Harris, K. D. Sensory coding and the causal impact of mouse cortex in a visual decision. eLife 10, e63163 (2021).
91. Green, P. & MacLeod, C. J. SIMR: an R package for power analysis of generalized linear mixed models by simulation. Methods Ecol. Evol. 7, 493–498 (2016).
92. Charan, J. & Biswas, T. How to calculate sample size for different study designs in medical research. Indian J. Psychol. Med. 35, 121–126 (2013).
93. Douglas, R. J. & Martin, K. A. C. Neuronal circuits of the neocortex. Annu. Rev. Neurosci. 27, 419–451 (2004).
94. Harris, K. D. & Shepherd, G. M. G. The neocortical circuit: themes and variations. Nat. Neurosci. 18, 170–181 (2015).
95. Burr, D., Banks, M. S. & Morrone, M. C. Auditory dominance over vision in the perception of interval duration. Exp. Brain Res. 198, 49 (2009).
96. Repp, B. H. & Penel, A. Auditory dominance in temporal processing: new evidence from synchronization with simultaneous visual and auditory sequences. J. Exp. Psychol. Hum. Percept. Perform. 28, 1085–1099 (2002).
97. Shams, L., Kamitani, Y. & Shimojo, S. Illusions: what you see is what you hear. Nature 408, 788 (2000).


Acknowledgements

We thank J. Montijn and G. Meijer for useful comments on the manuscript; D. Sridharan for providing code for the multi-alternative detection model; C. Rossant, members of the Cortex Lab (UCL) and contributors for Klusta and Phy spike sorting software; P. Frumento for advice on statistical analysis; and A. Mantzafou, K. Gawor and A. Cervàn Canton for assistance in behavioral training. This work was supported by the European Union’s Horizon 2020 Framework Program for Research and Innovation under the Specific Grant Agreement 945539 (Human Brain Project SGA3) to C.M.A.P. and by the FLAG-ERA JTC 2019 project DOMINO (co-financed by NWO) to U.O.

Author information


Contributions

Conceptualization: M.O.L., U.O. and C.M.A.P. Methodology, formal analysis and visualization: M.O.L. Additional analysis: P.M. Writing: M.O.L., U.O. and C.M.A.P. Supervision and funding: U.O. and C.M.A.P.

Corresponding author

Correspondence to Cyriel M. A. Pennartz.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review information

Nature Neuroscience thanks Brice Bathellier, Jennifer Bizley, Simon Musall and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Extended data

Extended Data Fig. 1 Details of auditory and visual stimulus design.

(a) Each auditory stimulus was composed of five pure tones at harmonic frequencies (octaves below and above the other tones). The weight with which each tone contributed to the overall stimulus was taken from a Gaussian distribution across all possible tones. The example stimulus A in pink is composed of a center tone of 2^13.25 Hz (highest weight), two lower harmonics (at 2^11.25 and 2^12.25 Hz) and two higher harmonics (at 2^14.25 and 2^15.25 Hz). Tones followed scientific pitch and are expressed as powers of two: 2^13 corresponds to 8.192 kHz, and to C9 in scientific pitch notation. During an auditory trial, the stimulus changed to a stimulus of five new harmonic tones with different weights (for example, stimulus A to B). (b) The left polar diagram shows the circular arrangement of auditory stimuli. For each cardinal direction, the insets show the tonal weights associated with these stimuli. Note how continually increasing the center tone frequency ultimately results in a circular shift back to the starting stimulus. This circularity can also be seen in panel a: going up or down half an octave from stimulus A always results in stimulus B. The auditory stimulus set is therefore circular. This feature is exploited in the Shepard illusion of an eternally rising or falling pitch. However, our stimuli were static, so the illusory effect of continuously increasing or decreasing pitch was absent. The only illusory component was that a half-octave change could be experienced as either an increase or a decrease in pitch. This circular design of auditory stimuli mirrors the visual stimulus set (right part) with drifting gratings of all orientations. The amount of frequency change (expressed in partial octaves, red) or orientation change (expressed in degrees, blue) determined the saliency of auditory and visual changes. (c) Example stimuli during three consecutive trials. The upper spectrogram over time includes two auditory change trials. Auditory stimuli continued to be presented until the next auditory change, which could be identified based on a difference in spectral content and experienced as a change in pitch. The example shows an easy auditory trial (salient change; stimulus A to B, half an octave) followed later by a difficult trial (subtle change; 1/32 of an octave). The lower schematic shows visual orientation over time, including a visual trial. Note that the gratings were continuously drifting in the direction orthogonal to the grating orientation. An audiovisual trial would consist of a simultaneous change in both modalities (not shown). Note that this is only a schematic depiction; hence, time is depicted in arbitrary units.
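To make the stimulus construction concrete, below is a minimal Python sketch of a circular, Gaussian-weighted Shepard stimulus of this kind. The sampling rate, envelope center and width, and all names are illustrative assumptions rather than the published parameters.

```python
import numpy as np

FS = 192_000        # sampling rate in Hz (illustrative, not the published value)
ENV_CENTER = 13.0   # fixed Gaussian envelope center on the log2-frequency axis
ENV_SIGMA = 1.0     # envelope width in octaves (illustrative)

def shepard_stimulus(pitch_class, dur=1.0, n_partials=5):
    """Five octave-spaced pure tones whose weights follow a fixed Gaussian
    over log2(frequency). pitch_class in [0, 1) sets the fractional octave;
    because the envelope is fixed, shifting all partials by a full octave
    approximately reproduces the same stimulus (the circularity of panel b)."""
    t = np.arange(int(FS * dur)) / FS
    k = np.arange(n_partials) - n_partials // 2      # octave offsets: -2..+2
    log2_freqs = ENV_CENTER + k + pitch_class        # log2 of each partial
    weights = np.exp(-0.5 * ((log2_freqs - ENV_CENTER) / ENV_SIGMA) ** 2)
    waves = np.sin(2 * np.pi * (2.0 ** log2_freqs)[:, None] * t)
    sig = (weights[:, None] * waves).sum(axis=0)
    return sig / np.abs(sig).max()                   # normalize to +/-1

stim_a = shepard_stimulus(0.25)  # center tone at 2^13.25 Hz, as in panel a
```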

Extended Data Fig. 2 Sounds evoke instructed and uninstructed orofacial movements.

(a) Sounds evoke brief whisking and eye-twitching movements in NE mice (example session). Upper images show a heatmap of the increase in videoME overlaid on a reference frame. The lower trace shows videoME averaged over auditory trials, with dots highlighting the time points of the upper frames. (b) Same as a, but for an example MST session. Here, auditory trials evoked not only whisking and eye movements (uninstructed) but also continued instrumental licking movements, as mice were rewarded for reporting auditory stimuli. (c) To verify that the increase in motor activity was not merely associated with licking behavior, we continued presenting sensory stimuli but removed the lick spout. Blocks of active trials (with lick spout, left image) and passive trials (without lick spout) were interleaved during a session. (d) The increase in videoME normally seen following visual stimuli (due to report-related licking movements) was absent during passive blocks. By contrast, auditory stimuli continued to evoke orofacial movements during passive blocks, in the absence of licking to a rewarded lick spout. These results are in line with the comparison between cohorts (Fig. 2b), where unrewarded auditory stimuli (but not unrewarded visual stimuli) still evoked orofacial movements. Motor-related confounds are thus important to control for not only in auditory behavioral tasks, but also in naive animals. (e) Same as (a, b), but for auditory trials during passive blocks of an example MST session. Auditory trials continued to evoke uninstructed orofacial movements, though less prolonged ones, owing to the absence of licking movements.

Extended Data Fig. 3 Detailed orofacial movements correlate with frequency-tuned activity.

(a) Normalized firing rate and videoME over time for two example V1 neurons. Top: r = 0.61, UST mouse. Bottom: r = 0.55, MST mouse. (b) Heatmap of z-scored firing rate for simultaneously recorded V1 neurons showing a distribution of selectivity to orientations, but similar tuning to auditory frequency. Left and right are taken from different sessions (both from the MST cohort). (c) To extract more detailed video information, we applied PCA to the frame-to-frame pixel intensity difference (FaceMap; Stringer et al., ref. 3) and extracted the principal components that captured the most dominant movements (video PCs). Most movement was confined to the snout, whisker pad and tongue regions. PC: principal component. (d) Cumulative explained variance (EV) of video motion with increasing numbers of video PCs. Red shows raw EV and blue shows EV normalized to the variance explained by 500 video PCs (the total number of extracted PCs). The inset shows a close-up of the EV for the first 50 video PCs. For subsequent analyses, 34 video PCs were included, as they captured 80% of the normalized explained variance (dotted lines). Computed from 17 excerpts, one video per video-recorded animal. Mean ± SEM. (e) Example traces of the first three PCs during individual trials of different modalities and decisions. Hit trials were associated with motor activity during lick responses and reward consumption. Data from one MST session. The gray line indicates the stimulus change. (f) First three video PCs for an example session showing similar movements following changes in visual grating orientation, but variable movements following different auditory stimuli. V1 could still encode auditory features beyond what is explained by the modulatory effects of orofacial movements. We therefore tested how well we could decode stimulus identity from population spiking activity in V1, and compared this to detailed video analysis. A population decoder (support vector machine, SVM) was trained to discriminate auditory or visual stimulus identity using either the spiking data or the video PC values. (g) Auditory stimulus frequency could be decoded from V1 population activity. Decoding performance of decoders trained to discriminate post-change auditory frequency in single recording sessions. The horizontal dashed line indicates chance level. Line and shading indicate mean and 95% CI. (h) Same as (g), but for decoding auditory frequency from the first 34 video PCs. Auditory stimulus frequency could be decoded from video data. (i) Relationship across sessions between auditory frequency decoding performance using V1 data (x axis) and video data (y axis). Decoding performance was highly variable across sessions and, interestingly, strongly correlated between spiking and motor activity (R = 0.71, F(1,17) = 29.13, p = 4.49*10−5). Sessions with frequency-selective orofacial movements thus also displayed frequency-selective population activity. Furthermore, video decoding outperformed neural decoding (F(1,49) = 8.25, p = 0.006). Dot size scales with the number of simultaneously recorded neurons in that session, and dot color indicates cohort. (j) Same as (g), but for decoders trained to discriminate post-change visual orientation. Visual grating orientation could be decoded from V1 population activity. Above-chance baseline coding results from the fact that gratings jumped between the same stimuli (A/B to C/D and vice versa) and neurons showed persistent selectivity, as seen in Fig. 2i. (k) Same as (j), but decoding visual orientation from the first 34 video PCs. Visual grating orientation could not be decoded from video PCs. (l) Same as (i), but for the relationship between orientation decoding based on V1 activity versus video PCs. Decoding performance was not correlated across sessions (R = 0.15, F(1,28) = 0.67, p = 0.42) and was higher for V1 spikes than for video PCs (F(1,42) = 51.14, p = 9.09*10−9). Although absolute decoding performance from these qualitatively different sources is less meaningful, the dissimilarity between modalities is striking.
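As a concrete illustration of the pipeline in panels c–l, the following sketch reproduces the two steps on placeholder data: FaceMap-style PCA on frame-to-frame motion energy, followed by cross-validated SVM decoding of stimulus identity from either spikes or video PCs. Array shapes, the linear kernel and the 5-fold cross-validation are assumptions made for illustration, not the published settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder face video; the real input is the face-camera recording.
frames = rng.random((200, 64, 64))
motion = np.abs(np.diff(frames, axis=0))                 # frame-to-frame |dI|
motion = motion.reshape(motion.shape[0], -1)             # (n_frames-1, n_pixels)
video_pcs = PCA(n_components=34).fit_transform(motion)   # 34 PCs ~ 80% norm. EV

def decode(X, y):
    """Mean cross-validated accuracy of a linear SVM (hyperparameters assumed)."""
    return cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean()

# Placeholder trial-level data: eight post-change stimulus identities.
y = np.repeat(np.arange(8), 13)           # 104 trials, 13 per stimulus
X_spikes = rng.random((104, 60))          # trial x neuron spike counts
X_video = rng.random((104, 34))           # trial-averaged video PCs
acc_spikes, acc_video = decode(X_spikes, y), decode(X_video, y)
```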

Source data

Extended Data Fig. 4 Similar frequency tuning of primary visual cortical neurons during audiovisual stimulus detection.

(a) To establish whether our findings generalized beyond our change detection task, we trained animals (n = 3) to detect the presence of auditory and visual stimuli (the same stimulus set as in the change detection task) and to discriminate and selectively report the modality, as in the MST version of our main change detection paradigm. Rewards were allocated upon licking the auditory lick spout after the onset of one of eight tones, and upon licking the visual lick spout after the onset of one of eight gratings. (b) Performance in an example session on the detection of auditory stimuli of varying volume (left panel) and visual stimuli of varying contrast (right panel). Note how auditory and visual hit rates increase as a function of volume and contrast, respectively. The behavioral data were fit with the same two-alternative signal detection model as the behavioral data from the change detection task. Behavioral response rates are shown as dots, model fits as lines. (c) Average psychometric fits for each mouse, obtained by averaging the parameters of single-session fits. (d) Raster plot and tuning curve of an example orientation-tuned V1 neuron. Upper panels show firing rate (0–200 ms) in response to eight drifting grating orientations (left) and eight compound Shepard tones with center tones spaced between 8 and 15 kHz (right). Dots and error bars show mean ± SEM across trials (n = 384 trials, ~24 trials per condition). Colored tick marks in the lower raster plots show trial-by-trial spiking. Black tick marks indicate the first lick after the stimulus. Note the classical orientation tuning expected from V1 neurons in response to full-field oriented drifting gratings. Auditory frequency tuning was not significant. (e) Same as (d), but for two V1 neurons from the same session (n = 299 trials) whose auditory responses depended on the frequency components of the auditory stimulus. Note how the neurons are similarly tuned and how their firing rates are also associated with licking behavior. (f) Tuning curves for orientation and frequency for all V1 neurons (individual colored lines) from one session. Note the dissimilarity in orientation tuning, but similarity in frequency tuning. (g) Signal correlations of all significantly orientation-tuned (left) and frequency-tuned (right) neurons. Signal correlations were computed as the Pearson correlation of trial-averaged tuning curves between neuronal pairs. Signal correlation was higher between frequency-tuned neurons than between orientation-tuned neurons (F(1,406) = 9.50, p = 0.0022; n = 148 signal correlations from 23 orientation-tuned V1 neurons, n = 258 from 36 frequency-tuned V1 neurons). The finding that V1 neurons responded to the same frequencies (those associated with motor movement; Fig. 2j; Extended Data Fig. 3b) suggests that variability in motor variables drives this tuning. Violins show the data distribution, circles the median, inner box-plot edges the 25th and 75th percentiles and whiskers 1.5 times the interquartile range. **p < 0.01.
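The signal-correlation metric of panel g can be computed as sketched below. The input array is a placeholder, and the all-pairs convention is a simplification: in the paper, pairs were formed only among simultaneously recorded (within-session) neurons.

```python
import numpy as np

def signal_correlations(tuning):
    """Pairwise Pearson correlations between trial-averaged tuning curves.
    tuning: (n_neurons, n_conditions) array of mean responses per stimulus."""
    r = np.corrcoef(tuning)                      # neuron-by-neuron correlations
    i, j = np.triu_indices(tuning.shape[0], k=1) # unique pairs only
    return r[i, j]                               # one value per neuron pair

rng = np.random.default_rng(1)
freq_tuning = rng.random((36, 8))        # e.g. 36 frequency-tuned neurons x 8 tones
sc = signal_correlations(freq_tuning)    # all-pairs sketch; paper used within-session pairs
```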

Source data

Extended Data Fig. 5 Dissociating sensory and motor coding in example V1 cells using a regression model.

(a) Each heatmap shows the firing rate over time for a subset of trials with each row representing a different trial type, and each column a different source of the firing rate. All data from MST mice. The leftmost column shows the original firing rate. The second column shows the predicted firing rate for the same trials using all predictors in the model. The remaining columns show the predicted firing rate using only a subset of the predictors. For this example neuron, the trial number explained little variability (trial number captured response drift across the session for some other neurons, not shown). Visual predictors explained an early response transient especially in Vmax trials. Auditory predictors captured an early response transient in some auditory trials (set C/D), whereas motor variables (the first 25 video PCs) captured variability across visual and auditory trial types. (b) An example neuron with strong, short-latency auditory-related firing and little motor-related activity. Same as a, but only for auditory trials and with predicted firing based on trial number and visual stimuli omitted. (c) Same as b, but for an example neuron with strong motor-related firing and little auditory-related activity. These examples (b, c) show that the trial-to-trial variability in motor activity and the variability in auditory stimulus type permit dissociation of the origin of sound-evoked activity.
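The logic of these partial predictions can be sketched as follows: fit a single regression over all predictor groups, then predict with only one group's columns active. The paper used regularized regression (ref. 85); the ridge penalty, predictor counts and synthetic data below are assumptions for illustration, not the published model.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
n_bins = 5000
X_vis = rng.random((n_bins, 8))    # visual predictors (e.g. orientation regressors)
X_aud = rng.random((n_bins, 8))    # auditory predictors (e.g. frequency regressors)
X_mot = rng.random((n_bins, 25))   # motor predictors (first 25 video PCs)
X = np.hstack([X_vis, X_aud, X_mot])
y = rng.random(n_bins)             # one neuron's binned firing rate (placeholder)

model = Ridge(alpha=1.0).fit(X, y)

def partial_prediction(model, X, cols):
    """Predict with only one predictor group: zero out all other columns
    while keeping the fitted weights (cf. the columns of panel a)."""
    Xp = np.zeros_like(X)
    Xp[:, cols] = X[:, cols]
    return model.predict(Xp)

motor_cols = np.arange(16, 41)                       # columns holding the video PCs
y_motor = partial_prediction(model, X, motor_cols)   # motor-related component
ev_motor = 1 - np.var(y - y_motor) / np.var(y)       # variance explained by motor
```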

Extended Data Fig. 6 Dissociated visual, auditory and motor-related activity using a regression model: overview of findings.

(a) Explained variance of V1 firing for an increasing number of video PCs. Video PCs beyond the number selected for analysis (n = 34) contributed little to V1 variance. Mean ± SEM. (b) Distribution across neurons of the explained variance of V1 firing rate for each subset of predictors in the regression model. X and y axis positions indicate each neuron's rank order across all V1 neurons in the amount of variance explained by visual, auditory or motor variables. Each dot is a neuron. Left, the absence of a correlation between visual and motor predictors (r = 0.00, p = 0.94, Spearman rank correlation) indicates that the strengths with which vision and behavior modulate V1 firing rates are essentially unrelated. Middle, auditory and visual EV were unrelated across neurons (r = −0.07, p = 0.06), while auditory and motor coding were moderately correlated (r = 0.30, p = 2.52*10−17). Neurons with exclusively visual, auditory or motor coding are colored (motor, green; visual, blue; auditory, red). (c) Same as Fig. 3f, but for each of the task cohorts separately and for auditory trials only. Auditory-related activity was present in all three cohorts. Sound-evoked motor-related activity was larger in the MST cohort, as quantified in Fig. 3g. (d) The absolute sound volume used in a given session (x axis) was not related to the amount of variance in V1 firing rates explained by auditory stimulus variables in the regression model (Spearman rank correlation between sound volume and auditory-related EV, r = −0.045, p = 0.178). Each dot is a neuron. (e) Same as d, but for motor-related variance. No significant correlation was found (Spearman rank correlation between sound volume and motor-related EV, r = 0.048, p = 0.152). The results in panels d and e jointly suggest that the volumes we employed were too low to induce a startle response, which would otherwise have been accompanied by an increase in either auditory- or motor-related EV above a certain volume threshold. (f) Same as Fig. 3g in the main text, but for subsamples of 108 neurons from each cohort, to account for the unequal number of V1 neurons sampled across cohorts (NE: n = 108; UST: n = 112; MST: n = 570). Note how the results are consistent with those presented in the main figure. Center line, box edges and whiskers demarcate the 5th, 25th, 50th, 75th and 95th percentiles of the mean auditory- and motor-related sound-evoked response based on 1,000 subsamples. **p < 0.01. Neurons from all cohorts were combined for panels a, b, d and e.

Source data

Extended Data Fig. 7 Cortical depth estimation in primary visual cortex using physiological markers.

(a) Close-up of a coronal section of V1 showing the electrode track stained with DiI. (b) Example distribution along the probe of spectral power (500 Hz to 5 kHz), indicative of multi-unit activity (MUA). High MUA power is characteristic of L5 (compare with ref. 87). (c) Current source density (CSD) map and LFP traces (black lines) in response to checkerboard stimulation. b and c are from the same example session. Color corresponds to CSD power. (d) Overview of electrode span across layers. Each line is one session (n = 84 sessions). Data from electrodes at depths above 0 or below 1,000 µm were excluded from analyses. (e) Same as b, but for all sessions. Gray lines are individual sessions; the black line is the median. (f) Same as c, but averaged across all sessions.
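For reference, the CSD maps in this and the following figure rest on the standard estimate of CSD as the negative second spatial derivative of the laminar LFP; the sketch below uses the edge-channel duplication of Vaknin et al. (ref. 81). The contact spacing and array sizes are illustrative assumptions.

```python
import numpy as np

def csd(lfp, spacing_m=25e-6):
    """Current source density as the negative second spatial derivative of the
    laminar LFP across equally spaced contacts. Edge channels are duplicated
    (Vaknin et al., ref. 81) so the map spans all recorded depths.
    lfp: (n_channels, n_time) in volts; spacing in meters (illustrative)."""
    padded = np.vstack([lfp[:1], lfp, lfp[-1:]])   # duplicate first/last channel
    return -(padded[2:] - 2 * padded[1:-1] + padded[:-2]) / spacing_m**2

rng = np.random.default_rng(3)
example_map = csd(rng.random((32, 500)))   # 32 contacts x 500 time bins (placeholder)
```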

Extended Data Fig. 8 Early sensory and late motor-related components of current-source density and cell-type spiking profiles in visual cortex.

(a) The current source density (colormap, CSD) and event-related potential (black traces, ERP) for auditory and visual stimulus changes in the same example session (MST mouse). (b) Histogram of z-scored videoME (0–500 ms post-change) across visual and auditory trials of all sessions with LFP recordings in V1 (all cohorts). To separate the contribution of motor activity to the LFP, all trials were split into 'still' and 'moving' trials based on the amount of motor activity. 'Still' trials had a z-scored videoME between −0.5 and 0.5, and 'moving' trials a z-scored videoME larger than 1. (c) For each session, a CSD map was constructed using either still or moving trials given the same visual stimuli. Average across n = 46 sessions (NE: 12 sessions; UST: 7; MST: 27). Visual stimuli evoked a consistent and characteristic CSD profile, with an early sink in L4 and subsequent sink–source pairs in L2/3 and L5/6, in line with earlier reports (refs. 33, 36 and 87). (d) The difference between the Vstill and Vmoving maps in c, which we interpret as mostly related to motor differences. Note how most of the motor-related CSD power is expressed after 200 ms in L2–5, predominantly in superficial and middle layers. (e) Same as c, but for auditory trials. Note how the early sinks and sources in the deep layers of the auditory CSD map in the example session of a are only partially reflected in the average. (f) Difference map of the Astill and Amoving maps in e. Note how the movement-associated CSD pattern resembles that of visual trials (d), but arises somewhat earlier in time. (g) Absolute ERP response (in μV) averaged across cortical depth for selected trial categories. The tick marks and text denote the first time bin in which the LFP response differed from baseline (−500 to 0 ms) during auditory or visual trials, irrespective of motor activity (p < 0.05, Wilcoxon signed-rank test, Bonferroni correction). These latencies closely match spiking onset latencies (Fig. 3b). The LFP response for auditory trials diverged between still and moving trials around 100 ms after stimulus onset and was significantly different after 243.4 ms (bootstrap test, n = 1,000 resamples, p < 0.05), and after 324.4 ms for visual trials (p < 0.05), suggestive of late motor-related signals. Line and shading are mean ± SEM. (h) Laminar organization of onset latencies of visual and auditory responses in V1 (spiking data, not LFP). The top histogram shows the distribution of onset latencies of all significantly auditory-responsive neurons (red) and visually responsive neurons (blue). Significance and onset latency were assessed using a binning-free algorithm, ZETA (ref. 77). Spiking onset was significantly earlier for auditory than for visual stimuli (55.3 ms (31.4–108.5 ms) versus 80.3 ms (61.5–98.5 ms); median and interquartile range; F(1,411) = 5.37, p = 0.0209), similar to our earlier population-averaged approach (Fig. 3b). The bottom panel shows each neuron's onset latency as a function of its recorded depth and cell type. Neurons that were bimodally responsive appear twice. Symbols are scaled by response magnitude. Putative pyramidal cells (broad-spiking) and putative parvalbumin-expressing cells (narrow-spiking) were classified based on their waveforms. L1 is mostly empty because almost no cells were recorded in that layer. Visually driven cells first began to fire significantly in the middle and superficial layers and later in deeper layers, consistent with the canonical sensory processing scheme (refs. 93 and 94). Auditory-evoked firing started at similar latencies across layers, with many auditory-responsive neurons in deep layers. Cortical depth was significantly correlated with spiking onset latency during visual trials (r = 0.71, p = 0.015, Pearson correlation), but not during auditory trials (r = 0.13, p = 0.696). *p < 0.05, **p < 0.01.
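Below is a minimal sketch of the kind of bootstrap divergence test reported in panel g (n = 1,000 resamples); the exact resampling scheme and any multiple-comparison handling in the paper may differ, and the example data are synthetic.

```python
import numpy as np

def divergence_latency(a, b, t, n_boot=1000, alpha=0.05, seed=4):
    """First time bin at which two sets of single-trial responses diverge:
    bootstrap the difference of trial means and find where the (1 - alpha)
    CI excludes zero. a, b: (n_trials, n_time); t: time axis in ms."""
    rng = np.random.default_rng(seed)
    diffs = np.empty((n_boot, a.shape[1]))
    for i in range(n_boot):
        ia = rng.integers(0, len(a), len(a))   # resample trials with replacement
        ib = rng.integers(0, len(b), len(b))
        diffs[i] = a[ia].mean(0) - b[ib].mean(0)
    lo, hi = np.percentile(diffs, [100 * alpha / 2, 100 * (1 - alpha / 2)], axis=0)
    sig = (lo > 0) | (hi < 0)                  # CI excludes zero
    return t[np.argmax(sig)] if sig.any() else None

rng = np.random.default_rng(5)
t = np.arange(-500, 1000, 10.0)                          # time axis in ms
still = rng.normal(0, 1, (80, t.size))
moving = rng.normal(0, 1, (90, t.size)) + (t > 250) * 0.8  # late divergence
print(divergence_latency(still, moving, t))                # ~first bin past 250 ms
```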

Extended Data Fig. 9 Auditory cortical projections modulate superficial and deep layers of primary visual cortex.

(a) Coronal section showing bilateral AC expression of AAV2-CaMKIIa-hChR2(H134R)-eYFP (green: eYFP) in a PvCre-tdTomato mouse (magenta: tdTomato), centered on primary auditory cortex (Au1). AuV: ventral secondary auditory cortex. AuD: dorsal secondary auditory cortex. V2L: lateral secondary visual cortex. (b) Same as a, but for a flattened cortical section showing ChR2-eYFP expression in AC. (c) Close-up of densely labeled projections in the medial geniculate nucleus of the thalamus (MGN), confirming infection of AC (verified in one animal). (d) Reference section with the box outlining the location of the close-up image shown in (e). (e) Close-up image of the highlighted section in d showing axonal terminals in superficial L1 and in L5/6 (observed in the animals for which coronal histology was performed, n = 2/5). (f) Schematic of the experiment verifying optogenetic excitation of AC cell bodies. (g) Raw voltage trace from an example electrode in AC during AC photostimulation (5 mW, 10-ms pulses at 20 Hz), verifying effective optogenetic recruitment of local neurons. (h) Schematic of the experiment to optogenetically stimulate AC cell bodies and record laminar LFP in V1. (i) CSD and LFP profile in V1 during AC photostimulation (average of n = 2 sessions in 2 animals). Note how pulsed AC stimulation gives rise to a repetitive CSD response (sink) in the superficial (<150 μm) and middle/deeper layers (500–800 μm). Vertical dashed lines indicate repeated AC stimulation. (j) Same as in i, but for an example visual checkerboard stimulation, for comparison. (k) The event-related potential (ERP) following photopulses (+5 to +20 ms after each pulse) increased as a function of fiber power, suggesting that optogenetic stimulation affects the V1 LFP in a dose-dependent manner. The ERP response was obtained by averaging the absolute signal from channels across all cortical depths.

Extended Data Fig. 10 Behavioral auditory dominance in conflict trials is stable, independent of performance, and depends on relative stimulus timing.

(a) As a proxy for subjective saliency or arousal, we measured pupil dilation over time for saliency-matched auditory and visual trials. The cropped image shows the pupil fit. Line and shading indicate mean ± SEM across N = 40 sessions from 9 mice. All data in this figure are from MST mice. (b) Quantification of maximal pupil dilation. The effect of modality on pupil dilation was tested in a linear mixed model with fixed effects of hit/miss, saliency and modality, and a random effect of mouse ID. Hit versus miss had the largest effect (F(1,7530) = 1138.95, p = 1.24*10−232), followed by saliency (F(1,7524) = 33.36, p = 7.97*10−9), with no effect of modality (F(1,7526) = 1.53, p = 0.2164). This supports the idea that visual and auditory conditions were matched in subjective saliency. Center line, box edges and whiskers show the 10th, 25th, 50th, 75th and 90th data percentiles. N = 931, 933, 926, 916, 1008, 936, 899 and 984 trials in the conditions from left to right. (c) Behavioral auditory dominance in an example session. Raster plots show, for each trial type, licks and rewards at the auditory and visual lick spouts (red and blue tick marks, respectively) aligned to the stimulus change (t = 0). Colored zones indicate the response window (0 to 1.5 s). Gray: inter-trial interval. Licks before t = 0 were spontaneous. Note how, during conflict trials, auditory licks and rewards dominate. (d) Dominance index (DI) heatmap (as in Fig. 6b) for the only animal (out of 17 MST mice) displaying visual dominance in its behavior. (e) Heatmap of the behavioral auditory dominance index for conditions binned based on performance (d-prime) on unimodal trials. This contrasts with the analyses presented in the main text, where conditions were grouped based on predetermined saliency gauged by psychophysical performance in previous sessions. The current analysis controls for changes in performance by reassigning each bin of the heatmap to d-prime levels within that session. Performance-matched conflict trial conditions (along the diagonal) have positive dominance index values, confirming auditory dominance. (f) The saliency-matched dominance index (smDI) for conflict trials that are matched in performance to unimodal trials (conditions along the bottom-left to top-right diagonal of e) is significantly different from zero (Wilcoxon signed-rank test, n = 17 mice, p = 0.030). The gray dot shows mean ± SEM; *p < 0.05. In (f) to (l), each dot is the smDI of one animal. (g) Auditory dominance was stable across the session: auditory dominance computed on the first and second halves of sessions was similar (Wilcoxon signed-rank test, n = 17 mice, p = 0.492). The gray dot shows mean ± SEM. (h) Dominance was not correlated with visual performance (d-prime on unimodal trials of maximal visual saliency in the same sessions; r = −0.29, p = 0.26). (i) Dominance was not correlated with auditory performance (d-prime on unimodal trials of maximal auditory saliency in the same sessions; r = 0.24, p = 0.35). (j) Dominance was not correlated with mean reaction time in visual trials (r = 0.38, p = 0.14). (k) Dominance was not correlated with mean reaction time in auditory trials (r = 0.09, p = 0.73). (l) Reaction times on visual, auditory and conflict trials (n = 8,591, 7,758 and 2,266 trials, respectively). For conflict trials, only saliency-matched conflicts are shown (Vsub + Asub, Vthr + Athr, etc.). Conflict trials were split based on choice. Mean ± SEM. (m) We varied the stimulus onset asynchrony (SOA) between auditory and visual stimulus changes during conflict trials. The plot shows the percentage of auditory choices (red), visual choices (blue) or no licks (black) during saliency-matched threshold-level conflict trials as a function of SOA. A positive SOA value means that the visual change was presented first, followed by the auditory change. Mean ± SEM across trials. (n) Purple error bars show the mean and standard deviation of the DI as a function of SOA. The black line and gray shading show a bootstrapped cumulative Gaussian fit of DI as a function of SOA (median and 95% confidence interval). The top error bar and dotted line indicate the crossover point, that is, the fitted µ parameter (median and 95% confidence interval). Auditory dominance reverses once the visual stimulus change precedes the auditory change by 89.9 ms (95% CI: 47.7–138.7 ms). This is close to the difference in reaction time between saliency-matched auditory and visual conditions: 110.5 ms on average. In other words, when the visual change preceded the auditory change by about 90 ms, auditory dominance was halfway to reversing into visual dominance. Further advancing the visual change in time completely reversed the dominance. This may reflect a scenario in which the visual evidence has instructed the decision-making system to an extent that subjects have already committed to a motor plan (namely, to lick the visual spout) before the auditory evidence can take control. A similar temporal dominance of audition over vision has been reported in humans (refs. 95, 96 and 97).
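The crossover analysis in panel n amounts to fitting a cumulative Gaussian to the DI as a function of SOA and reading off the fitted µ. Below is a sketch on synthetic data; the asymptote parameterization and initial guesses are assumptions (sign convention as in the legend: positive SOA = visual change first, positive DI = auditory dominance).

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def di_curve(soa, mu, sigma, lo, hi):
    """DI as a cumulative Gaussian of SOA. With positive SOA = visual first,
    DI falls from the auditory-dominant asymptote 'lo' at negative SOA toward
    the visual-dominant asymptote 'hi'; mu is the crossover SOA (~90 ms here)."""
    return lo + (hi - lo) * norm.cdf(soa, mu, sigma)

# Synthetic placeholder data shaped like panel n.
soa = np.array([-400, -200, -100, -50, 0, 50, 100, 200, 400], float)
di = np.array([0.9, 0.85, 0.8, 0.75, 0.6, 0.4, 0.1, -0.5, -0.8])
(mu, sigma, lo, hi), _ = curve_fit(di_curve, soa, di, p0=(90, 100, 0.9, -0.9))
# Bootstrapping trial-level data would yield the reported 95% CI on mu.
```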

Source data


Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Oude Lohuis, M.N., Marchesi, P., Olcese, U. et al. Triple dissociation of visual, auditory and motor processing in mouse primary visual cortex. Nat Neurosci 27, 758–771 (2024). https://doi.org/10.1038/s41593-023-01564-5

