How the brain selects appropriate sensory inputs and suppresses distractors is unknown. Given the well-established role of the prefrontal cortex (PFC) in executive function [1], its interactions with sensory cortical areas during attention have been hypothesized to control sensory selection [2-5]. To test this idea and, more generally, dissect the circuits underlying sensory selection, we developed a cross-modal divided-attention task in mice that allowed genetic access to this cognitive process. Optogenetically perturbing PFC function within a temporally precise window diminished the ability of mice to select appropriately between conflicting visual and auditory stimuli. Equivalent sensory thalamocortical manipulations showed that behaviour was causally dependent on PFC interactions with the sensory thalamus, not sensory cortex. Consistent with this notion, we found that neurons of the visual thalamic reticular nucleus (visTRN) exhibit PFC-dependent changes in firing rate predictive of the modality selected. Bidirectional optogenetic manipulations of this subnetwork confirmed that visTRN activity was causal to task performance. Using a combination of electrophysiology and intracellular chloride photometry, we demonstrated that visTRN dynamically controls visual thalamic gain through feedforward inhibition. Our experiments introduce a new subcortical model of sensory selection, in which the PFC biases thalamic reticular subnetworks to control thalamic sensory gain, selecting appropriate inputs for further processing.
We thank J. A. Movshon, W. Ma, R. W. Tsien, G. Fishell and D. Rinberg for helpful comments on the manuscript, and G. J. Augustine for providing us with the SuperClomeleon construct and for helpful discussion around its use. The work was supported by the Swiss National Science Foundation (P2LAP3 151786) to R.D.W., and by the Simons Foundation, the Sloan Foundation, the Brain and Behavior Research Foundation and the US National Institutes of Health (R00 NS078115) to M.M.H.; M.M.H. is additionally supported by the Feldstein Medical Foundation, a Klingenstein-Simons Fellowship and a Biobehavioral Research Award for Innovative New Scientists (BRAINS) R01 (R01 MH107680) from the National Institute of Mental Health.
Extended data figures
Three trials are shown: the first is ‘attend to vision’ (left selection), the second is ‘attend to audition’ (left selection), and the third is ‘attend to vision’ (right selection). All trials are shown at normal speed and in slow motion. The video illustrates the mechanics of the task and the impact of context (cueing) on selection.