Abstract
The mechanistic link between neural circuit activity and behavior remains unclear. While manipulating cortical activity can bias certain behaviors and elicit artificial percepts, some tasks can still be solved when cortex is silenced or removed. Here, mice were trained to perform a visual detection task during which we selectively targeted groups of visually responsive and co-tuned neurons in L2/3 of primary visual cortex (V1) for two-photon photostimulation. The influence of photostimulation was conditional on two key factors: the behavioral state of the animal and the contrast of the visual stimulus. The detection of low-contrast stimuli was enhanced by photostimulation, while the detection of high-contrast stimuli was suppressed, but crucially, only when mice were highly engaged in the task. When mice were less engaged, our manipulations of cortical activity had no effect on behavior. The behavioral changes were linked to specific changes in neuronal activity. The responses of non-photostimulated neurons in the local network were also conditional on two factors: their functional similarity to the photostimulated neurons and the contrast of the visual stimulus. Functionally similar neurons were increasingly suppressed by photostimulation with increasing visual stimulus contrast, correlating with the change in behavior. Our results show that the influence of cortical activity on perception is not fixed, but dynamically and contextually modulated by behavioral state, ongoing activity and the routing of information through specific circuits.
Introduction
The perception of a sensory stimulus is modulated by the behavioral state in which it is experienced – the likelihood of successfully detecting a stimulus increases during periods of arousal or attention1,2. The neural representation of the same sensory stimulus is also modulated by behavioral state3,4,5 – stimulus-evoked responses are typically enhanced when subjects are alert. The behavioral state of an animal is a latent variable and can be inferred in a number of ways. Firstly, the arousal of an animal is thought to be under the control of neuromodulatory activity primarily arising from the locus coeruleus6,7, which can influence the dilation and constriction of the pupil8, with high mental effort and arousal associated with larger pupil sizes9. A second manifestation of behavioral state is the synchronization of neural activity3,10. At opposite extremes, sleep is associated with synchronized activity, while wakefulness is associated with desynchronization. The modulation of cortical responses by behavioral state11,12,13,14, task outcome15, and task demands16,17 has been extensively investigated. However, how the modulation of cortical activity by state or stimulus corresponds to the influence of the cortex on behavior has largely been studied using only correlational methods18,19. While the correlational approach has yielded important insights, the absence of cellular-level manipulations of the activity patterns means that a cause-and-effect relationship between the observations cannot be established.
Classical experiments have shown that electrically stimulating specific cortical areas can bias sensory perception20,21,22,23,24,25,26 and elicit artificial percepts27,28,29,30. Optogenetic stimulation of cortex has confirmed and extended these findings31,32,33. In all of these experiments the functional identity of activated neurons was largely unknown, as was the number of activated cells. With the advent of new techniques that enable activation of a known number of functionally characterized cells, a view is emerging that perception can be initiated, or biased by, a small number of specific neurons34,35,36,37,38,39,40. However, challenging even the basic requirement of cortical neurons to solve some tasks, reversible silencing41,42,43,44,45 or permanent lesioning46,47,48,49 of cortex have produced contradictory findings about the necessity of cortical activity for perception and behavior50,51,52.
Consequently, we lack a clear mechanistic link between cortical activity, how it engages local and downstream circuits, and ultimately how and when that activity influences a behavior of interest. To reconcile these disparate results and begin building a complete picture of how and when cortical activity leads to perception, we need to investigate the perceptual influence of specific patterns of neural activity during the processing of different stimuli in a variety of behavioral tasks and states.
To probe the influence of stimulus-relevant patterns of cortical activity on local network activity and behavior, we performed two-photon population calcium imaging of a volume of L2/3 V1 while simultaneously using two-photon holographic optogenetics53,54,55,56,57 to activate specific groups of neurons in mice performing a visual contrast-varying detection task. We targeted groups of neurons based on their tuning for visual stimuli to ask whether increased neural activity in relevant neuronal ensembles leads to increased behavioral detection. This in vivo all-optical approach58,59,60,61,62,63,64,65,66,67 also allowed us to assess the functional influence of the stimulated cells on the local network. Importantly, behavioral state was continuously monitored by measuring pupil size and neuronal synchrony and was used to index each behavioral session into states of higher and lower engagement, allowing us to investigate the impact of additional cortical activity during these different behavioral states. We found that photostimulation of task-relevant cells – defined as cells that preferentially respond to the orientation of the visual stimulus selected for the detection task – impacted the non-photostimulated local network in a way that depended on the functional identity of the cells and the contrast of the visual stimulus. When the visual stimulus contrast was high, responses of neurons functionally similar to the photostimulated ensemble were more strongly suppressed than other cells. When the contrast was low, these neurons were suppressed less than other cells. The effect of photostimulation on the animal’s behavioral report followed a similar pattern, whereby the detection of high contrast stimuli was suppressed, but the detection of low contrast stimuli was enhanced. This behavioral effect, and the linking of cortical activity to behavior, was only present when mice were most engaged in the task.
A gradual decrease in task engagement transitioned mice from a state where additional cortical activity bidirectionally influenced their behavior to one where such activity had no reliable impact.
Results
Behavioral state and task performance
To allow all-optical interrogation in visual cortex of mice performing a visually guided behavior, we co-expressed the calcium sensor GCaMP6s68,69,70 with the excitatory, somatically-restricted opsin C1V171,72 in pyramidal cells of L2/3 V1. Mice were head-fixed and trained to perform a visual stimulus detection task (Fig. 1a, b) where after withholding licks for a random interval, a water reward could be obtained for successfully licking to report the appearance of a small drifting grating patch of randomized orientation (Fig. 1c, d). Mice learned the task quickly with a maximal contrast stimulus, reaching a high level of stable performance within days (Supplementary Fig. 1). Lowering the stimulus contrast reduced performance on the task (Fig. 1e, Supplementary Fig. 1). To assess the relationship between cortical activity, behavior and behavioral state, we recorded the animal’s pupil size and neuronal synchrony throughout the behavioral session as a measure of arousal or alertness13,73,74. We observed a significant correlation between the size of the pupil and the degree of neuronal synchrony (Supplementary Fig. 2). We found that on threshold-stimulus trials (the stimulus contrast at which animals detected the stimulus on approximately half the trials), successful detections (hits) were associated with a more dilated pupil and lower neural synchrony in the period before the stimulus was presented (Fig. 1f, g; pupil diameter (normalized relative to the median of the session) on hits: +2.5 ± 10.8% vs misses: −6.6 ± 13.7%, P = 0.006 Wilcoxon signed-rank test; neuronal synchrony (average pairwise Pearson’s correlation coefficient) on hits: 0.0014 ± 0.0009 vs misses: 0.0024 ± 0.0021, P < 0.001, Wilcoxon signed-rank test). Based on this, we defined two behavioral states: one associated with a large pupil size and low neuronal synchrony, and the other with a smaller pupil size and higher neuronal synchrony (Fig. 1h). 
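For concreteness, the two pre-stimulus state indices used here (pupil diameter normalized to the session median, and neuronal synchrony as the average pairwise Pearson correlation) could be computed from trial-aligned arrays roughly as follows. This is an illustrative sketch, not the authors' analysis code; the array names and shapes are assumptions.

```python
import numpy as np

def trial_state_indices(pupil, dff, prestim):
    """Index each trial by pre-stimulus pupil size and neuronal synchrony.

    pupil:   (n_trials, n_frames) pupil-diameter traces
    dff:     (n_trials, n_cells, n_frames) calcium traces
    prestim: slice covering the pre-stimulus window
    """
    # Pupil diameter normalized relative to the session median, as in the text
    pre_pupil = pupil[:, prestim].mean(axis=1)
    norm_pupil = (pre_pupil - np.median(pre_pupil)) / np.median(pre_pupil)

    # Neuronal synchrony: average pairwise Pearson correlation between
    # cells during the pre-stimulus window of each trial
    sync = np.empty(len(dff))
    for t, trial in enumerate(dff):
        r = np.corrcoef(trial[:, prestim])
        pairs = np.triu_indices_from(r, k=1)  # each cell pair counted once
        sync[t] = np.nanmean(r[pairs])
    return norm_pupil, sync
```

A median split on these two indices would then label each trial as belonging to the more- or less-engaged state.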
We split each session into these two states, indexing each trial by the pupil size and neuronal synchrony in the period before the stimulus appeared, and then compared the resulting psychometric curves between the two states (Fig. 1i). We found that performance was higher in the state with larger pupil size and lower neuronal synchrony – corresponding to a more-engaged state – consistent with previous reports4,75,76,77 (Fig. 1i, d-prime averaged across all contrasts in the more-engaged state vs the less-engaged state: 1.52 ± 1.13 vs 1.32 ± 1.05, P = 0.0002 Wilcoxon signed-rank test). The more engaged state was also characterized by a greater detection sensitivity (Fig. 1j, width of psychometric curve in the more-engaged state vs the less-engaged state: 7.5 ± 8.3 vs 15.0 ± 15.2, P = 0.012 Wilcoxon signed-rank test) and lower stimulus contrast threshold (Fig. 1j, threshold of psychometric function in the more-engaged state vs the less-engaged state: 3.88 ± 1.47 vs 4.59 ± 2.11, P = 0.042 Wilcoxon signed-rank test) reflecting greater arousal or engagement in the task75,78,79. Importantly, we note that the less-engaged state does not correspond to a completely disengaged state, as mice were still performing the task to a high level. This subtle but clear distinction provides the opportunity to quantitatively compare conditions under which cortical activity may or may not play a role in shaping simple behaviors.
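As an illustration of the quantities compared above, the psychometric width and threshold and the d-prime measure can be sketched as below. The error-function parameterization with separate guess and lapse rates is an assumption (the paper's exact fitting function is not specified), and the coarse grid search stands in for a proper optimizer such as scipy's curve_fit.

```python
import numpy as np
from math import erf
from statistics import NormalDist

def psychometric(c, threshold, width, guess, lapse):
    """Error-function psychometric: P(lick) as a function of contrast c."""
    z = (np.asarray(c, dtype=float) - threshold) / width
    return guess + (1 - guess - lapse) * 0.5 * (1 + np.vectorize(erf)(z))

def fit_psychometric(contrasts, hit_rates, guess, lapse):
    """Least-squares grid search over threshold and width (illustrative;
    guess/lapse are assumed estimated separately from catch and max trials)."""
    best, best_err = None, np.inf
    for t in np.linspace(0.0, max(contrasts), 201):
        for w in np.linspace(0.1, 50.0, 200):
            err = np.sum((psychometric(contrasts, t, w, guess, lapse)
                          - hit_rates) ** 2)
            if err < best_err:
                best, best_err = (t, w), err
    return dict(threshold=best[0], width=best[1])

def dprime(hit_rate, fa_rate, clip=0.01):
    """Signal-detection d' = z(hit) - z(false alarm), with rate clipping."""
    z = NormalDist().inv_cdf
    return (z(min(max(hit_rate, clip), 1 - clip))
            - z(min(max(fa_rate, clip), 1 - clip)))
```

A wider fitted psychometric curve corresponds to lower detection sensitivity, as used in the comparison between the two engagement states.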
Behavioral effects of targeted photostimulation
To test the influence of activity in a stimulus-encoding population of neurons in V1 during different behavioral states, we targeted multiple cells for two-photon optogenetic photostimulation while recording the resulting neuronal and behavioral changes. We first identified the retinotopically appropriate and co-expressing field of view and then mapped the visual stimulus-responsive and photostimulation-responsive neurons (Fig. 2a, b). We then selected the maximum number of photostimulation-responsive neurons that shared the same visual stimulus orientation preference (median ~20 neurons, Fig. 2c–e). The orientation of the visual stimulus for the detection task was chosen to match the orientation preference of this neuronal ensemble. While the mice were performing the contrast-varying detection task, we photostimulated these co-tuned ensembles on a random subset of trials with and without concurrent visual stimulus presentation (Figs. 1c, d and 2f). Intriguingly, we observed a reliable effect of photostimulation on behavior only when the animal was more engaged. In this large pupil, low synchrony state, targeted photostimulation of ~20 neurons enhanced the detection of lower contrast visual stimuli, while the same photostimulus suppressed the detection of higher contrast visual stimuli (Fig. 2f, h; width of psychometric curve in the less engaged state without vs with photostimulation: 15.0 ± 15.2 vs 13.3 ± 32.5, P = 0.130 Wilcoxon signed-rank test. Width of psychometric curve in the more engaged state without vs with photostimulation: 7.5 ± 8.3 vs 25.4 ± 51.6, P = 0.0015 Wilcoxon signed-rank test. The more-engaged state vs the less-engaged state, repeated measures ANOVA, interaction of contrast and state P = 0.009). Taken together, photostimulation acted to widen the psychometric function, reducing sensitivity80 (Fig. 2g; change in width in the more-engaged state vs the less-engaged state: 17.9 ± 49.2 vs −1.8 ± 35.8, P = 0.036 Wilcoxon signed-rank test) without changing the threshold (change in threshold in the more-engaged state vs the less-engaged state: −0.2 ± 1.6 vs −0.4 ± 2.4, P = 0.982 Wilcoxon signed-rank test). In other words, increasing the activity of stimulus-responsive neurons when the animal was most engaged in the task enhanced the detection of subthreshold stimuli, but suppressed the detection of suprathreshold stimuli (see also Supplementary Fig. 3). When the animal was less engaged (but still performing the task), we did not observe a reliable effect of cortical stimulation on behavior. Thus, behavioral state gates the effect of photostimulation, and photostimulation has a bidirectional effect depending on the contrast of coincident visual stimulation. Overall, this suggests that primary visual cortex serves different roles depending on stimulus regime and the state of the animal.
Network effects of targeted photostimulation
To explore the circuit mechanisms underlying the behavioral effects of photostimulation, we investigated the influence of photostimulation on downstream neurons. We used two-photon calcium imaging to simultaneously measure the activity in the local circuit during behavior with and without photostimulation in the two behavioral states. First, in the visual stimulus-only trials, we observed that increasing contrast increased population activity (Fig. 3a). The population average visually-evoked responses were larger in the more engaged state (Fig. 3b; average response magnitude in less-engaged vs more-engaged state: 0.029 ± 0.015 vs 0.031 ± 0.018, P < 0.0001 Wilcoxon signed-rank test; ref. 74), similar to the response modulation during locomotion12,14. These enhanced neural responses were strongest at 10% stimulus contrast and reflect the enhanced behavioral detection of the same stimuli (Fig. 1i). This neural enhancement in the engaged state held even when controlling for the different proportions of hit and miss trials (and thus the related motor or reward confounds) between the engagement states.
Does this enhanced excitability in the more engaged state underpin the behavioral-state gating of the photostimulation effect? We first analyzed how targeted photostimulation engages the local circuitry (Fig. 3c). We examined the responses of either the directly targeted (Fig. 3d) or the non-targeted (background) subpopulations (Fig. 3e) on trials with and without photostimulation, across the different contrasts and behavioral states. As before, we observed increasing activity in both populations as the visual stimulus contrast increased, with more activity on average in the engaged state. When the target cells were photostimulated, their activity was considerably enhanced, as expected. In addition, a behavioral state dependence was observed: photostimulation was more effective in the engaged state, with larger photostimulation-evoked responses in the targeted cells (Fig. 3d; change in activity in target cells on less-engaged vs more-engaged trials: 0.68 ± 0.19 vs 0.81 ± 0.22, P < 0.0001 Wilcoxon signed-rank test). Conversely, the average activity of the non-targeted background cells was suppressed by photostimulation, but we observed no difference in the level of suppression between the states (Fig. 3e; activity in background cells on less-engaged vs more-engaged trials: −0.005 ± 0.008 vs −0.005 ± 0.007, P = 0.231 Wilcoxon signed-rank test) suggesting that the photostimulated cells recruit similar levels of local inhibition regardless of behavioral state. We investigated the effect of titrating the number of stimulated cells and observed that increasing the number of directly stimulated cells increased the amount of suppression of other cells in the network (Supplementary Fig. 5; ref. 34). In more engaged states the excitatory/inhibitory (E/I) balance is shifted towards excitation81, consistent with the enhanced target-cell response to direct photostimulation observed in our data.
However, the predominant effect of background suppression, with no difference between engagement states, confirms the dominance of inhibitory connections in cortical circuitry82,83,84,85,86,87,88,89, revealing highly effective stabilization of network activity in either state, despite the E/I balance being shifted towards excitation.
To characterize the spatial spread of responses through the local network caused by photostimulation we constructed a photostimulation-triggered spatial average of the change in activity in all cells across all photostimulation trials aligned to the nearest target stimulation site. This analysis positions the directly targeted cells at the center of the map and the other cells in the local network relative to them (Fig. 3f). The activity-enhanced cells were localized in a narrow zone around each target site, corresponding to direct photostimulation, as well as potential synaptic recruitment of other cells (Supplementary Fig. 8). We observed a pronounced annulus of suppressed cells around the directly targeted cells (Fig. 3f) producing a center-surround motif of enhancement and suppression. No difference in spatial profile of the network influence of photostimulation was seen between behavioral states.
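A radial (one-dimensional) version of this photostimulation-triggered spatial average could be computed as below. This is a sketch only; the coordinates, bin edges and the single-number activity measure per cell are illustrative assumptions, and the paper's analysis produces a full 2-D map.

```python
import numpy as np

def stim_triggered_spatial_profile(cell_xy, target_xy, delta_activity,
                                   bin_edges_um):
    """Average photostimulation-evoked change in activity as a function of
    each cell's distance to its nearest photostimulation target site.

    cell_xy:        (n_cells, 2) cell coordinates in microns
    target_xy:      (n_targets, 2) target site coordinates in microns
    delta_activity: (n_cells,) photostim minus no-photostim response
    Returns distance-bin centers and the mean change per bin.
    """
    # Distance from every cell to its nearest target site
    d = np.linalg.norm(cell_xy[:, None, :] - target_xy[None, :, :], axis=2)
    nearest = d.min(axis=1)

    idx = np.digitize(nearest, bin_edges_um) - 1
    n_bins = len(bin_edges_um) - 1
    profile = np.array([delta_activity[idx == b].mean()
                        if np.any(idx == b) else np.nan
                        for b in range(n_bins)])
    centers = 0.5 * (bin_edges_um[:-1] + bin_edges_um[1:])
    return centers, profile
```

In this framing, the center-surround motif described above appears as positive values in the innermost bins and a negative annulus at intermediate distances.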
These findings suggest that the contrast-dependent behavioral effect of stimulating L2/3 V1 neurons (Fig. 2f–h) cannot be explained simply by the influence stimulated neurons have on the overall activity level of the local circuit, because the change in overall activity (pooled over target and background populations) across contrasts and states remains relatively constant (Supplementary Fig. 6).
Functional similarity defines the network effects of targeted photostimulation
It is known that functionally similar neurons are more highly interconnected72,88,90,91, forming subnetworks which may facilitate computations to enhance the detection or discrimination of sensory stimuli92. To ask if different subnetworks of neurons respond differently across contrasts, we characterized the functional similarity of each recorded neuron relative to the photostimulated cell population. We defined similarity as the correlation of a neuron’s contrast-response curve with the average contrast-response curve of all target neurons. This contrast-response curve describes how neurons respond to the behaviorally relevant visual stimulus (a drifting grating of fixed orientation) presented at increasing contrasts; discriminating the orientation of the stimulus was behaviorally irrelevant in our task. The contrast-response curve of a given neuron is linked to the visual stimulus responsivity of each cell and implicitly linked to the orientation preference (neurons preferring the displayed orientation will respond more strongly). Using the similarity of contrast curves between a background cell and the target neurons, we found that background cells which are more similar to target neurons in terms of their contrast-response curves are also more similar to target neurons in terms of their visual response magnitude, trial-by-trial correlation and their orientation tuning curves (Supplementary Fig. 7A–D). We therefore refer to the contrast curve correlation between background and target cells, which captures intrinsic contrast responses as well as potential indirect network interactions, as our measure of behaviorally relevant functional similarity.
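In code, this similarity measure is simply a Pearson correlation of each background cell's contrast-response curve against the mean target-cell curve. A minimal sketch, assuming cells-by-contrasts arrays of trial-averaged responses:

```python
import numpy as np

def contrast_curve_similarity(bg_curves, target_curves):
    """Pearson correlation of each background cell's contrast-response
    curve with the mean curve of the photostimulated target cells.

    bg_curves:     (n_background, n_contrasts) mean response per contrast
    target_curves: (n_targets, n_contrasts)
    Returns an (n_background,) array of similarities in [-1, 1].
    """
    ref = target_curves.mean(axis=0)
    ref = ref - ref.mean()
    bg = bg_curves - bg_curves.mean(axis=1, keepdims=True)
    num = bg @ ref  # covariance-like term, one value per background cell
    den = np.linalg.norm(bg, axis=1) * np.linalg.norm(ref)
    return num / den
```

Cross-validation as described in the text would estimate these curves from one half of the trials and measure visual and photostimulation responses on the other half.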
The functional similarity of background cells determines their response to target cell photostimulation. Using our measure of functional similarity we binned cells into groups of increasing similarity to the target population. We then plotted the response of each group as a function of visual contrast for visual-stimulation-only trials (Fig. 3g, left), for visual-and-photostimulation trials (Fig. 3g, middle) and the difference between the two (Fig. 3g right). Importantly, to cross-validate, we measured functional similarity using one half of all trials and measured visual and photostimulation responses using the other half. Sorting cells in this way again reveals increasing neural response magnitude as the stimulus contrast increases (Fig. 3g, left), with cells functionally similar to the target cells responding positively to the visual stimuli and functionally dissimilar cells responding negatively. However, when looking at the change in activity caused by photostimulation – which was suppression on average across the whole background population – we observed a strong stimulus contrast dependence to the suppression of the most functionally similar neurons. Neurons that are most similar to the targeted neurons, which are more responsive to the behaviorally relevant visual stimulus, were increasingly more suppressed by photostimulation as visual contrast increased, whereas functionally dissimilar neurons were not (Fig. 3g right, h). As a result, in addition to a significant effect of contrast on the photostimulation response, there was also a statistically significant interaction between visual contrast and functional similarity (2-way ANOVA grouped by contrast and similarity: effect of contrast F(4) = 11.9, P < 0.001; effect of similarity F(19) = 2.9, P < 0.001; interaction of contrast and similarity F(76) = 2.5, P < 0.001). 
To summarize these effects, we next measured the slope of the photostimulation-induced change in activity as a function of contrast and plotted it against normalized similarity (Fig. 3i; linear fit R² = 0.94, P < 0.001). This analysis showed that photostimulation induces a change in activity that depends systematically on how similar a neuron is to the targeted population. The more similar the responding neuron is to the targeted population, the more effective photostimulation is in suppressing it at high contrast. We observed similar contrast- and functional-identity-dependent patterns of influence when the similarity metric is contrast response similarity as above, orientation tuning curve similarity or the magnitude of visual responses (Supplementary Fig. 7E–J).
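The slope-versus-similarity summary could be computed as follows. This is a sketch under assumed inputs: per-similarity-bin mean changes in activity at each contrast, and one similarity value per bin.

```python
import numpy as np

def effect_slopes(delta_by_contrast, contrasts):
    """Least-squares slope of the photostimulation-induced change in
    activity vs contrast, one slope per similarity bin.

    delta_by_contrast: (n_bins, n_contrasts) mean change per bin and contrast
    contrasts:         (n_contrasts,)
    """
    x = contrasts - contrasts.mean()
    y = delta_by_contrast - delta_by_contrast.mean(axis=1, keepdims=True)
    return y @ x / (x @ x)

def slope_vs_similarity(slopes, similarity):
    """Linear fit of slope against bin similarity; returns ((m, b), R^2)."""
    coef = np.polyfit(similarity, slopes, 1)
    resid = slopes - np.polyval(coef, similarity)
    r2 = 1 - np.sum(resid ** 2) / np.sum((slopes - slopes.mean()) ** 2)
    return coef, r2
```

A strongly negative slope for the most similar bins, with a high R² for the linear fit, corresponds to the pattern reported in Fig. 3i.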
Linking network activity to behavior
To relate the pattern of photostimulation-mediated changes in network activity (Fig. 3g, h) to the behavioral changes (Fig. 2h), we focused on the threshold and threshold-adjacent contrast levels (reasoning that performance on the lowest contrast was unaffected by photostimulation because it is far below the detection limit, and that performance on the maximum contrast was unchanged because it is saturated). Since we showed that the effect of photostimulation on network activity depended on the functional identity of the cells in the network, we reasoned that any relationship between the network effects and behavioral effects of photostimulation must also depend on functional identity. We therefore examined each group of neurons characterized by their functional similarity individually (Fig. 4a, left). For example, we collected across all sessions the group of neurons most functionally similar to the targeted neurons and plotted the effect of photostimulation on their activity against the effect of photostimulation on behavior from those sessions (Fig. 4a, top right). We then measured the correlation of these network effects and behavioral effects, and refer to this measurement as the neural-behavioral coupling. We then repeated this analysis for each functional similarity group and plotted the neural-behavioral couplings as a function of similarity (Fig. 4a, bottom right; Fig. 4b, orange data). We performed the same analysis for data obtained in the less-engaged state (Fig. 4b, blue data). We found that the neural-behavioral coupling of a given population of neurons was systematically related to the functional similarity of those neurons with the photostimulated neurons, with the coupling between neural and behavioral effects increasing as the functional similarity to the target neurons increases (mean correlation coefficient across resamples r = 0.19, P < 0.001 with respect to the shuffled distribution).
Importantly, the same was not true of data obtained during the less-engaged state (mean correlation coefficient across resamples r = −0.04, P = 0.279 with respect to the shuffled distribution). This indicates that the network effects of exogenous cortical activation can explain behavioral effects only when activation is performed during engaged states, and only when one interrogates the appropriate task-relevant neurons.
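A hedged sketch of the per-group neural-behavioral coupling statistic and its shuffle-based p-value follows; the paper's resampling scheme may differ in detail, and the session-level arrays are assumed inputs.

```python
import numpy as np

def neural_behavioral_coupling(net_effect, behav_effect,
                               n_shuffles=1000, seed=0):
    """Correlation across sessions between the network effect of
    photostimulation (for one similarity group) and the behavioral effect,
    with a permutation-based p-value against a shuffled null.

    net_effect, behav_effect: (n_sessions,) arrays
    """
    r = np.corrcoef(net_effect, behav_effect)[0, 1]
    rng = np.random.default_rng(seed)
    # Null distribution: break the session pairing by permuting one array
    null = np.array([
        np.corrcoef(rng.permutation(net_effect), behav_effect)[0, 1]
        for _ in range(n_shuffles)])
    p = (np.sum(np.abs(null) >= np.abs(r)) + 1) / (n_shuffles + 1)
    return r, p
```

Running this separately for each functional-similarity group, in each engagement state, yields the coupling-versus-similarity curves compared in Fig. 4b.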
Discussion
We demonstrate that the causal influence of augmenting task-related activity in L2/3 of mouse V1 depends on behavioral state: exogenous stimulation only translates into an effect on behavior when the animal is highly engaged in the task at hand. When the animal is less engaged, the additional cortical activity has no consistent influence, perhaps suggesting that a different population of neurons is relied upon to solve the task in this state. We also show that the artificially generated patterns of cortical activity can either help or impair detection of the visual stimulus, depending on the strength of the visual stimulus with which the perturbation coincides. In other words, the activity patterns evoked by photostimulation are altered by interaction with activity patterns evoked by sensory stimulation. These findings provide new insights into how cortex can have a flexible impact on sensory-guided behavior.
Why, and how, is the influence of cortical activity gated by behavioral state? In primates, turning attention to a particular location of visual space has two effects: enhanced sensory-evoked neural responses, and improved behavioral performance93. However, it is unclear whether the enhanced neural responses reflect or drive the perceptual improvement. Our results provide causal evidence that L2/3 of mouse V1 determines the perception of a stimulus only when mice are in an engaged state. When the mice are less engaged (but still performing the task) they may resort to a more reflexive, and perhaps sub-cortical, strategy. This gating may be implemented by relative weighting of the effect of visual cortical activity on higher cortical or sub-cortical areas, perhaps through neuromodulatory circuits6. One possible candidate is the superior colliculus (SC), which is involved in processing visual information and coordinating motor output94,95. The SC and visual cortex bidirectionally interact96, with V1 projecting prominently to the SC97, modulating visually evoked responses there98 and behaviors controlled by it99,100,101,102. Indeed, inactivation of mouse SC impacts perceptual behavior103. Alternatively, gating may be implemented by differentially weighting activity arising from cortical or sub-cortical sources on a common downstream target, for example the pulvinar nuclei of the thalamus104. We observed task-related activity in cortex in the disengaged state, but our perturbation results suggest that this activity does not propagate as effectively or is not integrated as strongly downstream as in the engaged state. The more synchronized cortical oscillations in the less engaged state may impede the transmission of information between areas105,106.
Interestingly, we did not observe a state-dependence of the local network response to photostimulation – the resulting suppression of activity was similar across the behavioral states, reflecting highly effective synaptic recruitment of local inhibition83,84,86 which dominates the network response89 despite the E/I balance being shifted towards excitation in more engaged states81 (see also Fig. 3b). However, when the animal was more engaged in our task, neurons were activated more strongly by the visual stimulus (consistent with results from primates attending to a visual stimulus107,108,109), and the target cells themselves were more excited by the direct photostimulation. Therefore, in the task-engaged state, the photostimulated target cell activity and the activity of other cells that escaped local inhibition must be more effectively transmitted further downstream to unobserved areas110. Our results should inspire future work to investigate how activity propagates from L2/3 to the L5 output network and from there ultimately to other cortical, and sub-cortical, areas in different behavioral states.
What is the link between the network and the behavioral effects of photostimulation? In broad terms, as the behavioral effect of photostimulation became more suppressive, so did the network effects. While the behavioral effect at low contrast was facilitating, the local network effects were still negative on average, indicating that the strong direct drive of target cells outweighs the suppression of local or other unobserved downstream neurons. As the visual contrast increases the behavioral effect of photostimulation becomes negative, mirrored by more suppression of local neurons, suggesting that the relative contribution of the directly stimulated target cells is overridden by the greater levels of suppression in the network. Our finding that functionally similar neurons are more strongly affected by photostimulation as contrast increases, and that their activity is most strongly related to behavior, indicates that mapping the routing of information through functionally specific circuits is crucial to explaining behavior. Mechanistically, this effect may be mediated by functionally specific inhibitory connectivity111. Our core results on the network effects of targeted photostimulation (Fig. 3g–i) are consistent with a previously described like-suppresses-like motif72. Depending on how we divide background cells into subpopulations, we see varying degrees of facilitation of functionally similar cells at low contrasts (Supplementary Fig. 7E–J) as in other studies where predominantly excitatory network effects were observed37,38. Taken together our results align with predictions made by a recent modeling study112 simulating the effect of optogenetic perturbations in a realistic model of visual cortex with a range of stimulus contrasts. In terms of the behavioral relevance of visual encoding, when the stimulus contrast is high, cortex may act to reduce redundancy through like-to-like suppression and ensure an efficient code111,113,114,115,116,117,118.
Additional stimulation in this condition may go beyond reducing redundancy to suppressing behaviorally useful information, leading to a reduction in performance.
Our results are consistent with a dynamic allocation of cortical resources according to attention-like behavioral states119 and thus highlight a modulatory role of cortex in a learned behavior likely also served by subcortical pathways. This perspective helps to reconcile apparently contradictory findings about the role of the cortex in behavioral tasks42,43,49,50,51. Our results are complementary to recent findings from lesion and silencing experiments50,120 suggesting that some fully learned tasks are no longer cortically dependent, but go beyond those studies in showing that the influence of cortex can change even on very short timescales. Such short time-scale modulation by internal state is reminiscent of attentional processes, which may rely on mechanisms similar to what we study here, to modify the influence of cortex over behavioral output. This gating of cortical influence on behavior by attentional or behavioral states demonstrates that the causality of cortical activity depends on behavioral context and reminds us that no brain region acts in isolation121,122,123. Accounting for stimulation-induced suppression and state-dependent gating of downstream effects as described here will likely be important for the successful operation of future brain machine interfaces (BMIs).
Methods
All experimental procedures were carried out under license from the UK Home Office in accordance with the UK Animals (Scientific Procedures) Act (1986).
Animal preparation
We used transgenic GCaMP6s mice (Emx1-Cre;CaMKIIa-tTA;Ai9469) of both sexes aged between P49 and P67 (at time of surgery). Doxycycline treatment in drinking water from birth to P49 prevented interictal activity in the Ai94 mouse line124. Animals were kept on a 12 hr light/dark cycle at a temperature of 22 °C and 62% humidity. To prepare the mice for all-optical experiments, we first excised the scalp and implanted a metal headplate. We then removed the skull and dura overlying visual cortex, injected virus encoding the opsin and implanted a chronic cranial imaging window in place of the skull. Sterile procedures were used throughout. Before surgery, mice were given a subcutaneous injection of 0.3 mg/mL buprenorphine hydrochloride (Vetergesic) and anaesthetized with isoflurane (5% for induction, 1.5% for maintenance). The scalp above the dorsal surface of the skull was removed and an aluminum headplate with a 7 mm diameter circular imaging well was fixed to the skull centered over the right monocular primary visual cortex (2.5 mm lateral and 0.5 mm anterior from lambda) using dental cement. A 4 mm diameter craniotomy was drilled inside the well of the headplate, and the dura was then carefully removed. A calibrated pipette bevelled to a sharp point (inner diameter ~15 μm) connected to a hydraulic injection system (Harvard Apparatus) was used to inject small volumes of virus (AAV2/9-CaMKII-C1V1(t/t)-mRuby2-Kv2.1). The virus (stock concentration: ~6.9 × 10¹⁴ gc/ml) was diluted 10-fold in buffer solution (20 mM Tris, pH 8.0, 140 mM NaCl, 0.001% Pluronic F-68). We made ~5 insertions of the injection pipette, each site spaced by ~300 μm avoiding blood vessels. At each site we slowly lowered the pipette to a depth of 300 µm below pia and injected 150 nl of the virus solution at 50 nl/min. After each injection the pipette was left in place for a further 3 min before slowly retracting.
We then press-fit a chronic window (a 3 mm coverslip bonded to the underside of a 4 mm coverslip with UV-cured optical cement, NOR-61, Norland Optical Adhesive) into the craniotomy, sealed with cyanoacrylate (Vetbond) and fixed in place with dental cement (SuperBond). Following surgery, animals were monitored and allowed to recover for at least 7 days. After recovery we began behavioral training. All-optical experiments were then performed >3 weeks post-surgery, allowing for sufficient expression levels (animals were aged P77–P171, median = P122 at time of experiments).
Behavioral training
We used an operant conditioning protocol whereby head-fixed mice were required to lick at a water spout positioned in front of them to report detection of a visual stimulus. Licks were recorded electrically. If the mice reported the presence of the stimulus correctly, a sugar water reward (10% w/v sucrose) was delivered through the water spout. The behavior hardware was controlled by custom software (PyBehaviour, https://github.com/llerussell/PyBehaviour) interfacing with an Arduino to trigger stimuli, record licks and deliver rewards. Mice had free access to food in their home cage, but access to water was limited to that acquired during the task. Mice had their weight monitored before and after daily training and were supplemented with additional water to maintain a minimum of 80% of their starting body weight. Before training, mice were habituated to handling and head restraint over 2 days. Training then took place in individual sound-dampened enclosures in which the mice were head-fixed and allowed to run on a treadmill. While not an integral part of the task design, we found that allowing mice to run freely improved their performance in the task. Trials were triggered after mice withheld licks for 4 ± 3 s, after which a monocular visual stimulus appeared in the center of the monitor. If the mice licked at the water spout at any point within 1.5 s of the stimulus appearing, a reward was delivered. In the first few days of training a reward was delivered automatically at 800 ms. Mice quickly learnt the requirements of the task and their reaction times came to precede this automatic reward delivery time. After a few days the automatic reward delivery was disabled. After the stimulus and response window there was a fixed inter-trial period of 7 s before the next ‘withhold’ period was started. We also delivered randomly interleaved catch trials (no visual stimulus) to record the chance rate of licking and assess accuracy in the task.
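The trial structure described above (withhold period, stimulus, 1.5 s response window, fixed inter-trial interval) can be sketched as follows. This is a minimal, hypothetical illustration, not the PyBehaviour implementation; the callables get_lick, give_reward, show_stimulus, now and sleep stand in for the Arduino-backed hardware interface.

```python
import random

def run_trial(get_lick, give_reward, show_stimulus, now, sleep):
    """One trial of the detection task, sketched from the description
    above (hypothetical helper callables, not the actual software).
    Mice must withhold licking for 4 +/- 3 s; a lick within 1.5 s of
    stimulus onset is a hit and triggers reward; a fixed 7 s
    inter-trial interval follows."""
    withhold = random.uniform(1.0, 7.0)       # 4 +/- 3 s withhold period
    t0 = now()
    while now() - t0 < withhold:
        if get_lick():                        # a lick restarts the withhold
            t0 = now()
        sleep(0.01)
    show_stimulus()
    stim_t = now()
    hit = False
    while now() - stim_t < 1.5:               # 1.5 s response window
        if get_lick():
            give_reward()
            hit = True
            break
        sleep(0.01)
    sleep(7.0)                                # fixed inter-trial interval
    return hit
```

On catch trials the same logic would run with show_stimulus doing nothing, so licks in the response window count as false alarms rather than hits.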
Once stable performance was reached, we progressed the mice to a psychophysical variant of the task in which we introduced a range of contrasts (1%, 2%, 5%, 10%, 100%) to assess their perceptual sensitivity and threshold. We found that task performance was insensitive to stimulus location on the monitor. For the final experiment the trial order was pseudo-randomized to ensure that repeats of the same probe types were not immediately consecutive.
Visual stimulation
Visual stimuli were generated using custom software (using PsychoPy, ref. 125). 30° Gabor patches of drifting sinusoidal gratings (8 directions, 0 to 315° in 45° increments) with a spatial frequency of 0.04 cycles per degree and a temporal frequency of 2 Hz were presented on a monitor (typically 51.8 cm width, 32.4 cm height, 15 cm from the animal's left eye, covering up to ±47° of the vertical visual field and ±60° of the horizontal visual field), with a spherical distortion applied to correct perspective errors.
Training
During training the orientation of the stimulus was randomized on every trial and the duration of the stimulus was 1 s. Rewards were delivered if the mouse licked during the response window regardless of the orientation of visual stimulus.
Mapping orientation preference
To map orientation preference of single cells in a FOV with two-photon imaging, the visual stimuli were positioned in the retinotopically appropriate location and were presented in a randomized order with a duration of 3 s, interleaved by 5 s of mean luminance gray. If mice licked at the water spout during this mapping block a water reward was delivered.
Experiment
During the behavioral experiments with photostimulation, the visual stimuli parameters were the same as during training, except only one orientation was presented (to match the photostimulation ensemble’s preference) and the stimulus was positioned in the retinotopically appropriate location for the imaging field of view. A range of contrasts (1%, 2%, 5%, 10%, 100%) with equal trial proportions were presented to the animals.
Widefield imaging
To locate primary visual cortex and position the experimental field of view, widefield GCaMP imaging was performed (usable FOV ~2 × 2 mm). GCaMP6s fluorescence produced by one-photon excitation (470 nm LED, Thorlabs) was collected through a 5x/0.1-NA air objective (Olympus) onto a CMOS camera (Hamamatsu ORCA Flash 4.0, binned image size of 512 × 512 pixels, 20 Hz frame rate). Contrast-reversing checkerboard bars, 10° wide, were drifted vertically and horizontally across a gray screen at a speed of 25°/s in an interleaved sequence. The stimulus-triggered change in fluorescence for the two different stimuli revealed areal borders and allowed identification of primary visual cortex126. This was repeated with two-photon imaging on the day of the experiment to confirm the retinotopic location of the chosen field of view.
Two-photon population imaging
Two-photon imaging was performed with a resonant scanning microscope (Ultima II, Bruker Corporation) using a Chameleon Ultra II laser (Coherent) driven by PrairieView. A 16×/0.8-NA water-immersion objective (Nikon) was used for all experiments. An ETL (Optotune EL-10-30-TC, Gardasoft driver) was used to perform volumetric imaging, spanning a 100 μm range with 33.3 μm spacing between 4 planes. The FOV size was 710 × 710 μm at a resolution of 512 × 512 pixels. The number of cells recorded (ROIs after curation) per experiment ranged from 1266 to 4891 (mean = 2765 ± 995). The per-plane frame rate was 7 Hz (total acquisition rate 30 Hz). GCaMP6s was imaged at 920 nm and mRuby (conjugated to C1V1-Kv2.1) was imaged at 765 nm. Power on the sample was 50 mW at the shallowest plane (~150–200 μm below pia) and increased to ~85 mW at the deepest plane (~250–300 μm below pia), interpolating for intermediate planes, to equalize imaging quality across planes. To maximize imaging quality127 we calculated the tilt of the sample relative to the microscope and then rotated the objective along two axes to be perpendicular to the implanted coverslip window.
Two-photon photostimulation
Two-photon photostimulation was carried out using a fiber laser at 1030 nm (Satsuma, Amplitude Systèmes, 2 MHz repetition rate). The laser beam was split via a reflective spatial light modulator (SLM) (7.68 × 7.68 mm active area, 512 × 512 pixels, OverDrive Plus SLM, Meadowlark Optics/Boulder Nonlinear Systems) installed in line in the photostimulation path (NeuraLight 3D, Bruker Corporation). Phase masks used to generate focused beamlet patterns in the sample were calculated via the weighted Gerchberg-Saxton algorithm. The targets were weighted according to their location relative to the center of the SLM’s addressable FOV to compensate for the decrease in diffraction efficiency when directing beamlets to peripheral positions. We calibrated the targeting of SLM spots in imaging space by using the photostimulation laser to burn arbitrary SLM patterns into a fluorescent plastic slide, before taking a volumetric stack of the sample with the imaging laser. We manually located the burnt spots and computed the corresponding affine transformation from SLM space to imaging space. For 3D stimulation patterns we interpolated the required transformation from the nearest calibrated planes (calibration code: https://github.com/llerussell/SLMTransformMaker3D). To increase stimulation efficiency, we offset the photostimulation FOV with the photostimulation galvanometers such that the center of SLM space was close to the cortical/imaging-space centroid of targeted cells. Spiral photostimulation patterns (3 rotations, 10 μm diameter, 20 ms duration) were generated by moving all beamlets simultaneously with the galvanometer mirrors. The laser power was adjusted to maintain 6 mW per target cell. Photostimulation during the behavioral task was delivered at 20 Hz (1 spiral every 50 ms) for 1 s with the same onset time as the visual stimulus.
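The burnt-spot calibration reduces to fitting an affine map between matched point sets. Below is a least-squares sketch of the 2D (per-plane) case — our own illustration, not the SLMTransformMaker3D code; the function names and example points are hypothetical.

```python
import numpy as np

def fit_affine_2d(slm_pts, img_pts):
    """Fit a 2D affine transform mapping SLM-space points to
    imaging-space points from manually matched burnt-spot coordinates.
    slm_pts, img_pts: (N, 2) arrays of matched coordinates, N >= 3."""
    slm_pts = np.asarray(slm_pts, float)
    img_pts = np.asarray(img_pts, float)
    # Augment with a column of ones so the translation is fitted too.
    A = np.hstack([slm_pts, np.ones((len(slm_pts), 1))])
    # Solve A @ M = img_pts for the 3x2 affine matrix M in a
    # least-squares sense.
    M, *_ = np.linalg.lstsq(A, img_pts, rcond=None)
    return M

def apply_affine_2d(M, pts):
    """Map points through a fitted 3x2 affine matrix."""
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M

# Example: a known rotation/scale/offset is recovered from 6 matched spots.
rng = np.random.default_rng(0)
slm = rng.uniform(0, 512, size=(6, 2))
true_M = np.array([[0.9, 0.1], [-0.1, 0.9], [20.0, -15.0]])
img = np.hstack([slm, np.ones((6, 1))]) @ true_M
M = fit_affine_2d(slm, img)
```

For 3D patterns, per-plane matrices of this form could be interpolated axially, as the text describes.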
Characterization of photostimulation resolution with pharmacological blockade
To characterize the stimulation resolution, we stimulated single cells in separate non-behavioral sessions with the same stimulation parameters as used in the experiments (15 μm spiral, 20 ms spiral duration, repeated every 50 ms (20 Hz) for 1 s). We offset the stimulation spot using the SLM by 0, 5, 10, 15, 20, 30 μm laterally and −75, −50, −25, −10, 0, +10, +25, +50, +75 μm axially (at 0 μm lateral offset). In each animal (n = 3 mice), we stimulated 8 single cells: one at a time every 2 s, for 10 repeats of each stimulation offset for each cell. In these experiments, we used animals implanted with chronic windows with a small access hole drilled in the middle, covered with a silicone plug128. By removing the plug we gained access to the brain surface (dura removed prior to window implantation), to which we could apply pharmacological agents. We applied a mixture of 1 mM NBQX and 2 mM AP5 (in IVE)129,130 to block excitatory synaptic activity to disambiguate off-target stimulation from synaptic recruitment of nearby cells.
Naparm (Near automatic photoactivation response mapping)
To find photostimulation-responsive cells we semi-automatically detected cell locations from expression images and stimulus-triggered average or pixel-correlation images (STA Movie Maker, https://github.com/llerussell/STAMovieMaker), which were fed into Cellpose131 and manually curated. These cell body coordinates were then clustered into equally sized, user-determined groups of between 10 and 50 cells, and the groups were stimulated one by one. The associated phase mask, galvanometer positioning, and Pockels cell control protocol were generated with custom MATLAB software (Naparm, https://github.com/llerussell/Naparm) and executed by the photostimulation modules of the microscope software (PrairieView, Bruker Corporation) and the SLM control software (Blink, Meadowlark). For photo-responsivity mapping purposes, we used a stimulation rate of 20 Hz, for 500 ms per pattern, stimulating a different pattern every 1.5 s, and performed 8–10 trials of each pattern. These data were then analyzed online together with the visual response mapping data to extract activity traces and design stimulation ensembles (see below).
Synchronization
For subsequent synchronization during analysis, analog signals of various trigger lines were recorded with a National Instruments DAQ card, controlled by PackIO132. The recorded inputs included two-photon imaging frame pulses, photostimulation triggers, galvanometer command signals, triggers to and frame flip pulses from the visual stimulus and the SLM phase mask update. Photostimulation trials for the responsivity mapping block were triggered at a fixed rate from an output line on the DAQ card. For the online behavior experiments photostimulation and visual trials were triggered through the behavior software and hardware.
Experimental protocol
On the day of the full experiment, the following protocol was used. First, we located a region of cortex showing optimal coexpression of opsin and indicator, guided by widefield retinotopy, and confirmed the corresponding retinotopic location with two-photon imaging. After determining where to position the visual stimulus on the monitor, we then presented drifting gratings of 8 different orientations while performing two-photon imaging to map orientation preferences of the recorded cells. Rewards were delivered during the visual stimuli if the mouse licked. Next, we photostimulated a large proportion of all cells in the recorded volume to find which ones were photostimulation-responsive. Finally, we designed photostimulation patterns for use in the behavior experiment (see below). We then gave the mice ~10 warm-up ‘easy’ (high contrast) trials before the main behavioral experiment began. We recorded in 20 min blocks, manually correcting for any drift in imaging FOV between recordings.
Online photostimulation ensemble design
To increase the speed of data analysis immediately prior to the experiment, we streamed the raw acquisition samples to custom software (PrairieLink, RawDataStream, https://github.com/llerussell/Bruker_PrairieLink). We used this raw stream to process the pixel samples, construct imaging frames, and perform online motion correction. Processing online allowed us to write directly to a custom file format, making the data immediately available for analysis. Motion-corrected movies were loaded into MATLAB (MathWorks), and traces were extracted from both the photostimulation and the visual stimulation movies, using the photostimulation targets as seed points around which circular ROIs were dilated. We subtracted a neuropil signal from the ROI signal before determining responsivity. We classified cells as photostimulation-responsive if their evoked response (in a ~500 ms window after stimulus offset) to their direct stimulation was >30% ΔF/F on >50% of trials. We classified cells as visually responsive by the same criteria (with a response window of 2 s during the stimulus presentation), additionally specifying their preferred orientation as the stimulus that elicited the largest average response. The final stimulation pattern was then designed in each experiment. After filtering for photostimulation-responsive cells, the co-tuned group was selected, taking the largest group of photostimulation-responsive and orientation-tuned neurons (minimum number of targets: 6, maximum: 73, median: 23).
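The responsivity criterion above (>30% ΔF/F on >50% of trials) is a simple per-cell threshold. A minimal sketch, with the thresholds taken from the text and the function name our own:

```python
import numpy as np

def is_photoresponsive(trial_responses, amp=0.30, frac=0.5):
    """Apply the responsivity criterion described above: a cell counts
    as photostimulation-responsive if its evoked dF/F exceeds `amp`
    (30%) on more than `frac` (50%) of trials.
    trial_responses: per-trial dF/F values for one cell."""
    r = np.asarray(trial_responses, float)
    return bool(np.mean(r > amp) > frac)
```

The same function, with a 2 s response window feeding the per-trial values, would implement the visual-responsivity criterion.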
Pre-processing: Imaging frame registration, ROI segmentation and neuropil correction
For the final analysis, the raw calcium imaging movies were pre-processed using Suite2p133. The pipeline included image registration, segmentation of active regions of interest (ROIs), and extraction of the local surrounding neuropil signal. The final selection of ROIs was filtered semi-automatically using anatomical criteria to include only neuronal somata and discard spurious ROIs. We manually inspected all FOVs to ensure consistent results. We subtracted a neuropil signal from every ROI signal. The contamination of the ROI signal by the neuropil signal depends on many factors, including expression levels, imaging quality, and axial sectioning by the imaging plane. We used robust linear regression to estimate the coefficient of neuropil contamination for each ROI (Supplementary Fig. 9; ref. 68). The slope of this fit was used to scale the neuropil signal before subtraction from the ROI signal, such that after subtraction there was no correlation between the ROI baseline and the neuropil. Neuropil subtraction had minimal effect on the response magnitude, and negative responses to visual- and photo-stimulation were seen even without subtracting the neuropil contamination (Supplementary Figs. 9 and 10).
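The robust slope estimate can be sketched with a Huber-weighted iteratively reweighted least-squares fit. This is our own stand-in — the text does not specify which robust estimator was used — and the tuning constant and function names are assumptions.

```python
import numpy as np

def neuropil_coefficient(roi, neuropil, n_iter=50, k=1.345):
    """Estimate the neuropil contamination slope with Huber-weighted
    IRLS (a generic robust regression, standing in for the unspecified
    estimator). roi, neuropil: 1D fluorescence traces. Calcium
    transients act as outliers and are downweighted, so the slope
    reflects the shared baseline contamination."""
    x = np.asarray(neuropil, float)
    y = np.asarray(roi, float)
    X = np.column_stack([np.ones_like(x), x])   # intercept + slope
    w = np.ones_like(x)
    beta = np.zeros(2)
    for _ in range(n_iter):
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
        resid = y - X @ beta
        s = np.median(np.abs(resid)) / 0.6745 + 1e-12  # robust scale (MAD)
        u = np.abs(resid) / (k * s)
        w = np.where(u <= 1, 1.0, 1.0 / u)             # Huber weights
    return beta[1]  # slope = contamination coefficient

def subtract_neuropil(roi, neuropil):
    """Scale the neuropil trace by the fitted slope and subtract it."""
    alpha = neuropil_coefficient(roi, neuropil)
    return np.asarray(roi, float) - alpha * np.asarray(neuropil, float)
```

On synthetic data where the ROI trace is a known multiple of the neuropil trace plus sparse transients, the fit recovers the contamination coefficient despite the transient outliers.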
ROI exclusion zones
To reduce potential off-target photostimulation and imaging artifacts, when analyzing the network response to photostimulation we excluded all cells within a 30 μm diameter cylinder around each target, extending through all axial planes (see Supplementary Fig. 9). We redefined our target stimulation pattern identities based on the ROIs segmented by Suite2p within the 30 μm lateral disk around each of the SLM target locations. We also excluded ROIs in the first 100 rows of pixels of each imaging frame due to an ETL artifact related to the settle time of the lens when changing planes.
Neuronal response metric
To measure neuronal responses, we extracted the mean fluorescence in a ~500 ms window (4 frames) starting immediately after the photostimulation ended (and/or visual stimulus to ensure comparable measurements) and subtracted the mean fluorescence in the ~1 s baseline (7 frames) before the onset of photostimulation (or visual stimulus). We divided the difference in the means by the mean of the baseline window, resulting in trial-by-trial ΔF/F values. We excluded all photostimulation frames because of the associated artefact contaminating the activity traces; the slow kinetics of GCaMP6s permit this, although the magnitude of response is underestimated as a result.
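The window arithmetic above translates directly into a per-trial ΔF/F computation (frame counts as stated; the function name is ours):

```python
import numpy as np

def trial_dff(trace, stim_on, stim_off):
    """Trial-wise dF/F as described above: mean of the 4-frame
    (~500 ms) window starting immediately after stimulation ends,
    minus the mean of the 7-frame (~1 s) pre-onset baseline, divided
    by the baseline mean. Photostimulation frames (stim_on:stim_off)
    are excluded by construction.
    trace: 1D fluorescence trace; stim_on/stim_off: frame indices."""
    baseline = trace[stim_on - 7:stim_on].mean()
    response = trace[stim_off:stim_off + 4].mean()
    return (response - baseline) / baseline
```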
Hit:miss ratio matching
To counteract motor and reward-related confounds of licks occurring within the neural analysis window, we ensured that the ratio of hits to misses (trials with licks vs trials with no licks) was equalized across comparisons. Ensuring a 50:50 ratio of hits to misses across all trial types in a given session would result in unnecessary loss of data in trials where hits or misses made up the entirety of trial responses. The more relevant control is not across contrasts, but across behavioral states (the more-engaged state versus the less-engaged state) or trial type (with or without photostimulation). We therefore ensured that the proportion of hits to misses at a particular stimulus contrast was equal across a given comparison (state or photostimulation). We determined the minimum number of hits on a given contrast across the conditions and the minimum number of misses on a given contrast across conditions (e.g., at 5% contrast in both states, or at 5% contrast in the less-engaged state with and without photostimulation). We then resampled without replacement, taking the minimum common number of hits and the minimum number of misses from the original trials. As the minimum number is computed across conditions, the resampled collections of trials have the same proportion of hits to misses in both conditions. We then averaged the resampled trials together and repeated this procedure 10,000 times, storing the average result for a given neuron's response in that condition.
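A minimal sketch of this matching procedure for one neuron at one contrast (our own implementation of the steps above; the dict layout and names are assumptions):

```python
import numpy as np

def matched_averages(cond_a, cond_b, n_boot=1000, seed=0):
    """Hit:miss-matched resampled averages for two conditions.
    cond_a/cond_b: dicts {'hit': array, 'miss': array} of one neuron's
    per-trial responses at one contrast in two conditions (e.g., the
    two states, or with/without photostimulation). Resamples the
    common minimum number of hits and misses from each condition
    without replacement, so both averages share the same hit:miss
    ratio, then averages over resamples."""
    rng = np.random.default_rng(seed)
    n_hit = min(len(cond_a['hit']), len(cond_b['hit']))
    n_miss = min(len(cond_a['miss']), len(cond_b['miss']))
    out = []
    for cond in (cond_a, cond_b):
        hit = np.asarray(cond['hit'], float)
        miss = np.asarray(cond['miss'], float)
        means = [np.concatenate([rng.choice(hit, n_hit, replace=False),
                                 rng.choice(miss, n_miss, replace=False)]).mean()
                 for _ in range(n_boot)]
        out.append(float(np.mean(means)))
    return out
```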
Contrast response similarity metric
To functionally characterize all recorded neurons relative to the photostimulated neurons, we compared their responses to different contrasts of the visual stimulus (only one orientation was used) presented in the final behavioral experiments. As our functional similarity measurement (correlation of average responses to visual stimuli) and photostimulation mediated change metric (average difference in responses between trials with visual stimuli and photostimulation and trials with just visual stimuli) would share the same datapoints we cross-validated. Specifically, we randomly split the dataset in half and used one half for the contrast similarity measurement and the other half for the measurement of photostimulation induced changes. For the contrast similarity metric in a given session we averaged together the photostimulated population to give a group average response curve. We then computed the Pearson’s correlation of all single cell contrast response curves with the photostimulated population average curve. For the photostimulation mediated change in activity of a given cell we averaged together responses in all trials of a particular contrast and stimulation condition. We subtracted the visual only trial responses from the joint visual and photostimulation trial responses. We binned all neurons per session into 20 evenly spaced bins ranging from maximally dissimilar to maximally similar to the target group. We computed the average response within each similarity group. We performed all analyses in Figs. 3h, i and 4 with these cross-validated metrics, repeating the procedure 5000 times with different random trial splits, computing the metric of interest within each split and reporting the average across splits.
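The similarity metric can be sketched as follows — a minimal illustration of the trial-splitting and correlation steps, assuming a simple dict-of-arrays layout for the visual-only trials (the function name and data layout are ours):

```python
import numpy as np

def contrast_similarity(responses, target_ids, seed=0):
    """Cross-validated contrast-response similarity, as described
    above. `responses`: dict {contrast: (n_trials, n_cells) array} of
    visual-only trial responses. Half of the trials at each contrast
    build the contrast-response curves (the other half would be
    reserved for measuring photostimulation-induced changes).
    Returns the Pearson correlation of each cell's curve with the
    average curve of the photostimulated (target) cells."""
    rng = np.random.default_rng(seed)
    curves = []
    for c in sorted(responses):
        trials = responses[c]
        half = rng.permutation(len(trials))[:len(trials) // 2]
        curves.append(trials[half].mean(axis=0))
    curves = np.array(curves)                 # (n_contrasts, n_cells)
    target_curve = curves[:, target_ids].mean(axis=1)
    # Pearson correlation via z-scored curves
    cz = (curves - curves.mean(0)) / (curves.std(0) + 1e-12)
    tz = (target_curve - target_curve.mean()) / (target_curve.std() + 1e-12)
    return (cz * tz[:, None]).mean(axis=0)
```

In the paper this split is repeated 5000 times with different random halves and the metric averaged across splits; a loop over seeds reproduces that scheme.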
In Supplementary Fig. 7 we also characterize neurons based on their orientation tuning curves (8 different orientations were presented in a short mapping session before the main photostimulation experiment) and computed the Pearson’s correlation of those single cell tuning curves to the photostimulated population average tuning curve.
Neural-behavioral coupling
To obtain bootstrapped statistics of the neural-behavioral coupling we performed the following shuffling and resampling procedures. We split the network into functional similarity bins and computed the cross-validated change in activity (see above). For each similarity bin, we correlated the average response within the group, pooled across sessions, on the 3 intermediate contrasts (2, 5, 10%) with the change in d-prime on the same contrasts. Each similarity group from each session thus contributes 3 datapoints to the neural-behavioral coupling measurements. This correlation coefficient is termed the neural-behavioral coupling. We then performed a linear regression on the neural-behavioral couplings as a function of functional similarity to summarize the overall effect, whereby the most similar neurons have the strongest coupling with behavior. We bootstrapped the fit of neural-behavioral coupling against functional similarity group by resampling sessions with replacement, 5000 times. We report the mean slope obtained through this resampling procedure. To ask whether the resampled distribution of slopes differs from that expected by chance given the dataset, we performed a shuffling procedure to derive a null distribution against which to compare. For this, we shuffled trial contrast identity amongst the intermediate contrasts, mixing behavioral and neural responses across trials but maintaining the overall statistics within sessions. We repeated this shuffle 5000 times and within each shuffle we computed cross-validated contrast similarity and photostimulation response metrics. We averaged these responses across shuffles to obtain the final shuffled behavioral and neural responses. With these shuffled responses we performed the same neural-behavioral coupling procedure to generate a null distribution of slopes expected by chance.
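The session-level bootstrap of the regression slope can be sketched as follows (our own minimal implementation of the resampling step; the input layout is an assumption):

```python
import numpy as np

def bootstrap_slope(similarity, coupling, session_ids, n_boot=5000, seed=0):
    """Bootstrap the linear-regression slope of neural-behavioral
    coupling against functional-similarity bin by resampling sessions
    with replacement, as described above.
    similarity, coupling: per-(session, bin) values; session_ids:
    matching session label for each value. Returns the mean slope."""
    rng = np.random.default_rng(seed)
    similarity = np.asarray(similarity, float)
    coupling = np.asarray(coupling, float)
    session_ids = np.asarray(session_ids)
    sessions = np.unique(session_ids)
    slopes = np.empty(n_boot)
    for b in range(n_boot):
        # Draw sessions with replacement and pool their datapoints.
        pick = rng.choice(sessions, len(sessions), replace=True)
        idx = np.concatenate([np.nonzero(session_ids == s)[0] for s in pick])
        slopes[b] = np.polyfit(similarity[idx], coupling[idx], 1)[0]
    return slopes.mean()
```

Running the same function on contrast-shuffled inputs would generate the null distribution of slopes described above.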
Behavioral session truncation
To ensure we only analyzed periods of the behavioral session where the mice were similarly engaged and motivated, we truncated the session when the rolling average performance (20 trial sliding window) of the ‘easy’ highest contrast trials dropped below 80% of the starting performance.
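The truncation rule can be sketched directly (a minimal implementation under the stated parameters; the function name is ours):

```python
import numpy as np

def truncate_session(easy_outcomes, window=20, frac=0.8):
    """Truncate at the first point where the rolling hit rate over
    `window` easy (highest-contrast) trials drops below `frac` of the
    starting performance, as described above.
    easy_outcomes: 1/0 hit/miss per easy trial, in order.
    Returns the number of easy trials to keep (all, if no drop)."""
    x = np.asarray(easy_outcomes, float)
    if len(x) < window:
        return len(x)
    kernel = np.ones(window) / window
    rolling = np.convolve(x, kernel, mode='valid')  # rolling hit rate
    start = rolling[0]
    below = np.nonzero(rolling < frac * start)[0]
    return len(x) if len(below) == 0 else int(below[0] + window)
```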
Data exclusion criteria
We excluded trials if >50% of photostimulation targets failed to respond on that trial. We also excluded trials if the mice licked early (within the first 150 ms of the presentation of the visual stimulus). Whole sessions were excluded if fewer than 10 trials in any trial type remained (the median minimum number of trials per trial type (note each session has 12 trial types) in included sessions = 31 trials (range 10–56)). Out of 32 completed sessions, 3 were excluded because of poor photostimulation efficiency.
Statistical procedures
No statistical methods were used to predetermine sample size. Investigators were not blinded to allocation during experiments and outcome assessment. Summary statistics in the text are reported as mean ± SD unless otherwise indicated. Solid lines and shaded areas or error bars in plots represent mean ± SEM unless otherwise indicated. Statistical tests used are specified in the text and were generally paired, two-tailed and non-parametric. We use the following convention for representing P-values in all figures: P ≥ 0.05 (n.s.), P < 0.05 (*), P < 0.01 (**), P < 0.001 (***).
Pre-trial pupil size
We recorded the right pupil throughout the experiment using a Dalsa Genie Nano-M1280 camera with a Kowa LM50JC lens. The camera frames were acquired at 30 Hz, triggered by pulses from the 2P imaging system. The laser light used for 2P imaging, evident through the back of the pupil, was blocked from the camera using a 900 nm shortpass filter. The animal was illuminated with an 850 nm LED. The visual stimulus monitor provided ambient light. We used DeepLabCut134 to track 11 points: 6 around the circumference of the pupil, 4 around the eyelid, 1 on the nose and 1 on the tongue (only visible when the animal licked). The tongue point allowed synchronization with the imaging and behavioral data by cross-correlating the DeepLabCut tongue signal with the electrically recorded lickometer signal. The area bounded by the 6 pupil points was used as the pupil size. We normalized the pupil size by the median size across the whole session.
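The area bounded by the tracked pupil points can be computed with the shoelace formula — a standard choice we assume here, as the text does not name the method:

```python
import numpy as np

def pupil_area(points):
    """Area enclosed by the tracked pupil points (shoelace formula).
    points: (x, y) pairs ordered around the pupil circumference."""
    x, y = np.asarray(points, float).T
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
```

Dividing the per-frame areas by their session median gives the normalized pupil size used above.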
Pre-trial synchrony
To compute the network synchrony prior to presentation of the visual stimulus, we used deconvolved activity traces (OASIS133,135) smoothed with a Gaussian filter (sigma = 0.5 s). We used a 4 s window immediately prior to the initiation of the trial (delivery of a stimulus, if not a catch trial) as the ‘pre-trial’ period. We then computed pairwise Pearson’s correlations between all cells within this time window and averaged together all pairwise correlation coefficients across all cells (including targets) to give the total network correlation or synchrony. We z-scored all network correlations within animal and across all trial types to facilitate across animal comparisons. When comparing hit and miss trials, we resampled 10,000 times to match trial numbers.
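A minimal sketch of the synchrony computation (our own implementation; the Gaussian kernel width in frames depends on the acquisition rate and is an assumption here):

```python
import numpy as np

def network_synchrony(deconv, sigma_frames=3):
    """Mean pairwise Pearson correlation across all cells in the
    pre-trial window, after Gaussian smoothing of deconvolved
    activity, as described above.
    deconv: (n_cells, n_frames) deconvolved activity in the 4 s window."""
    # Gaussian smoothing kernel (sigma in frames; sigma = 0.5 s is
    # ~3.5 frames at the 7 Hz per-plane rate).
    t = np.arange(-3 * sigma_frames, 3 * sigma_frames + 1)
    kernel = np.exp(-t**2 / (2 * sigma_frames**2))
    kernel /= kernel.sum()
    smoothed = np.array([np.convolve(c, kernel, mode='same') for c in deconv])
    corr = np.corrcoef(smoothed)
    # Average the off-diagonal pairwise correlation coefficients.
    off_diag = corr[~np.eye(len(corr), dtype=bool)]
    return off_diag.mean()
```

Z-scoring this value within animal across trial types, as described above, makes sessions comparable.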
Pre-trial state
We used both the pupil size and neural synchrony to produce a combined measure of state trial-by-trial. To compute this, we z-scored both variables and summed them, weighting the pupil by −1 to account for the inverse relationship between the two variables. Based on this combined state score, we median-split each session into two equally sized subsets, corresponding to two behavioral states.
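The combined score and median split reduce to a few lines. Which half of the split is labeled ‘more engaged’ (large pupil, low synchrony) is our reading of the sign convention, so treat the labeling as an assumption:

```python
import numpy as np

def engaged_trials(pupil, synchrony):
    """Combined trial-by-trial state score as described above: z-score
    both variables, weight pupil by -1, sum, and median-split.
    Returns a boolean mask over trials; True marks the
    larger-pupil/lower-synchrony half (assumed 'more engaged')."""
    z = lambda v: (np.asarray(v, float) - np.mean(v)) / np.std(v)
    score = -1 * z(pupil) + z(synchrony)
    return score < np.median(score)   # below median = larger pupil, lower synchrony
```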
Psychometric curve fitting
We used Psignifit136 to fit a Weibull curve fixing the lambda (lapse rate) and gamma (guess rate) parameters, while allowing the estimation of alpha and beta (threshold and slope). The threshold of the curve is defined as the stimulus contrast where behavioral performance is 50%. The width of the curve is defined as the difference in stimulus units between 5% and 95% behavioral performance.
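The threshold and width definitions follow from the Weibull form. A sketch with illustrative parameter values (the fixed gamma and lambda values and the alpha/beta examples below are ours, not the fitted ones, and this is not the Psignifit implementation):

```python
import numpy as np

def weibull_pf(c, alpha, beta, gamma=0.5, lam=0.02):
    """Weibull psychometric function with fixed guess (gamma) and
    lapse (lam) rates; alpha and beta are the fitted threshold and
    slope parameters. Example rate values are illustrative only."""
    return gamma + (1 - gamma - lam) * (1 - np.exp(-(c / alpha) ** beta))

def inv_weibull(p, alpha, beta):
    """Contrast at which the underlying Weibull sigmoid reaches level p."""
    return alpha * (-np.log(1 - p)) ** (1 / beta)

# Threshold = contrast at the 50% performance level of the sigmoid;
# width = contrast difference between the 5% and 95% levels.
threshold = inv_weibull(0.5, alpha=5.0, beta=2.0)
width = inv_weibull(0.95, 5.0, 2.0) - inv_weibull(0.05, 5.0, 2.0)
```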
We computed d-prime for each stimulus type to control for chance rate of licking within a session using the catch trials without photostimulation as the false alarm rate. We also observe the same results when computing d-prime using the catch trials with photostimulation as the false alarm rate (Supplementary Fig. 4).
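The d-prime computation with a catch-trial false-alarm rate can be sketched as follows. The log-linear correction for hit or false-alarm rates of exactly 0 or 1 is our choice — the text does not state how extreme rates were handled:

```python
from statistics import NormalDist

def d_prime(n_hits, n_stim, n_fa, n_catch):
    """d-prime for one stimulus type using the catch-trial
    false-alarm rate, as described above. Applies a log-linear
    (add 0.5) correction so rates of 0 or 1 give finite z-scores
    (the correction is an assumption, not from the text)."""
    z = NormalDist().inv_cdf           # inverse standard normal CDF
    hit_rate = (n_hits + 0.5) / (n_stim + 1)
    fa_rate = (n_fa + 0.5) / (n_catch + 1)
    return z(hit_rate) - z(fa_rate)
```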
Reporting summary
Further information on research design is available in the Nature Portfolio Reporting Summary linked to this article.
Data availability
Datasets supporting the findings of this study are available from the corresponding author upon request. Source data are provided with this paper.
Code availability
Custom code used for data acquisition, photostimulation control, behavioral training and analysis has been deposited online: Naparm (https://github.com/llerussell/Naparm, https://doi.org/10.5281/zenodo.10449686); PyBehaviour (https://github.com/llerussell/PyBehaviour, https://doi.org/10.5281/zenodo.10449684); 3D SLM calibration (https://github.com/llerussell/SLMTransformMaker3D, https://doi.org/10.5281/zenodo.10449682); STAMovieMaker (https://github.com/llerussell/STAMovieMaker, https://doi.org/10.5281/zenodo.10449680); RawDataStream (https://github.com/llerussell/Bruker_PrairieLink, https://doi.org/10.5281/zenodo.10449690); objective rotation (https://github.com/llerussell/MONPangle, https://doi.org/10.5281/zenodo.10449688).
References
McCormick, D. A., Nestvogel, D. B. & He, B. J. Neuromodulation of brain state and behavior. Annu Rev. Neurosci. 43, 391–415 (2020).
Raymond, J. E. Attentional modulation of visual motion perception. Trends Cogn. Sci. 4, 42–50 (2000).
Harris, K. D. & Thiele, A. Cortical state and attention. Nat. Rev. Neurosci. 12, 509–523 (2011).
McGinley, M. J. et al. Waking state: rapid variations modulate neural and behavioral responses. Neuron 87, 1143–1161 (2015).
Treue, S. Neural correlates of attention in primate visual cortex. Trends Neurosci. 24, 295–300 (2001).
Lee, S. H. & Dan, Y. Neuromodulation of brain states. Neuron 76, 209–222 (2012).
Sara, S. J. The locus coeruleus and noradrenergic modulation of cognition. Nat. Rev. Neurosci. 10, 211–223 (2009).
Joshi, S., Li, Y., Kalwani, R. M. & Gold, J. I. Relationships between pupil diameter and neuronal activity in the locus coeruleus, colliculi, and cingulate cortex. Neuron 89, 221–234 (2016).
Hess, E. H. & Polt, J. M. Pupil size in relation to mental activity during simple problem-solving. Science 143, 1190–1192 (1964).
McCormick, D. A., McGinley, M. J. & Salkoff, D. B. Brain state dependent activity in the cortex and thalamus. Curr. Opin. Neurobiol. 31, 133–140 (2015).
Goard, M. & Dan, Y. Basal forebrain activation enhances cortical coding of natural scenes. Nat. Neurosci. 12, 1444–1449 (2009).
Niell, C. M. & Stryker, M. P. Modulation of visual responses by behavioral state in mouse visual cortex. Neuron 65, 472–479 (2010).
Vinck, M., Batista-Brito, R., Knoblich, U. & Cardin, J. A. Arousal and locomotion make distinct contributions to cortical activity patterns and visual encoding. Neuron 86, 740–754 (2015).
Polack, P. O., Friedman, J. & Golshani, P. Cellular mechanisms of brain state-dependent gain modulation in visual cortex. Nat. Neurosci. 16, 1331–1339 (2013).
Britten, K. H., Shadlen, M. N., Newsome, W. T. & Movshon, J. A. The analysis of visual motion: a comparison of neuronal and psychophysical performance. J. Neurosci. 12, 4745–4765 (1992).
Chen, Y. et al. Task difficulty modulates the activity of specific neuronal populations in primary visual cortex. Nat. Neurosci. 11, 974–982 (2008).
Boudreau, C. E., Williford, T. H. & Maunsell, J. H. Effects of task difficulty and target likelihood in area V4 of macaque monkeys. J. Neurophysiol. 96, 2377–2387 (2006).
Assad, J. A. Neural coding of behavioral relevance in parietal cortex. Curr. Opin. Neurobiol. 13, 194–197 (2003).
Cohen, M. R. & Maunsell, J. H. Attention improves performance primarily by reducing interneuronal correlations. Nat. Neurosci. 12, 1594–1600 (2009).
Salzman, C. D., Britten, K. H. & Newsome, W. T. Cortical microstimulation influences perceptual judgements of motion direction. Nature 346, 174–177 (1990).
Salzman, C. D., Murasugi, C. M., Britten, K. H. & Newsome, W. T. Microstimulation in visual area MT: effects on direction discrimination performance. J. Neurosci. 12, 2331–2355 (1992).
Seidemann, E., Zohary, E. & Newsome, W. T. Temporal gating of neural signals during performance of a visual discrimination task. Nature 394, 72–75 (1998).
DeAngelis, G. C., Cumming, B. G. & Newsome, W. T. Cortical area MT and the perception of stereoscopic depth. Nature 394, 677–680 (1998).
Ditterich, J., Mazurek, M. E. & Shadlen, M. N. Microstimulation of visual cortex affects the speed of perceptual decisions. Nat. Neurosci. 6, 891–898 (2003).
Afraz, S. R., Kiani, R. & Esteky, H. Microstimulation of inferotemporal cortex influences face categorization. Nature 442, 692–695 (2006).
Cicmil, N. & Krug, K. Playing the electric light orchestra–how electrical stimulation of visual cortex elucidates the neural basis of perception. Philos. Trans. R. Soc. Lond. B Biol. Sci. 370, 20140206 (2015).
Romo, R., Hernandez, A., Zainos, A. & Salinas, E. Somatosensory discrimination based on cortical microstimulation. Nature 392, 387–390 (1998).
Murphey, D. K. & Maunsell, J. H. Behavioral detection of electrical microstimulation in different cortical visual areas. Curr. Biol. 17, 862–867 (2007).
Bartlett, J. R. & Doty, R. W. An exploration of the ability of macaques to detect microstimulation of striate cortex. Acta Neurobiol. Exp. 40, 713–727 (1980).
Houweling, A. R. & Brecht, M. Behavioural report of single neuron stimulation in somatosensory cortex. Nature 451, 65–68 (2008).
Huber, D. et al. Sparse optical microstimulation in barrel cortex drives learned behaviour in freely moving mice. Nature 451, 61–64 (2008).
Sachidhanandam, S., Sreenivasan, V., Kyriakatos, A., Kremer, Y. & Petersen, C. C. Membrane potential correlates of sensory perception in mouse barrel cortex. Nat. Neurosci. 16, 1671–1677 (2013).
Histed, M. H. & Maunsell, J. H. Cortical neural populations can guide behavior by integrating inputs linearly, independent of synchrony. Proc. Natl Acad. Sci. USA 111, E178–E187 (2014).
Dalgleish, H. W. et al. How many neurons are sufficient for perception of cortical activity? eLife 9, https://doi.org/10.7554/eLife.58889 (2020).
Gill, J. V. et al. Precise holographic manipulation of olfactory circuits reveals coding features determining perceptual detection. Neuron 108, 382–393.e5 (2020).
Daie, K., Svoboda, K. & Druckmann, S. Targeted photostimulation uncovers circuit motifs supporting short-term memory. Nat. Neurosci., https://doi.org/10.1038/s41593-020-00776-3 (2021).
Carrillo-Reid, L., Han, S., Yang, W., Akrouh, A. & Yuste, R. Controlling visually guided behavior by holographic recalling of cortical ensembles. Cell, https://doi.org/10.1016/j.cell.2019.05.045 (2019).
Marshel, J. H. et al. Cortical layer-specific critical dynamics triggering perception. Science 365, https://doi.org/10.1126/science.aaw5202 (2019).
Robinson, N. T. M. et al. Targeted activation of hippocampal place cells drives memory-guided spatial behavior. Cell 183, 2041–2042 (2020).
Tanke, N., Borst, J. G. G. & Houweling, A. R. Single-cell stimulation in barrel cortex influences psychophysical detection performance. J. Neurosci. 38, 2057–2068 (2018).
Lee, S. H. et al. Activation of specific interneurons improves V1 feature selectivity and visual perception. Nature 488, 379–383 (2012).
Glickfeld, L. L., Histed, M. H. & Maunsell, J. H. Mouse primary visual cortex is used to detect both orientation and contrast changes. J. Neurosci. 33, 19416–19422 (2013).
Resulaj, A., Ruediger, S., Olsen, S. R. & Scanziani, M. First spikes in visual cortex enable perceptual discrimination. eLife 7, https://doi.org/10.7554/eLife.34044 (2018).
Goldbach, H. C., Akitake, B., Leedy, C. E. & Histed, M. H. Performance in even a simple perceptual task depends on mouse secondary visual areas. eLife 10, https://doi.org/10.7554/eLife.62156 (2021).
Jin, M. & Glickfeld, L. L. Mouse higher visual areas provide both distributed and specialized contributions to visually guided behaviors. Curr. Biol. 30, 4682–4692.e7 (2020).
Lashley, K. S. Mass action in cerebral function. Science 73, 245–254 (1931).
Schneider, G. E. Two visual systems. Science 163, 895–902 (1969).
Dean, P. Visual acuity in hooded rats: effects of superior collicular or posterior neocortical lesions. Brain Res. 156, 17–31 (1978).
Dean, P. Grating detection and visual acuity after lesions of striate cortex in hooded rats. Exp. Brain Res. 43, 145–153 (1981).
Hong, Y. K., Lacefield, C. O., Rodgers, C. C. & Bruno, R. M. Sensation, movement and learning in the absence of barrel cortex. Nature 561, 542–546 (2018).
Wolff, S. B. & Olveczky, B. P. The promise and perils of causal circuit manipulations. Curr. Opin. Neurobiol. 49, 84–94 (2018).
Otchy, T. M. et al. Acute off-target effects of neural circuit manipulations. Nature 528, 358–363 (2015).
Rickgauer, J. P. & Tank, D. W. Two-photon excitation of channelrhodopsin-2 at saturation. Proc. Natl Acad. Sci. USA 106, 15025–15030 (2009).
Papagiakoumou, E. et al. Scanless two-photon excitation of channelrhodopsin-2. Nat. Methods 7, 848–854 (2010).
Andrasfalvy, B. K., Zemelman, B. V., Tang, J. & Vaziri, A. Two-photon single-cell optogenetic control of neuronal activity by sculpted light. Proc. Natl Acad. Sci. USA 107, 11981–11986 (2010).
Packer, A. M. et al. Two-photon optogenetics of dendritic spines and neural circuits. Nat. Methods 9, 1202–1205 (2012).
Prakash, R. et al. Two-photon optogenetic toolbox for fast inhibition, excitation and bistable modulation. Nat. Methods 9, 1171–1179 (2012).
Rickgauer, J. P., Deisseroth, K. & Tank, D. W. Simultaneous cellular-resolution optical perturbation and imaging of place cell firing fields. Nat. Neurosci. 17, 1816–1824 (2014).
Packer, A. M., Russell, L. E., Dalgleish, H. W. & Häusser, M. Simultaneous all-optical manipulation and recording of neural circuit activity with cellular resolution in vivo. Nat. Methods 12, 140–146 (2015).
Carrillo-Reid, L., Yang, W., Bando, Y., Peterka, D. S. & Yuste, R. Imprinting and recalling cortical ensembles. Science 353, 691–694 (2016).
Dal Maschio, M., Donovan, J. C., Helmbrecht, T. O. & Baier, H. Linking neurons to network function and behavior by two-photon holographic optogenetics and volumetric imaging. Neuron 94, 774–789.e5 (2017).
Yang, W., Carrillo-Reid, L., Bando, Y., Peterka, D. S. & Yuste, R. Simultaneous two-photon imaging and two-photon optogenetics of cortical circuits in three dimensions. eLife 7, https://doi.org/10.7554/eLife.32671 (2018).
Forli, A. et al. Two-photon bidirectional control and imaging of neuronal excitability with high spatial resolution in vivo. Cell Rep. 22, 3087–3098 (2018).
Mardinly, A. R. et al. Precise multimodal optical control of neural ensemble activity. Nat. Neurosci. 21, 881–893 (2018).
Vladimirov, N. et al. Brain-wide circuit interrogation at the cellular level guided by online analysis of neuronal function. Nat. Methods 15, 1117–1125 (2018).
Chen, I. W. et al. In vivo submillisecond two-photon optogenetics with temporally focused patterned light. J. Neurosci. 39, 3484–3497 (2019).
Russell, L. E. et al. All-optical interrogation of neural circuits in behaving mice. Nat. Protoc. 17, 1579–1620 (2022).
Chen, T. W. et al. Ultrasensitive fluorescent proteins for imaging neuronal activity. Nature 499, 295–300 (2013).
Madisen, L. et al. Transgenic mice for intersectional targeting of neural sensors and effectors with high specificity and performance. Neuron 85, 942–958 (2015).
Wekselblatt, J. B., Flister, E. D., Piscopo, D. M. & Niell, C. M. Large-scale imaging of cortical dynamics during sensory perception and behavior. J. Neurophysiol. 115, 2852–2866 (2016).
Yizhar, O. et al. Neocortical excitation/inhibition balance in information processing and social dysfunction. Nature 477, 171–178 (2011).
Chettih, S. N. & Harvey, C. D. Single-neuron perturbations reveal feature-specific competition in V1. Nature 567, 334–340 (2019).
Reimer, J. et al. Pupil fluctuations track rapid changes in adrenergic and cholinergic activity in cortex. Nat. Commun. 7, 13289 (2016).
Reimer, J. et al. Pupil fluctuations track fast switching of cortical states during quiet wakefulness. Neuron 84, 355–362 (2014).
Beaman, C. B., Eagleman, S. L. & Dragoi, V. Sensory coding accuracy and perceptual performance are improved during the desynchronized cortical state. Nat. Commun. 8, 1308 (2017).
Speed, A., Del Rosario, J., Burgess, C. P. & Haider, B. Cortical state fluctuations across layers of V1 during visual spatial perception. Cell Rep. 26, 2868–2874.e3 (2019).
Jacobs, E. A. K., Steinmetz, N. A., Peters, A. J., Carandini, M. & Harris, K. D. Cortical state fluctuations during sensory decision making. Curr. Biol. 30, 4944–4955.e7 (2020).
Pinto, L. et al. Fast modulation of visual perception by basal forebrain cholinergic neurons. Nat. Neurosci. 16, 1857–1863 (2013).
Lee, C. C. Y., Kheradpezhouh, E., Diamond, M. E. & Arabzadeh, E. State-dependent changes in perception and coding in the mouse somatosensory cortex. Cell Rep. 32, 108197 (2020).
Carandini, M. & Churchland, A. K. Probing perceptual decisions in rodents. Nat. Neurosci. 16, 824–831 (2013).
Bennett, C., Arroyo, S. & Hestrin, S. Subthreshold mechanisms underlying state-dependent modulation of visual responses. Neuron 80, 350–357 (2013).
Packer, A. M. & Yuste, R. Dense, unspecific connectivity of neocortical parvalbumin-positive interneurons: a canonical microcircuit for inhibition? J. Neurosci. 31, 13260–13271 (2011).
Jouhanneau, J. S., Kremkow, J. & Poulet, J. F. A. Single synaptic inputs drive high-precision action potentials in parvalbumin expressing GABA-ergic cortical neurons in vivo. Nat. Commun. 9, 1540 (2018).
Hofer, S. B. et al. Differential connectivity and response dynamics of excitatory and inhibitory neurons in visual cortex. Nat. Neurosci. 14, 1045–1052 (2011).
Kapfer, C., Glickfeld, L. L., Atallah, B. V. & Scanziani, M. Supralinear increase of recurrent inhibition during sparse activity in the somatosensory cortex. Nat. Neurosci. 10, 743–753 (2007).
Fino, E. & Yuste, R. Dense inhibitory connectivity in neocortex. Neuron 69, 1188–1203 (2011).
Silberberg, G. & Markram, H. Disynaptic inhibition between neocortical pyramidal cells mediated by Martinotti cells. Neuron 53, 735–746 (2007).
Yoshimura, Y., Dantzker, J. L. & Callaway, E. M. Excitatory cortical neurons form fine-scale functional networks. Nature 433, 868–873 (2005).
Haider, B., Häusser, M. & Carandini, M. Inhibition dominates sensory responses in the awake cortex. Nature 493, 97–100 (2013).
Ko, H. et al. Functional specificity of local synaptic connections in neocortical networks. Nature 473, 87–91 (2011).
Lee, W. C. et al. Anatomy and function of an excitatory network in the visual cortex. Nature 532, 370–374 (2016).
Harris, K. D. & Mrsic-Flogel, T. D. Cortical connectivity and sensory coding. Nature 503, 51–58 (2013).
Carrasco, M. Visual attention: the past 25 years. Vis. Res. 51, 1484–1525 (2011).
Ito, S. & Feldheim, D. A. The mouse superior colliculus: an emerging model for studying circuit formation and function. Front. Neural Circ. 12, 10 (2018).
Oliveira, A. F. & Yonehara, K. The mouse superior colliculus as a model system for investigating cell type-based mechanisms of visual motor transformation. Front. Neural Circ. 12, 59 (2018).
Sprague, J. M. Interaction of cortex and superior colliculus in mediation of visually guided behavior in the cat. Science 153, 1544–1547 (1966).
Froudarakis, E. et al. The visual cortex in context. Annu Rev. Vis. Sci. 5, 317–339 (2019).
Zhao, X., Liu, M. & Cang, J. Visual cortex modulates the magnitude but not the selectivity of looming-evoked responses in the superior colliculus of awake mice. Neuron 84, 202–213 (2014).
Liang, F. et al. Sensory cortical control of a visually induced arrest behavior via corticotectal projections. Neuron 86, 755–767 (2015).
Zingg, B. et al. AAV-mediated anterograde transsynaptic tagging: mapping corticocollicular input-defined neural pathways for defense behaviors. Neuron 93, 33–47 (2017).
Ruediger, S. & Scanziani, M. Learning speed and detection sensitivity controlled by distinct cortico-fugal neurons in visual cortex. eLife 9, https://doi.org/10.7554/eLife.59247 (2020).
Sridharan, D., Steinmetz, N. A., Moore, T. & Knudsen, E. I. Does the superior colliculus control perceptual sensitivity or choice bias during attention? Evidence from a multialternative decision framework. J. Neurosci. 37, 480–511 (2017).
Wang, L., McAlonan, K., Goldstein, S., Gerfen, C. R. & Krauzlis, R. J. A causal role for mouse superior colliculus in visual perceptual decision-making. J. Neurosci. 40, 3768–3782 (2020).
Bennett, C. et al. Higher-order thalamic circuits channel parallel streams of visual information in mice. Neuron 102, 477–492.e5 (2019).
Akam, T. & Kullmann, D. M. Oscillatory multiplexing of population codes for selective communication in the mammalian brain. Nat. Rev. Neurosci. 15, 111–122 (2014).
Stitt, I., Zhou, Z. C., Radtke-Schuller, S. & Frohlich, F. Arousal dependent modulation of thalamo-cortical functional interaction. Nat. Commun. 9, 2455 (2018).
McAdams, C. J. & Reid, R. C. Attention modulates the responses of simple cells in monkey primary visual cortex. J. Neurosci. 25, 11023–11033 (2005).
Moran, J. & Desimone, R. Selective attention gates visual processing in the extrastriate cortex. Science 229, 782–784 (1985).
McAdams, C. J. & Maunsell, J. H. Effects of attention on orientation-tuning functions of single neurons in macaque cortical area V4. J. Neurosci. 19, 431–441 (1999).
van Vugt, B. et al. The threshold for conscious report: Signal loss and response bias in visual and frontal cortex. Science 360, 537–542 (2018).
Znamenskiy, P. et al. Functional selectivity and specific connectivity of inhibitory neurons in primary visual cortex. bioRxiv, 294835, https://doi.org/10.1101/294835 (2018).
Cai, B. et al. Modeling robust and efficient coding in the mouse primary visual cortex using computational perturbations. bioRxiv, https://doi.org/10.1101/2020.04.21.051268 (2020).
Tsodyks, M. V., Skaggs, W. E., Sejnowski, T. J. & McNaughton, B. L. Paradoxical effects of external modulation of inhibitory interneurons. J. Neurosci. 17, 4382–4388 (1997).
Olshausen, B. A. & Field, D. J. Sparse coding of sensory inputs. Curr. Opin. Neurobiol. 14, 481–487 (2004).
Ozeki, H., Finn, I. M., Schaffer, E. S., Miller, K. D. & Ferster, D. Inhibitory stabilization of the cortical network underlies visual surround suppression. Neuron 62, 578–592 (2009).
Ahmadian, Y., Rubin, D. B. & Miller, K. D. Analysis of the stabilized supralinear network. Neural Comput. 25, 1994–2037 (2013).
Rubin, D. B., Van Hooser, S. D. & Miller, K. D. The stabilized supralinear network: a unifying circuit motif underlying multi-input integration in sensory cortex. Neuron 85, 402–417 (2015).
Sadeh, S. & Clopath, C. Inhibitory stabilization and cortical computation. Nat. Rev. Neurosci. 22, 21–37 (2021).
Knudsen, E. I. Neural circuits that mediate selective attention: a comparative perspective. Trends Neurosci. 41, 789–805 (2018).
Quintana, D. et al. Dissociating instructive from permissive roles of brain circuits with reversible neural activity manipulations. bioRxiv, https://doi.org/10.1101/2023.05.11.540397 (2023).
Steinmetz, N. A., Zatka-Haas, P., Carandini, M. & Harris, K. D. Distributed coding of choice, action and engagement across the mouse brain. Nature 576, 266–273 (2019).
Allen, W. E. et al. Thirst regulates motivated behavior through modulation of brainwide neural population dynamics. Science 364, 253 (2019).
Urai, A. E., Doiron, B., Leifer, A. M. & Churchland, A. K. Large-scale neural recordings call for new insights to link brain and behavior. Nat. Neurosci. 25, 11–19 (2022).
Steinmetz, N. A. et al. Aberrant cortical activity in multiple GCaMP6-expressing transgenic mouse lines. eNeuro 4, https://doi.org/10.1523/ENEURO.0207-17.2017 (2017).
Peirce, J. W. Generating stimuli for neuroscience using PsychoPy. Front. Neuroinform. 2, 10 (2008).
Marshel, J. H., Garrett, M. E., Nauhaus, I. & Callaway, E. M. Functional specialization of seven mouse visual cortical areas. Neuron 72, 1040–1054 (2011).
Galinanes, G. L. et al. Optical alignment device for two-photon microscopy. Biomed. Opt. Express 9, 3624–3639 (2018).
Roome, C. J. & Kuhn, B. Chronic cranial window with access port for repeated cellular manipulations, drug application, and electrophysiology. Front. Cell Neurosci. 8, 379 (2014).
Chadderton, P., Margrie, T. W. & Häusser, M. Integration of quanta in cerebellar granule cells during sensory processing. Nature 428, 856–860 (2004).
Mateo, C. et al. In vivo optogenetic stimulation of neocortical excitatory neurons drives brain-state-dependent inhibition. Curr. Biol. 21, 1593–1602 (2011).
Stringer, C., Wang, T., Michaelos, M. & Pachitariu, M. Cellpose: a generalist algorithm for cellular segmentation. Nat. Methods 18, 100–106 (2021).
Watson, B. O., Yuste, R. & Packer, A. M. PackIO and EphysViewer: software tools for acquisition and analysis of neuroscience data. bioRxiv, 054080, https://doi.org/10.1101/054080 (2016).
Pachitariu, M. et al. Suite2p: beyond 10,000 neurons with standard two-photon microscopy. bioRxiv, 061507, https://doi.org/10.1101/061507 (2017).
Mathis, A. et al. DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 21, 1281–1289 (2018).
Friedrich, J., Zhou, P. & Paninski, L. Fast online deconvolution of calcium imaging data. PLoS Comput. Biol. 13, e1005423 (2017).
Schütt, H. H., Harmeling, S., Macke, J. H. & Wichmann, F. A. Painfree and accurate Bayesian estimation of psychometric functions for (potentially) overdispersed data. Vis. Res. 122, 105–123 (2016).
Acknowledgements
We thank Matteo Carandini, Beverley Clark, Kenneth Harris, Nick Steinmetz and Arnd Roth for helpful discussions and comments on the manuscript; Brendan Bicknell, Dustin Herrmann and Evelyn Wong for comments on the manuscript; Soyon Chun and Agnieszka Jucht for mouse breeding; Maite Marcantoni and Florence Bui for behavioral training during pilot experiments; Arthur Gretton and Peter Latham for analysis advice; and Bruker Corporation for technical support. This work was supported by grants from the Wellcome Trust (PRF 201225 and 224688), ERC (AdG 695709), MRC (MR/T022922/1) and the BBSRC (BB/N009835/1).
Author information
Contributions
L.E.R. performed surgeries, trained animals, performed experiments, and analyzed data. Z.Y. and L.P.T. trained animals. M.F. and A.M.P. provided valuable advice about experimental design and analysis. L.E.R. built hardware and software for control of behavioral training and for calibration and control of photostimulation. H.W.P.D. helped build photostimulation control software. S.C. and C.D.H. provided C1V1-Kv2.1 virus. L.E.R. and M.H. conceived and designed the study and wrote the manuscript with input from all authors.
Ethics declarations
Competing interests
The authors declare no competing interests.
Peer review
Nature Communications thanks the anonymous reviewers for their contribution to the peer review of this work. A peer review file is available.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Russell, L.E., Fişek, M., Yang, Z. et al. The influence of cortical activity on perception depends on behavioral state and sensory context. Nat Commun 15, 2456 (2024). https://doi.org/10.1038/s41467-024-46484-5