Why characterize attention in depression?

Cognitive dysfunction is included as a diagnostic criterion for major depressive disorder (MDD), described as “Diminished ability to think or concentrate”1. These cognitive problems may include impairments in executive functions, learning and memory, processing speed, as well as in concentration and attention2, and these are associated with a disproportionately poor prognosis in psychosocial and occupational domains. Attention impairments in particular are known to negatively impact daily function3,4 and are associated with poorer clinical outcome5. Although cognitive dysfunction is a hallmark of MDD contributing to disability, it is not well understood, especially in comparison with the mood features of MDD.

Our focus on attention as a specific cognitive domain aligns with the matrix of cognitive neurobiological constructs within the Research Domain Criteria (RDoC) framework, intended to advance a precision medicine approach to psychiatry informed by neurobiology6. The cognitive systems domain of RDoC comprises a construct of attention as well as constructs of perception, declarative memory, language, cognitive control, and working memory7. RDoC also includes both the negative and positive valence systems and associated constructs of emotional function and mood. Thus, the RDoC matrix provides a framework from which to home in on attentional impairments in MDD and to consider how attention may influence other domains of the cognitive and emotional systems. Although it is often assumed that group-level deficits in a variety of cognitive tasks8 imply that depressed individuals experience a “general” cognitive deficit, there is not yet sufficient evidence to suggest that an individual with deficits on one cognitive task will necessarily exhibit deficits on other tasks. Therefore, we focus on parsing impairments in specific sub-domains of goal-directed attention, and outline directions for future research to characterize individual cognitive impairments with further granularity.

In the following review, we home in on dysfunction in attention, which may encompass several sub-domains of patient experience, from increased distractibility (selective attention impairment) to an inability to sustain focus (sustained attention impairment) or an inability to simultaneously monitor multiple channels of information (divided attention impairment). We seek to characterize these specific types of attention impairment in MDD and their neurobiological correlates, informed by current insights from cognitive neuroscience, the state of knowledge about such impairments in MDD, and the mechanisms by which they might develop and impact other areas of cognitive and emotional function. Given that attention and other cognitive dysfunctions in MDD are associated with poor outcomes following treatment with current standard-of-care interventions, we also review potential alternative treatments to address attention impairment with greater precision and consider the case for longitudinal studies.

Domains of goal-directed attention and impairments in MDD

We operationalize top–down attention as it is defined within cognitive neuroscience: “guidance of attention based on prior knowledge, willful plans, and current goals”9. Here, we review comparisons of behavioral performance between MDD patients and healthy controls on three sub-domains of top–down attention (selective, sustained, and divided attention), defined below. In particular, we focus exclusively on studies using neutral stimuli (e.g., arrowheads) to probe top–down attention capabilities, to complement the extensive literature regarding emotionally guided attention and negative attentional biases (discussed separately in the section “Attention and the negative valence system”; for discussion of internally vs. externally guided attention see Supplementary Appendix). Given that MDD is a highly heterogeneous disorder10, we expect that attention impairments may not be present in all depressed individuals, consistent with the observation that not all depressed individuals endorse attention problems as a core symptom. Thus, we take a relatively conservative approach to uncovering attention impairments by reviewing findings at the group level.

Selective attention impairment

Selective attention is the ability to attend to important (task-relevant) information while ignoring distracting (task-irrelevant) information. Over a century ago, this dual process of simultaneously attending and ignoring was proposed to be essential for parsing the overload of input to our perceptual systems11, and a breakdown of these processes would result in a more distracted, overwhelmed, and confused state. Within the cognitive neuroscience literature, selective attention is discussed as a multifaceted construct in and of itself. One important distinction is whether task-relevance is defined by a stimulus’s features or by its spatial location. Feature-based selective attention occurs when one attends to a particular aspect of the available information (e.g., color or shape), while ignoring other, irrelevant features. Spatial selective attention occurs when one must attend to information in a particular area of space (e.g., left or right visual field) while ignoring information in irrelevant locations. Whether these two forms of selective attention rely on dissociable neural mechanisms has been a controversial topic among neuroscientists studying visual attention. Some researchers have argued that feature-based and spatial selective attention involve activity modulation in similar regions12 and result in additive modulation when used together13. Others have demonstrated slight differences in the particular sub-regions of frontal and parietal cortices modulated by these subtypes of selective attention14,15. Clinical studies aimed at understanding the distinct and overlapping behavioral and neural phenotypes of feature-based and spatial selective attention impairments may provide further insight into this key question in cognitive neuroscience in addition to clarifying the specific impairments observed in depression.

Many investigations of color-word reaction times have revealed that unmedicated depressed individuals (including children, adolescents, and adults) perform worse than healthy controls when required to attend to a task-relevant feature (e.g., color) while simultaneously ignoring a distracting feature (e.g., semantic meaning)16,17,18,19, including in an international sample of n = 1008 MDD participants without co-morbid ADHD20. Importantly, these studies measured response times in either the word- or color-naming condition independently to capture feature-based selective attention abilities, rather than measuring inhibitory control using the classic “interference” score calculated as the difference in reaction times between the color-naming and word-naming conditions. More often than not, studies investigating spatial selective attention using the Flanker task (identifying the direction of an arrow flanked by distracting arrows) do not observe a significant difference in performance between unmedicated MDD patients and healthy controls21,22. This might suggest that selective attention impairments in MDD are specific to feature-based selective attention while spatial selective attention remains intact, which could imply that (1) not all of the underlying neural mechanisms are shared between feature-based and spatial selective attention and (2) cognitive impairments in the context of depression may not be as homogeneous as previously thought. However, it is important to note that the Flanker task used to assess spatial attention tends to be much less challenging than color- and word-naming tasks, with performance close to ceiling. Future investigations could use equally difficult versions of feature- and spatial-selective attention tasks to determine whether this deficit is truly specific to the feature-based case.
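To make the measurement distinction above concrete, the following is a minimal sketch using hypothetical reaction time data: raw condition means serve as the feature-based selective attention indices discussed here, whereas the subtractive interference score is the cognitive-control measure excluded from our comparisons. All values are simulated for illustration only.

```python
# Minimal sketch (hypothetical data) contrasting raw Stroop condition reaction
# times (used as indices of feature-based selective attention) with a
# subtractive "interference" score (conventionally treated as a measure of
# inhibitory/cognitive control).
import numpy as np

rng = np.random.default_rng(0)

# Simulated single-subject reaction times in seconds for each condition.
rt_color_naming = rng.normal(loc=0.85, scale=0.10, size=48)  # name ink color, ignore word
rt_word_naming = rng.normal(loc=0.65, scale=0.08, size=48)   # read word, ignore ink color

# Feature-based selective attention indices: mean RT within a single condition.
mean_rt_color = rt_color_naming.mean()
mean_rt_word = rt_word_naming.mean()

# Interference score: difference between the condition means
# (here the slower color-naming condition minus the faster word-naming condition).
interference = mean_rt_color - mean_rt_word

print(f"color-naming RT: {mean_rt_color:.3f} s")
print(f"word-naming RT:  {mean_rt_word:.3f} s")
print(f"interference:    {interference:.3f} s")
```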

Sustained attention impairment

Sustained attention refers to the ability to continuously attend to, or monitor, task-relevant information, usually assessed in the relative absence of distractions (e.g., in a quiet testing room). When sustained attention is impaired, one might report an inability to maintain focus at work or school. To assess the degree to which someone is sustaining attention on a set of stimuli, researchers most commonly present intermittent “oddball” stimuli (e.g., the Continuous Performance Task) and measure the speed of detection. It is assumed that subjects who successfully engage in sustained attention will have faster hits and fewer misses when these oddball stimuli appear. Studies comparing oddball task performance by unmedicated MDD patients versus healthy controls often find that MDD patients respond more slowly than controls23,24,25, suggesting that, even in the absence of overt distraction, attentional focus is impaired. However, other studies in both adolescents and adults fail to find a significant difference in oddball reaction times between MDD patients and healthy controls26,27. Future meta-analyses may reveal whether these differences reflect true inter-subject heterogeneity or measurement variability. Recent work investigating the impact of rumination on cognition in depressed individuals has proposed that rumination may underlie observed deficits in sustained attention by competing for attentional resources28. Similarly, excessive worrying in the context of depression or anxiety may also sap attentional resources. In our own data, we do not observe a significant correlation between selective attention and worrying20, but future work may reveal whether excessive worrying and/or rumination contribute to inter/intra-individual differences in sustained attention deficits.
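As a minimal illustration of the sustained-attention indices described above, the sketch below computes hit rate, miss rate, and mean reaction time to detected oddball targets from hypothetical trial data; the trial structure and values are simulated assumptions, not taken from any cited study.

```python
# Minimal sketch (hypothetical trial data) of oddball / continuous performance
# task indices: hit rate, miss rate, and mean RT to correctly detected targets.
import numpy as np

rng = np.random.default_rng(1)

n_trials = 400
is_oddball = rng.random(n_trials) < 0.10                        # ~10% infrequent targets
responded = np.where(is_oddball,
                     rng.random(n_trials) < 0.90,               # hits on most targets
                     rng.random(n_trials) < 0.02)               # occasional false alarms
rt = np.where(responded, rng.normal(0.45, 0.08, n_trials), np.nan)

hits = is_oddball & responded
misses = is_oddball & ~responded

hit_rate = hits.sum() / is_oddball.sum()
miss_rate = misses.sum() / is_oddball.sum()
mean_hit_rt = np.nanmean(rt[hits])

print(f"hit rate: {hit_rate:.2f}, miss rate: {miss_rate:.2f}, mean hit RT: {mean_hit_rt:.3f} s")
```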

Divided attention impairment

Divided attention refers collectively to the functions of multi-tasking and of simultaneously attending to multiple sources of task-relevant information, which is often critical for efficient daily functioning. One of the first controlled studies of divided attention used a dichotic listening task to demonstrate the inherent difficulty of parsing two different messages delivered to each ear simultaneously29. Although the challenges of divided attention have since been documented extensively30, it was not until lesion studies in the 1990s that the neural basis of these difficulties began to be understood. These early neuroscience studies demonstrated the importance of the prefrontal cortex in attending to multiple sensory modalities simultaneously31, performing multiple tasks simultaneously32, and switching attention among tasks33. In the context of MDD, studies of unmedicated adults have revealed impairments, compared with healthy controls, in performing two tasks simultaneously34 and in simultaneously attending to both auditory and visual stimuli to detect targets35. Better divided attention appears to predict treatment efficacy independently of baseline depression severity36, while poorer divided attention is associated with delayed response and increased risk of relapse5 as well as higher suicidality37.

Attention and cognition

The ability to allocate one’s attention volitionally in selective, sustained, or divided attention contexts is critical for a variety of cognitive tasks. Thus, impairments of goal-directed attention may have downstream effects on other functions, such as cognitive control, perception, and decision-making. Here, we discuss the relationship between top–down attention and other cognitive domains to shed light on their interactions in depression.

Cognitive control

Much work has been done to advance our understanding of how various cognitive control functions may change in the context of MDD (for review see refs. 38,39). According to the RDoC working group40, the broad construct of “cognitive control” encompasses sub-functions, such as the selection and updating of goal representations, response selection and suppression, and performance monitoring. These cognitive control functions are also known in the neurocognitive literature as aspects of executive function, and many of these sub-functions also involve aspects of top–down attention (e.g., re-allocation of attention toward goal-relevant information upon encountering feedback of a performance error). The RDoC working group states that “cognitive control most often requires attentional processes, and thus cognitive control tasks also test attention,” and similarly, Chun et al.41 state that “To the extent that there are limitations in the number of alternatives that can be considered at any given time—and in the even broader set of responses and choices that can be made to these alternatives—cognitive control is intrinsically attentional.” Moreover, cognitive control processes and top–down attention often share overlapping neural circuitry (see the section “Putative mechanisms of attention impairment”).

This close relationship between attention and cognitive control presents a challenge for characterizing specific deficits and their biological substrates in psychiatric illness. To thoroughly examine top–down attention as our construct of interest, we focus exclusively on studies which measure participants’ ability to allocate attention volitionally in selective, sustained, or divided attention contexts (as operationalized in the section “Domains of goal-directed attention”), regardless of whether these functions are referred to in the literature as top–down attention, cognitive control, or executive functioning. In particular, when examining studies of feature-based selective attention using the Stroop task, we limit our review to studies using raw reaction time measurements with either color or word stimuli independently (in accordance with the definition of feature-based selective attention as attending to relevant information while ignoring distraction) and exclude studies which use interference scores calculated as the difference in reaction times between two conditions (conceptualized as a measure of cognitive control). Further discussion of the behavioral and neurobiological distinctions between top–down attention and cognitive control can be found elsewhere42.

Perception

Once considered a bottleneck limiting sensory processing, attention is now appreciated as a critical gating mechanism for sensory perception, helping to form our fluid, organized, conscious experience from the abundant information bombarding our sensory systems. More recently, cognitive psychological studies have characterized the particular ways in which attention can influence perception, from low-level visual features to high-level perceptual judgments (for review see ref. 43). For example, a large body of research has demonstrated that both automatic and voluntary attention to particular visual stimuli increase perceived contrast and color saturation44,45. Higher-level perceptual features such as the attractiveness46 or the intensity of emotional expression in a face47 are also altered by attention. The finding that attention substantially alters the overall appearance of sensory information suggests that attention impairments may impact perception. These findings are in accordance with research at the sensory level, where dramatic differences in retinal contrast gain have been observed between depressed individuals and healthy controls48. Barbot and Carrasco49 recently showed that emotion and trait anxiety moderate the effect of attention on perceived contrast, motivating future studies to provide more detailed characterizations of interactions between depressive symptoms and the effect of attention impairments on perception.

Decision-making

Beyond its profound influence on perception, attention also appears to play a critical role in decision-making. A plethora of research has revealed that we are more likely to choose options that we have attended to for longer, regardless of the subjective value of those options (for review see ref. 50). Indeed, gaze bias appears to both reflect and influence preferences51, and reward-learning in turn affects the allocation of attention, with strong biases toward previously rewarded locations52. Recent advances using computational modeling of behavior have shown that selective attention is a requirement for effective multidimensional reinforcement learning53, and that attention influences both the choices we make and our learning of reward associations over time54. Efforts in computational psychiatry have sought to characterize changes in decision-making processes in the context of depression and have revealed dysfunction in the processes underlying model-based decision-making (for review see ref. 55). Future work may further elucidate the reciprocal interplay of impaired attention and decision-making in depression by leveraging knowledge across disciplines.
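To illustrate the kind of attention-weighted learning referenced above, the following is a simplified sketch, not the cited authors’ exact model, in which attention weights over stimulus dimensions bias both valuation and credit assignment during multidimensional reinforcement learning. The dimensions, parameters, and reward structure are illustrative assumptions.

```python
# Simplified illustration of attention-weighted multidimensional reinforcement
# learning: value estimates and prediction-error updates are weighted by how
# much attention each stimulus dimension receives.
import numpy as np

rng = np.random.default_rng(2)

n_dims, n_features = 3, 3                  # e.g., color, shape, texture; 3 features each
values = np.zeros((n_dims, n_features))    # learned value of each feature
attention = np.ones(n_dims) / n_dims       # attention weights over dimensions (sum to 1)
alpha, eta = 0.3, 5.0                      # learning rate, attention softmax inverse temperature

rewarded_dim, rewarded_feat = 0, 1         # ground truth: one feature predicts reward

for trial in range(300):
    stimulus = rng.integers(0, n_features, size=n_dims)        # one feature per dimension
    v_stim = attention @ values[np.arange(n_dims), stimulus]   # attention-weighted value
    reward = float(stimulus[rewarded_dim] == rewarded_feat)
    delta = reward - v_stim                                     # prediction error
    # Credit assignment is itself gated by attention.
    values[np.arange(n_dims), stimulus] += alpha * attention * delta
    # Attention shifts toward dimensions whose best feature currently looks most valuable.
    attention = np.exp(eta * values.max(axis=1))
    attention /= attention.sum()

print("learned attention weights per dimension:", np.round(attention, 2))
```

Under this scheme, attention gradually concentrates on the reward-predictive dimension, which is the sense in which selective attention makes learning in a multidimensional environment tractable.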

Attention and the negative valence system

Negative attentional biases

Negative attentional biases offer one way to consider how attention (within the cognitive system of RDoC) may interact with processes within the negative valence system. A large body of research, largely utilizing the Dot Probe56 or Emotional Stroop task57, has shown that depressed patients tend to spend more time attending to negative information, such as sad faces, than to neutral or positive information (for review see refs. 58,59). Generally speaking, depressed patients’ attention tends to linger longer on negative information such as sad faces than healthy controls’ attention does, suggesting a negative bias in the way that depressed individuals sample information from their environments. However, given the substantial evidence (reviewed above) that depressed individuals suffer from attention dysfunction in neutral contexts compared with healthy controls, characterizations exclusively focusing on negative biases in attention do not provide a full picture. It is plausible that these negative attentional biases are exacerbated by general impairment of top–down attention allocation (and re-allocation toward goal-relevant information when something has inadvertently caught one’s attention). In other words, depressed individuals may have their attention initially captured by salient negative information60, but the lingering of attention on this negative information may be due to an inability to re-orient attention away from distracting negative information toward goal-relevant information. Given that persistent low mood may be perpetuated61 by negative attentional biases in MDD, breaking this cycle of attention-related biases has the potential to yield substantial improvement in wide-ranging symptomatology and overall quality of life.
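For readers less familiar with how such biases are quantified, the following is a minimal sketch of the conventional dot-probe bias index using hypothetical reaction times; the values and trial counts are assumptions for illustration.

```python
# Minimal sketch (hypothetical data) of a dot-probe attentional bias index:
# mean RT when the probe replaces the neutral stimulus minus mean RT when it
# replaces the negative stimulus. Positive values indicate attention drawn
# toward (or held by) the negative stimulus.
import numpy as np

rng = np.random.default_rng(3)

rt_probe_at_neutral = rng.normal(0.62, 0.07, 40)   # probe appears where the neutral face was
rt_probe_at_negative = rng.normal(0.58, 0.07, 40)  # probe appears where the sad face was

bias_score = rt_probe_at_neutral.mean() - rt_probe_at_negative.mean()
print(f"negative attentional bias: {bias_score * 1000:.0f} ms")
```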

Mood and attention

Conversely, mood states may also influence the ways that we allocate attention, whether consciously or unconsciously. Theories about how attention changes in different mood states have been wide-ranging (for review see ref. 62). Some suggest that positive mood leads to decreased attention and cognitive effort. For example, the “mood-as-input” theory63 postulates that positive mood makes tasks more enjoyable and could render subjects more easily satisfied with lower performance. Other theories suggest that positive mood actually improves attention, while low mood is associated with worse attention. For example, the “broaden-and-build” theory64 suggests that positive mood enhances cognition by making it broader, emphasizing increased creativity65 and flexible thinking66, as well as broadened attentional scope67. This model is in accordance with early theories of negative arousal narrowing attentional focus68, such as heightened attention to a threatening weapon and diminished attention to other details69. Brand et al.70 showed that mood induction (e.g., euphoric or distressing film fragments) in healthy adults could influence selective attention abilities on a nonemotional Stroop task, in accordance with the findings reviewed above of diminished selective attention performance in MDD patients. These findings suggest that, as theorized by the “affect-as-information” framework71, emotional/mood states can influence attention and overall cognitive styles; thus, low mood observed in depression may contribute to attentional impairments, and attentional impairments may in turn perpetuate low mood.

Putative mechanisms of attention impairment

Neural circuits

Understanding the biological correlates of attention allocation and the ways in which they may become disrupted is a critical first step toward developing more targeted treatments. Neuroimaging studies using a variety of attention tasks14 have identified a common network, often referred to as the fronto-parietal attention/control network72,73,74. This network, which includes areas such as the frontal eye fields, intra-parietal sulcus, medial prefrontal cortex (mPFC), and superior parietal lobule, appears to be critical for the control of goal-directed attention75 with increasing involvement for more difficult tasks76. In our own work, we have directly linked hypoconnectivity within this fronto-parietal attention network to poorer goal-directed attention in MDD20.
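As an illustrative sketch of how “within-network connectivity” of this kind is commonly summarized, the code below computes the mean pairwise correlation among simulated time series from regions assigned to the fronto-parietal attention network. The ROI labels and data are hypothetical, and this is not the specific analysis pipeline used in the study cited above.

```python
# Illustrative sketch (simulated data, hypothetical ROI labels) of one common
# summary of within-network connectivity: the mean pairwise correlation among
# time series from fronto-parietal attention network regions.
import numpy as np

rng = np.random.default_rng(4)

rois = ["FEF_L", "FEF_R", "IPS_L", "IPS_R", "SPL_L", "SPL_R", "mPFC"]  # hypothetical labels
n_timepoints = 200

# Simulated BOLD-like signals sharing a weak common component.
shared = rng.normal(size=n_timepoints)
timeseries = 0.4 * shared + rng.normal(size=(len(rois), n_timepoints))

corr = np.corrcoef(timeseries)                      # ROI x ROI correlation matrix
upper = corr[np.triu_indices(len(rois), k=1)]       # unique ROI pairs only
within_network_connectivity = upper.mean()

print(f"mean within-network connectivity: {within_network_connectivity:.2f}")
```

Lower values of such a summary statistic in patients relative to controls are what statements about “hypoconnectivity” within the network typically refer to.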

More recently, studies using unbiased, data-driven approaches (complementing hypothesis-driven studies) have applied graph theoretical metrics to analyses of network activity and independently confirmed both the importance of the fronto-parietal attention network for coordinating cognition and its dysfunction in MDD. Specifically, Bassett et al.77 showed that flexibility of nodes within this network predicts subjects’ later learning of a simple motor task with visual cues, and Gu et al.78 demonstrated that the fronto-parietal attention network has high modal controllability, meaning that it is capable of affecting a wider range of possible neural states, including harder-to-reach states, than other large-scale networks (in line with the overarching finding that regions with a large number of long-distance connections tend to be optimal controllers79). These findings begin to provide a mechanistic account for how attention may influence a variety of important functions. Studies of topological organization in depressed individuals using anatomical80, resting-state81, and task-based functional connectivity82 have found disruption to this same fronto-parietal network. This developing literature suggests that changes to fronto-parietal network function in the context of depression may represent an under-investigated target for the development of symptom-specific treatment.
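To make the modal controllability concept more concrete, the following is a hedged sketch of a simplified calculation on a simulated symmetric connectivity matrix. It follows a common simplification used in the network control literature (summing each node’s squared eigenvector loadings weighted by how fast the corresponding modes decay) rather than reproducing the cited studies’ exact methods.

```python
# Hedged sketch of a simplified modal controllability metric on simulated
# symmetric connectivity. High values indicate nodes well placed to push the
# system toward hard-to-reach (fast-decaying) dynamical modes.
import numpy as np

rng = np.random.default_rng(5)

n_nodes = 20
A = rng.random((n_nodes, n_nodes))
A = (A + A.T) / 2                      # symmetric "structural connectivity"
np.fill_diagonal(A, 0)

# Normalize so the linear dynamics x(t+1) = A_norm @ x(t) are stable.
A_norm = A / (1 + np.abs(np.linalg.eigvalsh(A)).max())

eigvals, eigvecs = np.linalg.eigh(A_norm)   # symmetric matrix, so eigh applies

# Modal controllability of node i: sum over modes j of (1 - lambda_j^2) * v_ij^2.
modal_controllability = ((1 - eigvals**2) * eigvecs**2).sum(axis=1)

print("highest modal controllability at node:", int(modal_controllability.argmax()))
```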

It is important to acknowledge that even with significant advances in neurobiological studies of attention, our understanding of precisely which neural systems support which sub-functions of attention remains murky. Recent data-driven approaches to unpacking subnetworks within the fronto-parietal attention system have yielded varying results, often accompanied by even more variable naming schemes83,84. Many attempts have been made to distinguish these subnetworks based on independent sub-functions of attention, such as a “dorsal attention network” involved in top–down allocation and a “ventral attention network” involved in bottom-up re-orienting, or a “central executive network” involved in goal-oriented attention and a “cingulo-opercular network” involved in salience-driven attention. These distinctions, however, remain unclear and inconsistent, with more recent studies providing counter-evidence to this dogma of distinct attention networks for bottom-up and top–down processing85. Even with the increasing number of studies using data-driven approaches to disentangle attention-related subnetworks, most of these approaches assume network independence and orthogonality rather than addressing the potential for spatial overlap in network architecture. Studies that attempt to capture the complexity of the attention system have taken various approaches, such as accounting for time-varying changes in network arrangement across task states86 or across fast alternating rhythms within a task87, and addressing differences in network configuration during attention to distinct sensory modalities88,89. In order to develop clinically applicable biomarkers of this dynamic and complex attentional system, we will need to take into account variability on all fronts, from individual differences to task-related differences to temporal reconfigurations.

Oscillatory synchrony

With abundant sensory information all around, the neurobiological mechanisms supporting the ignoring of irrelevant information may be just as important as those supporting attentional orienting and focus. Electroencephalography (EEG) studies have shown that cortical oscillations in the alpha band (8–14 Hz), previously considered an “idling” rhythm that increases in power at rest90, have more recently been linked to the ignoring of task-irrelevant information (for review see ref. 91). Increased power in the alpha band has been linked to suppression of sensory signals92,93, and alpha appears to play a causal role in suppressing the intrusion of distracting information, as shown when alpha-frequency stimulation is applied via transcranial magnetic stimulation during a visual target detection task94. Fronto-central theta oscillations (4–7 Hz) appear to play a more executive role in orchestrating the guidance of top–down attention toward task-relevant information and switching the focus of attention among various stimuli (for review see ref. 95). In support of this theory, fronto-central theta power has been associated with novelty detection and goal-directed responses96, as well as divided attention between conflicting visual and auditory signals97. Together, alpha and theta oscillations appear to support normal attention function and may be a potential substrate of attention dysfunction in disorders such as depression.
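As a minimal illustration of how alpha and theta band power are typically quantified from a single EEG channel, the sketch below uses Welch’s method on a simulated signal; the sampling rate, signal composition, and band limits (matching the definitions above) are illustrative assumptions.

```python
# Minimal sketch (simulated signal) of extracting alpha (8-14 Hz) and theta
# (4-7 Hz) band power from one EEG channel using Welch's method.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(6)

fs = 250                                   # sampling rate in Hz
t = np.arange(0, 30, 1 / fs)               # 30 s of data
eeg = (0.5 * np.sin(2 * np.pi * 10 * t)    # alpha-range component
       + 0.3 * np.sin(2 * np.pi * 6 * t)   # theta-range component
       + rng.normal(scale=1.0, size=t.size))

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)

def band_power(freqs, psd, low, high):
    """Approximate power in a band by summing the PSD over its bins."""
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum() * (freqs[1] - freqs[0])

alpha_power = band_power(freqs, psd, 8, 14)
theta_power = band_power(freqs, psd, 4, 7)
print(f"alpha power: {alpha_power:.3f}, theta power: {theta_power:.3f}")
```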

Combining information across neuroimaging and electrophysiological studies, it has been shown that the strength of theta synchrony within the fronto-parietal attention network can predict goal-directed attention behavior98 and thus may be a potential candidate for investigating the neural substrates of attention dysfunction in depression. Recent evidence has shown that fronto-parietal theta appears to set a clocking rhythm for oscillations at other frequencies such as alpha during rhythmic alternations between attention and ignoring—a phenomenon referred to as “theta-dependent” perceptual sampling87. Further investigations of these spatio-temporal dynamics may reveal important neurobiological substrates for the development of targeted treatments for attention impairments in depression.

Improving attention

Treatment trials in depression have largely focused on clinical measures of response rather than on behavioral measures of attention, and very few report item-level data, making it challenging to parse out specific changes in attention. Here, we review potential pharmacological, brain stimulation, and behavioral interventions and their effects on sub-domains of goal-directed attention (selective, sustained, and divided) in the currently available literature.

Pharmacological interventions

Unfortunately, selective serotonin reuptake inhibitors (SSRIs), currently the first-line class of pharmacologic treatment for depression, have generally not been shown to improve attention. Many studies, albeit largely observational or with a mix of medications used, show no change in sustained attention25,99, selective attention100, or divided attention5,101, despite improvements in mood symptoms. A recent systematic review of healthy individuals even found evidence for worsening of divided and sustained attention with SSRI treatment102. Vortioxetine, a newer serotonin reuptake inhibitor (SRI) with mixed agonist/antagonist/partial-agonist effects, is a notable exception that has demonstrated benefits for sustained and selective attention as well as depressive symptoms in adults with MDD103,104.

Catecholaminergic agents, some of which are approved for depression but many of which have been approved for other disorders such as attention-deficit/hyperactivity disorder (ADHD), have a much stronger evidence base for improving attention and other aspects of cognition. We briefly review findings regarding the effects of some of the more commonly used of these medications on attention. While the potential side effects, including misuse, must be managed carefully, future studies should investigate the ability of agents such as psychostimulants or modafinil to improve attention in depression as a clinical target in its own right, with important functional implications, that is not well addressed by SSRIs. Norepinephrine (NE) and dopamine (DA) are believed to modulate attention and other cognitive capabilities105, generally with dose–response relationships following classic inverted-U shaped curves. As with MDD, persons with ADHD frequently have deficits in goal-directed attention. Psychostimulants, which increase levels of both NE and DA in the striatum and cortex, remain the first-line treatment for ADHD and have also been used as an adjunctive therapy in MDD, despite limited support from high-quality randomized controlled trials that examined mood-related symptom improvement alone106.

There is strong evidence that dopaminergic agents can improve sustained attention, and possibly selective and divided attention. This weight of evidence toward sustained attention may in part be an artifact of the historical focus on sustained attention as the primary cognitive domain affected in persons with ADHD. Both methylphenidate107,108 and amphetamine109,110 have consistently been shown to improve sustained attention in healthy adults as well as in youth and adults with ADHD. Similarly, methylphenidate111,112 and amphetamine113 have been demonstrated to improve selective attention, while methylphenidate has been shown to improve divided attention111. Bupropion, a NE and DA reuptake inhibitor approved as an antidepressant, has likewise been shown to improve sustained attention in youth with ADHD114 and healthy adults115. Modafinil, thought primarily to act as a weak dopamine reuptake inhibitor, has been shown in several randomized trials to be effective for symptoms of MDD as an adjunctive treatment (reviewed in ref. 106). It can improve both sustained and selective attention in healthy subjects116,117, enhance sustained attention in ADHD patients118, and improve feature-based selective attention and clinical symptoms in depressed patients119. Finally, the D3 agonist pramipexole has recently been shown to improve sustained attention, particularly in individuals with low baseline performance120.

Noradrenergic agents, including serotonin and norepinephrine reuptake inhibitors (SNRIs), atomoxetine, and the alpha2 agonists, have been shown to improve selective121 and sustained attention in some studies103,122, but not all123,124. Some studies have found improvements in selective attention with venlafaxine122 and with duloxetine125, but the results have been mixed99,100. Guanfacine, a direct alpha2a agonist, has been shown to improve feature-based selective attention in at least one study of 17 adults with ADHD126. Noradrenaline is a major component of the inverted-U shaped physiologic response to stress, and alpha2a adrenergic receptors in the prefrontal cortex127 mediate many aspects of cognitive control128. Differences in noradrenergically mediated stress responses, as well as individual differences in response to acute and chronic stress, may in part explain why findings regarding attention are more mixed for these agents than for dopaminergic agents in these varied populations. With the importance of chronic stress in depression (section “Stress reactivity and attention”) in mind, future studies might investigate catecholaminergic agents as adjunctive treatments in depression, considering stress and baseline cognitive performance as potential moderators, and focus on cognitive outcomes independently of clinical rating scales.

Brain stimulation

Another possibility for targeted intervention is noninvasive brain stimulation, including transcranial magnetic stimulation (TMS) and transcranial direct-current stimulation (tDCS). Optimal targets and protocols remain an active area of research, but these interventions are appealing for their potential to target specific anatomic regions and thereby alter network functionality more precisely than chemical neuromodulators129. The most common target in MDD has been the left dorsolateral prefrontal cortex (DLPFC), which is generally considered to be a node in the cognitive control network, albeit one that interfaces with the fronto-parietal attention network130. By targeting the DLPFC, TMS and tDCS have been shown to improve selective attention in healthy controls131 and sustained attention in depressed adults132. Multiple studies have reported enhancements of sustained attention in depressed samples with TMS133, rTMS134, and tDCS135.

However, a recent systematic review found it difficult to draw firm conclusions regarding the effects of brain stimulation on attention in depression, with several studies not finding effects on Stroop tasks or sustained attention136. This finding is in accordance with prior systematic reviews of DLPFC stimulation effects on cognition across psychiatric disorders, which found no improvement in attention domains (including selective, sustained, and divided attention) and significant improvements only in working memory137 and verbal memory138. Given that the vast majority of studies investigating cognition in mental illness utilize the DLPFC as a target, it is perhaps unsurprising that improvements have been observed more readily in functions more closely associated with the DLPFC, such as working memory. It remains unknown whether stimulation of more precise targets in the fronto-parietal network would yield more specific changes in attention function.

Behavioral interventions

Varied attempts have also been made to improve attention via behavioral interventions, though often outside the context of mental illness. An increasing body of data indicates that physical exercise has preventative and therapeutic effects for mood-related symptoms of depression139. Physical exercise has also been demonstrated to improve selective attention behaviorally in healthy controls140 as well as in young adult141 and older adult MDD patients142, though these effects appeared to be short-lived and recent reviews find mixed results143,144. Mindfulness meditation (MM) involves rehearsing the skill of selectively attending to one sensation (e.g., breath) while ignoring distracting thoughts145, and has been shown to improve independently measured selective attention146 in addition to improving mood-related symptoms of MDD147,148 and increasing alpha oscillations149. A systematic review analyzing 23 studies indicated improvements in selective attention through MM in various nonclinical and clinical samples146. More recent studies using neurofeedback-assisted technology-supported mindfulness training (N-tsMT) have revealed improvements in selective attention and well-being in healthy individuals150. Additional research on MM (in person and virtually) in MDD patients with attention impairments in particular is warranted.

Computerized cognitive training apps, which involve cognitive exercises or immersive video games, have also been studied as potential treatments for attentional and mood-related symptoms in MDD151. Anguera et al.152 found that older adults with MDD exhibited improved sustained attention and mood-related symptoms after four weeks of using a mobile cognitive intervention app, and a meta-analysis found that computerized training improved mood symptoms and attention153. Notably, the benefits of computerized cognitive training may generalize beyond the specific cognitive tasks practiced, such as improving untrained measures of attention, reducing negativity bias, and enhancing daily functioning, suggesting far-transfer effects152,153. Additional work will be needed to fully assess far transfer of these various interventions, measuring generalizability of behavioral performance on a wider variety of tasks. Furthermore, these apps have not always outperformed control interventions and tend to have high drop-out rates that are worse with more severe depression154. Nonetheless, given that computerized treatments are generally inexpensive, noninvasive, and can be tailored to the individual, these interventions warrant further research.

Future directions

Stress reactivity and attention

One theory is that pathology associated with MDD (e.g., neuromodulatory changes, stress-related pathology, etc.) leads to attention impairments as a downstream effect. Here, we describe one potential theoretical model for how stress hyperreactivity associated with recurrent depression might interfere with the fronto-parietal attention network. The canonical stress pathway, involving the hypothalamus–pituitary–adrenal axis, exhibits dysregulation in depression (for review see ref. 155), associated with increased cortisol156 and subsequent neuronal atrophy in the hippocampus157 and mPFC158. Given the importance of hippocampal-mPFC communication via theta rhythms for a variety of cognitive functions involving top–down attention95,159,160, it is possible that stress hyperreactivity could underlie attention impairments by disrupting this mechanism161.

Moreover, the fronto-parietal attention network (which also synchronizes in the theta band98) includes areas of the mPFC. It is theoretically plausible that these systems work together in healthy adults to translate task/goal representations into attentional control over sensory processing by means of theta synchronization, and that disruption of the mPFC by stress hyperreactivity interferes with this pathway. This model would predict that patients experiencing stress hyperreactivity would also be more likely to have impaired attention. The mPFC is not only a key node of the fronto-parietal attention network; it is also a key hub for emotional processing. Researchers have therefore suggested that the mPFC may act as a cognitive/emotional integration site162—a plausible locus for the influence of emotion on guiding attention and vice versa. Given that the mPFC is known to undergo neurodegeneration in the context of stress hyperreactivity in depression158, it may be that, as this critical hub region deteriorates, both cognitive and emotional symptoms arise together.

The case for longitudinal studies

Future studies may address whether attention problems are a cause or consequence of mood-related symptoms of depression by characterizing longitudinal trajectories. One possibility is that attention impairments precede, and are a risk factor for, developing depression: impaired goal-directed attention could reduce the likelihood of achieving one’s goals, which could lead to lower estimation of one’s capacity to achieve rewards or more frequent failure to avoid punishments, all of which could contribute to the mood-related symptoms of depression. Given that attention abilities vary naturally in the population, it may be the case that those who struggle with attention relative to their peers are at increased risk of developing depression and could benefit from preventative measures to improve attention. Alternatively, it is possible that attention impairments develop in the context of MDD and may have a role in its chronicity. For example, excessive rumination/worry might sap attentional resources, lack of sufficient sleep could produce attention deficits, or extreme anhedonia could lead to general psychomotor slowing. However, it should be noted that these symptoms of depression are often ameliorated with no improvement in attention (refs. 25,100; see the section “Improving attention”), and our own data suggest that selective attention function does not correlate with insomnia or excessive worrying20.

Giollabhui et al.163 used a longitudinal sample to investigate selective, sustained, and divided attention, and showed that the interactions among these pathways may be complex. Specifically, baseline divided attention performance predicted depressive symptoms at follow-up, but higher depressive symptoms at baseline also predicted worse selective attention at follow-up, and childhood stress predicted both higher depressive symptoms and worse attention. Future studies will be essential to probe the relationship between attention and other aspects of depression in greater detail, including the potential role of attention impairments in other clinical profiles associated with poorer prognosis (e.g., anhedonia). Longitudinal studies may also reveal whether attention impairments operate as state- or trait-like features of depression, whether attention behaviors co-vary with neurobiological changes over time and how these change with targeted intervention, and whether targeting attention impairments will ultimately improve mood-related symptoms and reduce the burden of MDD.
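The following is a hypothetical sketch of the kind of cross-lagged analysis such longitudinal designs enable: baseline attention predicting follow-up depressive symptoms while controlling for baseline symptoms, and vice versa. The data, effect sizes, and variable names are simulated assumptions and do not represent the cited study’s model.

```python
# Hypothetical cross-lagged regression sketch on simulated two-wave data.
import numpy as np

rng = np.random.default_rng(7)
n = 300

attention_t1 = rng.normal(size=n)
depression_t1 = rng.normal(size=n)
# Simulated follow-up scores with weak cross-lagged effects built in.
depression_t2 = 0.5 * depression_t1 - 0.2 * attention_t1 + rng.normal(size=n)
attention_t2 = 0.5 * attention_t1 - 0.2 * depression_t1 + rng.normal(size=n)

def cross_lagged_coef(outcome_t2, same_domain_t1, other_domain_t1):
    """OLS coefficient for the other domain at baseline, controlling for the
    autoregressive path (same domain at baseline) and an intercept."""
    X = np.column_stack([np.ones_like(outcome_t2), same_domain_t1, other_domain_t1])
    beta, *_ = np.linalg.lstsq(X, outcome_t2, rcond=None)
    return beta[2]

print("attention_t1 -> depression_t2:",
      round(cross_lagged_coef(depression_t2, depression_t1, attention_t1), 2))
print("depression_t1 -> attention_t2:",
      round(cross_lagged_coef(attention_t2, attention_t1, depression_t1), 2))
```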

Conclusion: getting personal

As the field of precision psychiatry develops, it is becoming increasingly important to understand the nuances of individual variability to develop personalized treatments164,165. Cognitive features of psychiatric disorders should not be overlooked in this regard, and more work should be done to determine which individuals would be most likely to benefit from treatments targeting cognitive symptoms such as attention impairments. Pinning down precise neurobiological targets using a combination of hypothesis-driven and data-driven approaches will be essential for achieving the overarching goal of developing effective, personalized treatments for attention impairments trans-diagnostically. At present, the best available evidence suggests that depressive symptoms and attention impairments do not generally improve together for most patients with current treatments, particularly SSRIs. Drugs targeting catecholamines (e.g., DA, NE) may benefit sustained attention, but it remains unknown whether these interventions target the specific neural circuit or electrophysiological correlates of goal-directed attentional orienting or whether they simply increase overall arousal levels. Recent innovative approaches to understanding attention behaviorally and biologically6 continue to bring us closer to this possible future of personalized psychiatry, but bringing these efforts across the finish line will depend on continued work across disciplines, from our basic understanding of intact attentional systems to our assessments of how these are disrupted in the context of mental illness. Given that attention impairments are a debilitating symptom for many depressed individuals and are not alleviated by current first-line antidepressant treatments, these translational efforts have the potential to dramatically improve individuals’ quality of life.