
Attention during natural vision warps semantic representation across the human brain

Abstract

Little is known about how attention changes the cortical representation of sensory information in humans. On the basis of neurophysiological evidence, we hypothesized that attention causes tuning changes to expand the representation of attended stimuli at the cost of unattended stimuli. To investigate this issue, we used functional magnetic resonance imaging to measure how semantic representation changed during visual search for different object categories in natural movies. We found that many voxels across occipito-temporal and fronto-parietal cortex shifted their tuning toward the attended category. These tuning shifts expanded the representation of the attended category and of semantically related, but unattended, categories, and compressed the representation of categories that were semantically dissimilar to the target. Attentional warping of semantic representation occurred even when the attended category was not present in the movie; thus, the effect was not a target-detection artifact. These results suggest that attention dynamically alters visual representation to optimize processing of behaviorally relevant objects during natural vision.
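The core quantity in this study is a per-voxel semantic tuning vector and how it moves between attention conditions. As a toy illustration only (not the authors' analysis code), the sketch below shows one simple way such a tuning shift could be quantified: project the change in a voxel's tuning vector onto the direction of the attended category in semantic space. The vectors and the 0.5 shift magnitude are simulated, hypothetical values.

```python
import numpy as np

# Toy illustration (not the authors' pipeline): quantify a voxel's tuning
# shift toward an attended category. Assume each voxel's semantic tuning
# is a vector over object categories, estimated separately under two
# attention conditions (e.g. "search for humans" vs. passive viewing).

rng = np.random.default_rng(0)
n_categories = 50

# Hypothetical unit vector for the attended category's direction in
# semantic space (one-hot on category 0, for simplicity).
target = np.zeros(n_categories)
target[0] = 1.0

# Simulated tuning vectors for one voxel under the two conditions:
# here attention nudges tuning toward the target by a fixed amount.
tuning_passive = rng.standard_normal(n_categories)
tuning_attend = tuning_passive + 0.5 * target

def tuning_shift_index(w_attend, w_baseline, target_dir):
    """Project the change in tuning onto the attended-category direction.

    Positive values mean tuning shifted toward the attended category;
    negative values mean it shifted away.
    """
    delta = w_attend - w_baseline
    return float(delta @ target_dir / np.linalg.norm(target_dir))

print(tuning_shift_index(tuning_attend, tuning_passive, target))  # 0.5
```

In this simulation the index recovers exactly the injected 0.5 shift; with real estimated tuning vectors the projection would be noisy and would need the kind of significance testing the paper describes.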


Figure 1: Tuning-shift hypothesis predicts that attention warps semantic representation.
Figure 2: Voxel-wise tuning vectors are measured from BOLD responses evoked by natural movies.
Figure 3: Attentional tuning changes for a single voxel in lateral occipital complex.
Figure 4: Attention causes tuning shifts in single voxels.
Figure 5: Attention causes different degrees of tuning shifts in functional ROIs.
Figure 6: Semantic tuning for unattended categories shifts toward the attended category even when no targets are present.
Figure 7: Attention expands the representation of unattended categories that are semantically similar to the attended category.


Acknowledgements

We thank D. Whitney for discussions regarding this manuscript. We also thank J. Gao, N. Bilenko, T. Naselaris, A. Vu and M. Oliver for their help in various aspects of this research. This work was supported by the National Eye Institute (EY019684 and EY022454) and the Center for Science of Information, a National Science Foundation Science and Technology Center, under grant agreement CCF-0939370.

Author information


Contributions

T.Ç. and S.N. designed the experiments. T.Ç. and A.G.H. operated the scanner. T.Ç. conducted the experiments and analyzed the data. T.Ç. and J.L.G. wrote the manuscript. J.L.G. provided guidance on all aspects of the project.

Corresponding author

Correspondence to Jack L Gallant.

Ethics declarations

Competing interests

The authors declare no competing financial interests.

Supplementary information

Supplementary Text and Figures

Supplementary Figures 1–13 and Supplementary Tables 1 and 2 (PDF 23321 kb)


About this article

Cite this article

Çukur, T., Nishimoto, S., Huth, A. et al. Attention during natural vision warps semantic representation across the human brain. Nat Neurosci 16, 763–770 (2013). https://doi.org/10.1038/nn.3381
