Review Article

Five factors that guide attention in visual search

Nature Human Behaviour volume 1, Article number: 0058 (2017)

Abstract

How do we find what we are looking for? Even when the desired target is in the current field of view, we need to search because fundamental limits on visual processing make it impossible to recognize everything at once. Searching involves directing attention to objects that might be the target. This deployment of attention is not random. It is guided to the most promising items and locations by five factors discussed here: bottom-up salience, top-down feature guidance, scene structure and meaning, the previous history of search over timescales ranging from milliseconds to years, and the relative value of the targets and distractors. Modern theories of visual search need to incorporate all five factors and specify how these factors combine to shape search behaviour. An understanding of the rules of guidance can be used to improve the accuracy and efficiency of socially important search tasks, from security screening to medical image perception.



Author information

Affiliations

  1. Visual Attention Lab, Department of Surgery, Brigham and Women's Hospital, 64 Sidney Street, Suite 170, Cambridge, Massachusetts 02139-4170, USA.

    • Jeremy M. Wolfe
  2. Basic Biobehavioral and Psychological Sciences Branch, Behavioral Research Program, Division of Cancer Control and Population Sciences, National Cancer Institute, 9609 Medical Center Drive, 3E-116, Rockville, Maryland 20850, USA.

    • Todd S. Horowitz


Competing interests

J.M.W. occasionally serves as an expert witness or consultant (paid or unpaid) on the applications of visual search to topics ranging from legal disputes (for example, how could that truck have hit that clearly visible motorcycle?) to consumer behaviour (for example, how could we redesign this shelf to attract more attention to our product?).

Corresponding author

Correspondence to Jeremy M. Wolfe.

About this article


DOI

https://doi.org/10.1038/s41562-017-0058
