Abstract
How do we find what we are looking for? Even when the desired target is in the current field of view, we need to search because fundamental limits on visual processing make it impossible to recognize everything at once. Searching involves directing attention to objects that might be the target. This deployment of attention is not random. It is guided to the most promising items and locations by five factors discussed here: bottom-up salience, top-down feature guidance, scene structure and meaning, the previous history of search over timescales ranging from milliseconds to years, and the relative value of the targets and distractors. Modern theories of visual search need to incorporate all five factors and specify how these factors combine to shape search behaviour. An understanding of the rules of guidance can be used to improve the accuracy and efficiency of socially important search tasks, from security screening to medical image perception.
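The abstract calls for theories that specify how the five guidance factors combine, but leaves the combination rule open. As a minimal illustrative sketch only (not the authors' specification), the factors can be treated as maps over the visual field and combined linearly into a single priority map, in the spirit of Guided Search-style models; the function name, the linear form, and the equal default weights below are all assumptions for illustration.

```python
import numpy as np

def priority_map(salience, feature_guidance, scene_prior, history, value,
                 weights=(1.0, 1.0, 1.0, 1.0, 1.0)):
    """Combine five guidance signals into one priority map.

    Each argument is a 2-D array over image locations; higher values mark
    locations more likely to be attended next. The linear weighting is an
    illustrative assumption, not a claim about the actual combination rule.
    """
    maps = (salience, feature_guidance, scene_prior, history, value)
    return sum(w * m for w, m in zip(weights, maps))

# Toy usage: deploy attention to the peak of the combined map.
rng = np.random.default_rng(0)
signals = [rng.random((32, 32)) for _ in range(5)]
pmap = priority_map(*signals)
next_fixation = np.unravel_index(np.argmax(pmap), pmap.shape)
```

A weighted sum is only the simplest candidate; the weights themselves would have to change with task goals, reward, and search history for the sketch to capture the behaviour the abstract describes.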
Ethics declarations
Competing interests
J.M.W. occasionally serves as an expert witness or consultant (paid or unpaid) on applications of visual search to topics ranging from legal disputes (for example, how could that truck have hit that clearly visible motorcycle?) to consumer behaviour (for example, how could we redesign this shelf to attract more attention to our product?).
About this article
Cite this article
Wolfe, J., Horowitz, T. Five factors that guide attention in visual search. Nat Hum Behav 1, 0058 (2017). https://doi.org/10.1038/s41562-017-0058