
Review Article

Incorporating the properties of peripheral vision into theories of visual search

Abstract

People often look for objects in their immediate environment, a behaviour known as visual search. Most of the visual signals used during search come from peripheral vision, outside the direct focus of the eyes. In this Review, we present evidence that peripheral vision is both more capable and more complex than commonly believed. We then use three benchmark findings from the visual search literature to illustrate how considering peripheral vision can improve understanding of the basic mechanisms of search. Next, we discuss theories of visual search on the basis of their treatment of peripheral processing constraints and present findings in support of theories that integrate the characteristics of peripheral vision. These findings describe the span over which peripheral vision can extract useful information, the type of information peripheral vision encodes, and how peripheral vision identifies locations that are likely to contain a search target. We end by discussing considerations for future theoretical development and recommendations for future empirical research.


Fig. 1: Real-life visual search.
Fig. 2: The effect of eccentricity on visual processing.
Fig. 3: Search conditions and search slopes.
Fig. 4: Elements of periphery-constrained theories.
Fig. 5: Peripherally informed visual search.


Acknowledgements

The authors thank R. Rosenholtz for her feedback on the first draft of the manuscript, as well as the National Science Foundation for partially supporting this project under grant no. BCS1921735 to S.B.

Author information

Contributions

All authors researched data for the article and contributed substantially to discussion of the content. A.L. and S.B. wrote the article. All authors reviewed and/or edited the manuscript before submission.

Corresponding authors

Correspondence to Alejandro Lleras or Simona Buetti.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Reviews Psychology thanks Jeremy Wolfe and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Glossary

Visual field

The extent of space visible to the eyes at a given fixation point, including the fovea and the periphery.

Fovea

The area of the retina that processes directly fixated information; its width is variously defined as between 1.7 and 5 degrees of visual angle.
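
Both glossary entries are stated in degrees of visual angle, which depend jointly on an object's physical size and its viewing distance. As a minimal illustration (not part of the original article), the Python sketch below converts size and distance into visual angle using the standard formula θ = 2·arctan(size / (2 × distance)); the function name and the example values are our own.

```python
import math

def visual_angle_deg(size: float, distance: float) -> float:
    """Visual angle (in degrees) subtended by an object of a given size
    viewed from a given distance (both arguments in the same units)."""
    return math.degrees(2 * math.atan(size / (2 * distance)))

# A 1 cm wide letter viewed from 57 cm subtends about 1 degree,
# well inside even the narrowest (1.7 degree) definition of the fovea.
print(f"{visual_angle_deg(1.0, 57.0):.2f} deg")   # ~1.00 deg

# The same letter viewed from 20 cm subtends about 2.9 degrees,
# exceeding the narrow definition of the fovea but not the 5 degree one.
print(f"{visual_angle_deg(1.0, 20.0):.2f} deg")   # ~2.86 deg
```

At roughly 57 cm of viewing distance, 1 cm on a display subtends about 1 degree of visual angle, which is why that distance is commonly used in vision experiments.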

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article

Cite this article

Lleras, A., Buetti, S. & Xu, Z.J. Incorporating the properties of peripheral vision into theories of visual search. Nat Rev Psychol 1, 590–604 (2022). https://doi.org/10.1038/s44159-022-00097-1

