Abstract
To perform visual search, humans, like many mammals, encode a large field of view with retinas having variable spatial resolution, and then use high-speed eye movements to direct the highest-resolution region, the fovea, towards potential target locations [1, 2]. Good search performance is essential for survival, and hence mammals may have evolved efficient strategies for selecting fixation locations. Here we address two questions: what are the optimal eye movement strategies for a foveated visual system faced with the problem of finding a target in a cluttered environment, and do humans employ optimal eye movement strategies during search? We derive the ideal Bayesian observer [3–6] for search tasks in which a target is embedded at an unknown location within a random background that has the spectral characteristics of natural scenes [7]. Our ideal searcher uses precise knowledge about the statistics of the scenes in which the target is embedded, and about its own visual system, to make eye movements that gain the most information about target location. We find that humans achieve nearly optimal search performance, even though humans integrate information poorly across fixations [8–10]. Analysis of the ideal searcher reveals that there is little benefit from perfect integration across fixations; much more important is efficient processing of the information on each fixation. Apparently, evolution has exploited this fact to achieve efficient eye movement strategies with minimal neural resources devoted to memory.
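The core of the ideal searcher described above can be illustrated with a toy simulation: noisy template responses are collected at every candidate location on each fixation, with detectability (d′) falling off with eccentricity from the fixation point; a posterior over target location is updated by Bayes' rule; and the next fixation is then chosen. The sketch below is not the authors' implementation: the grid size, the d′ falloff function, and the greedy maximum-a-posteriori fixation rule are all illustrative assumptions (the paper's ideal searcher instead selects the fixation that maximizes expected post-fixation accuracy).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 25 candidate target locations on a 5x5 grid.
n_loc = 25
grid = np.array([(x, y) for x in range(5) for y in range(5)], dtype=float)
target = 12  # true target location (arbitrary choice for the demo)

def d_prime(fix, dp_fovea=3.0, falloff=0.6):
    """Detectability d' at each location: highest at the fovea (the current
    fixation point) and falling off with retinal eccentricity. The functional
    form is an assumption, not the paper's measured visibility map."""
    ecc = np.linalg.norm(grid - grid[fix], axis=1)
    return dp_fovea / (1.0 + falloff * ecc)

log_post = np.full(n_loc, -np.log(n_loc))  # uniform prior over locations
fix = n_loc // 2                           # begin by fixating the centre

for t in range(10):
    dp = d_prime(fix)
    # Template responses: unit-variance Gaussian noise everywhere,
    # plus a mean shift of d' at the true target location.
    obs = rng.normal(0.0, 1.0, n_loc)
    obs[target] += dp[target]
    # Bayes update: log-likelihood that location j holds the target is
    # (up to a constant) dp_j * r_j - dp_j^2 / 2 under this Gaussian model.
    log_post += dp * obs - 0.5 * dp**2
    log_post -= np.logaddexp.reduce(log_post)  # renormalise in log space
    # Simplified fixation rule: look at the currently most probable location.
    fix = int(np.argmax(log_post))
    if np.exp(log_post[fix]) > 0.95:
        break
```

Because information from every fixation accumulates in `log_post`, this observer integrates perfectly across fixations; the paper's finding is that degrading this memory costs surprisingly little compared with degrading the per-fixation visibility map.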
References
Carpenter, R. H. S. (ed.) Eye Movements (Macmillan, London, 1991)
Liversedge, S. P. & Findlay, J. M. Saccadic eye movements and cognition. Trends Cogn. Sci. 4, 6–14 (2000)
Green, D. M. & Swets, J. A. Signal Detection Theory and Psychophysics (Wiley, New York, 1966)
Burgess, A. E. & Ghandeharian, H. Visual signal detection. II. Effect of signal-location identification. J. Opt. Soc. Am. A 1, 906–910 (1984)
Geisler, W. S. & Diehl, R. L. A Bayesian approach to the evolution of perceptual and cognitive systems. Cogn. Sci. 27, 379–402 (2003)
Kersten, D., Mamassian, P. & Yuille, A. L. Object perception as Bayesian inference. Annu. Rev. Psychol. 55, 271–304 (2004)
Field, D. J. Relations between the statistics of natural images and the response properties of cortical cells. J. Opt. Soc. Am. A 4, 2379–2394 (1987)
Irwin, D. E. Information integration across saccadic eye movements. Cognit. Psychol. 23, 420–458 (1991)
Hayhoe, M. M., Bensinger, D. G. & Ballard, D. H. Task constraints in visual working memory. Vision Res. 38, 125–137 (1998)
Rensink, R. A. Change detection. Annu. Rev. Psychol. 53, 245–277 (2002)
Palmer, J., Verghese, P. & Pavel, M. The psychophysics of visual search. Vision Res. 40, 1227–1268 (2000)
Wolfe, J. M. in Attention (ed. Pashler, H.) 13–74 (Psychology Press, Hove, East Sussex, 1998)
Schall, J. D. in The Visual Neurosciences (eds Chalupa, L. M. & Werner, J. S.) 1369–1390 (MIT Press, Cambridge, Massachusetts, 2004)
Blake, A. & Yuille, A. L. (eds) Active Vision (MIT Press, Cambridge, Massachusetts, 1992)
Burgess, A. E., Wagner, R. F., Jennings, R. J. & Barlow, H. B. Efficiency of human visual signal discrimination. Science 214, 93–94 (1981)
Pelli, D. G. & Farell, B. Why use noise? J. Opt. Soc. Am. A 16, 647–653 (1999)
Lu, Z.-L. & Dosher, B. A. Characterizing human perceptual inefficiencies with equivalent internal noise. J. Opt. Soc. Am. A 16, 764–778 (1999)
Geman, D. & Jedynak, B. An active testing model for tracking roads in satellite images. IEEE Trans. Pattern Anal. Mach. Intell. 18, 1–14 (1996)
Rajashekar, U., Cormack, L. K. & Bovik, A. C. in Eye Tracking Research & Applications (ed. Duchowski, A. T.) 119–123 (ACM SIGGRAPH, New Orleans, 2002)
Findlay, J. M. Global visual processing for saccadic eye movements. Vision Res. 22, 1033–1045 (1982)
Zelinsky, G. J., Rao, R. P., Hayhoe, M. M. & Ballard, D. H. Eye movements reveal the spatio-temporal dynamics of visual search. Psychol. Sci. 8, 448–453 (1997)
Legge, G. E., Hooven, T. A., Klitz, T. S., Mansfield, J. G. & Tjan, B. S. Mr. Chips 2002: New insights from an ideal observer model of reading. Vision Res. 42, 2219–2234 (2002)
Eckstein, M. P., Beutter, B. R. & Stone, L. S. Quantifying the performance limits of human saccadic targeting during visual search. Perception 30, 1389–1401 (2001)
Eckstein, M. P., Thomas, J. P., Palmer, J. & Shimozaki, S. S. A signal detection model predicts the effects of set size on visual search accuracy for feature, conjunction, triple conjunction, and disjunction displays. Percept. Psychophys. 62, 425–451 (2000)
Itti, L. & Koch, C. A saliency-based search mechanism for overt and covert shifts of visual attention. Vision Res. 40, 11–46 (2000)
Rao, R. P. N., Zelinsky, G. J., Hayhoe, M. M. & Ballard, D. H. Eye movements in iconic visual search. Vision Res. 42, 1447–1463 (2002)
Acknowledgements
We thank R.F. Murray for helpful discussions, and J. Perry, L. Stern and C. Creeger for technical assistance. This work was supported by the National Eye Institute, NIH.
Ethics declarations
Competing interests
The authors declare that they have no competing financial interests.
Supplementary information
Supplementary Equations
This document contains a derivation of the ideal searcher for the case of dynamic (temporally uncorrelated) external and internal noise, and describes the ideal searcher for the case of static (temporally correlated) external noise and dynamic (temporally uncorrelated) internal noise. (PDF 181 kb)
Cite this article
Najemnik, J., Geisler, W. Optimal eye movement strategies in visual search. Nature 434, 387–391 (2005). https://doi.org/10.1038/nature03390
This article is cited by
- The Relationship Between Environmental Statistics and Predictive Gaze Behaviour During a Manual Interception Task: Eye Movements as Active Inference. Computational Brain & Behavior (2024)
- Visuospatial information foraging describes search behavior in learning latent environmental features. Scientific Reports (2023)
- Modeling Eye Movements During Decision Making: A Review. Psychometrika (2023)
- Feature Attention as a Control Mechanism for the Balance of Speed and Accuracy in Visual Search. Computational Brain & Behavior (2023)
- Humans represent the precision and utility of information acquired across fixations. Scientific Reports (2022)