Abstract
Reliable measurement of spontaneous and evoked eye movement is critical for behavioral vision research. Zebrafish are increasingly used as a model organism for visual neural circuits, but ready-to-use eye-tracking solutions are scarce. Here, we present a protocol for automated real-time measurement of angular horizontal eye position in up to six immobilized larval fish using a custom-built LabVIEW-based software, ZebEyeTrack. We provide its customizable source code, as well as a streamlined and compiled version, ZebEyeTrack Light. The full version of ZebEyeTrack controls all required hardware and synchronizes six essential aspects of the experiment: (i) stimulus design; (ii) visual stimulation with moving bars; (iii) eye detection and tracking, as well as general motion detection; (iv) real-time analysis; (v) eye-position-dependent closed-loop event control; and (vi) recording of external event times. This includes optional integration with external hardware such as lasers and scanning microscopes. Once installation is complete, experiments, including stimulus design, can be set up in <10 min, and recordings can last anywhere between seconds and many hours. Results include digitized angular eye positions and hardware status, which can be used to compute tuning curves and optokinetic gain, and to perform other custom data analyses. After the experiment, or based on existing videos, optokinetic response (OKR) performance can be analyzed semi-automatically via the graphical user interface, and results can be exported. ZebEyeTrack has been used successfully for psychophysics experiments, for optogenetic stimulation, and in combination with calcium imaging.
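The exported angular eye-position traces lend themselves to standard scripted analysis. As a minimal sketch (not part of ZebEyeTrack itself; the frame rate, saccade-velocity threshold, and the toy trace below are assumptions chosen for illustration), optokinetic gain can be estimated as slow-phase eye velocity divided by stimulus velocity:

```python
import numpy as np

def okr_gain(eye_pos_deg, stim_vel_deg_s, fs_hz=30.0, saccade_thresh_deg_s=20.0):
    """Estimate optokinetic gain from one angular eye-position trace.

    eye_pos_deg          : 1-D array of horizontal eye position (degrees)
    stim_vel_deg_s       : velocity of the moving-bar stimulus (deg/s)
    fs_hz                : camera frame rate (assumed; set to your recording)
    saccade_thresh_deg_s : eye velocities above this value are treated as
                           saccadic fast phases and excluded (assumed value)
    """
    eye_vel = np.gradient(eye_pos_deg) * fs_hz            # frame-to-frame velocity, deg/s
    slow_phase = np.abs(eye_vel) < saccade_thresh_deg_s   # keep slow-phase samples only
    if not np.any(slow_phase):
        return np.nan
    # gain = slow-phase eye velocity / stimulus velocity
    return np.median(eye_vel[slow_phase]) / stim_vel_deg_s

# toy example: a sawtooth trace (slow phases at 8 deg/s plus resetting saccades)
# tracking a 10 deg/s stimulus at 30 Hz -> gain of roughly 0.8
t = np.arange(0.0, 10.0, 1.0 / 30.0)
trace = (8.0 * t) % 15.0 - 7.5
print(okr_gain(trace, stim_vel_deg_s=10.0))
```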
Change history
21 August 2018
The version of this paper originally published contained the following text errors: (1) In the abstract, “(ii) visual stimulation with moving bars; (ii) eye detection and tracking, as well as general motion detection” should have been “(ii) visual stimulation with moving bars; (iii) eye detection and tracking, as well as general motion detection.” (2) In the legend for Table 1, “vertical pixel coordinate; LE, left eye; RE, right eye; x, horizontal pixel coordinate; y” should have read “LE, left eye; RE, right eye; x, horizontal pixel coordinate; y, vertical pixel coordinate.” These errors have been corrected in the HTML and PDF versions of the paper.
References
van Alphen, A. M., Stahl, J. S. & De Zeeuw, C. I. The dynamic characteristics of the mouse horizontal vestibulo-ocular and optokinetic response. Brain Res. 890, 296–305 (2001).
Faulstich, B. M., Onori, K. A. & du Lac, S. Comparison of plasticity and development of mouse optokinetic and vestibulo-ocular reflexes suggests differential gain control mechanisms. Vision Res. 44, 3419–3427 (2004).
Cahill, H. & Nathans, J. The optokinetic reflex as a tool for quantitative analyses of nervous system function in mice: application to genetic and drug-induced variation. PLoS ONE 3, e2055 (2008).
Land, M. F. Eye movements of vertebrates and their relation to eye form and function. J. Comp. Physiol. A 201, 195–214 (2015).
Easter, S. S. Jr & Nicola, G. N. The development of vision in the zebrafish (Danio rerio). Dev. Biol. 180, 646–663 (1996).
Roeser, T. & Baier, H. Visuomotor behaviors in larval zebrafish after GFP-guided laser ablation of the optic tectum. J. Neurosci. 23, 3726–3734 (2003).
Brockerhoff, S. E. et al. A behavioral screen for isolating zebrafish mutants with visual system defects. Proc. Natl. Acad. Sci. USA 92, 10545–10549 (1995).
Neuhauss, S. C. F. et al. Genetic disorders of vision revealed by a behavioral screen of 400 essential loci in zebrafish. J. Neurosci. 19, 8603–8615 (1999).
Cameron, D. J. et al. The optokinetic response as a quantitative measure of visual acuity in zebrafish. J. Vis. Exp. 80, e50832 (2013).
Distler, C., Vital-Durand, F., Korte, R., Korbmacher, H. & Hoffmann, K.-P. Development of the optokinetic system in macaque monkeys. Vision Res. 39, 3909–3919 (1999).
Distler, C. & Hoffmann, K.-P. Development of the optokinetic response in macaques: a comparison to cat and man. Ann. N. Y. Acad. Sci. 1004, 10–18 (2003).
Muto, A. et al. Forward genetic analysis of visual behavior in zebrafish. PLoS Genet. 1, 0575–0588 (2005).
Brockerhoff, S. E. Measuring the optokinetic response of zebrafish larvae. Nat. Protoc. 1, 2448–2451 (2006).
Schoonheim, P. J., Arrenberg, A. B., Del Bene, F. & Baier, H. Optogenetic localization and genetic perturbation of saccade-generating neurons in zebrafish. J. Neurosci. 30, 7111–7120 (2010).
Bianco, I. H. et al. The tangential nucleus controls a gravito-inertial vestibulo-ocular reflex. Curr. Biol. 22, 1285–1295 (2012).
Portugues, R., Feierstein, C. E., Engert, F. & Orger, M. B. Whole-brain activity maps reveal stereotyped, distributed networks for visuomotor behavior. Neuron 81, 1328–1343 (2014).
Kubo, F. et al. Functional architecture of an optic flow-responsive area that drives horizontal eye movements in zebrafish. Neuron 81, 1344–1359 (2014).
Mueller, K. P. & Neuhauss, S. C. F. Quantitative measurements of the optokinetic response in adult fish. J. Neurosci. Methods 186, 29–34 (2010).
Mueller, K. P., Schnaedelbach, O. D. R., Russig, H. D. & Neuhauss, S. C. F. VisioTracker, an innovative automated approach to oculomotor analysis. J. Vis. Exp. 56, e3556 (2011).
Arrenberg, A. B. Fiber optic-based photostimulation of larval zebrafish. in Zebrafish: Methods and Protocols (Methods in Molecular Biology 1451) 343–354 (Springer Science+Business Media, 2016).
Arrenberg, A. B., Del Bene, F. & Baier, H. Optical control of zebrafish behavior with halorhodopsin. Proc. Natl. Acad. Sci. USA 106, 17968–17973 (2009).
Gonçalves, P. J., Arrenberg, A. B., Hablitzel, B., Baier, H. & Machens, C. K. Optogenetic perturbations reveal the dynamics of an oculomotor integrator. Front. Neural Circuits 8, 1–22 (2014).
Miri, A. et al. Spatial gradients and multidimensional dynamics in a neural integrator circuit. Nat. Neurosci. 14, 1150–1159 (2011).
Reinig, S., Driever, W. & Arrenberg, A. B. The descending diencephalic dopamine system is tuned to sensory stimuli. Curr. Biol. 27, 1–16 (2017).
Thiele, T. R., Donovan, J. C. & Baier, H. Descending control of swim posture by a midbrain nucleus in zebrafish. Neuron 83, 1–13 (2014).
Clark, D. Visual Responses in the Developing Zebrafish (Brachydanio rerio). PhD thesis (University of Oregon Press, 1981).
Huber-Reggi, S. P., Mueller, K. P. & Neuhauss, S. C. in Retinal Degeneration: Methods and Protocols (Methods in Molecular Biology 935) (eds. Weber, B. H. & Langmann, T.) 139–160 (Springer Science+Business Media, 2013).
Kretschmer, F., Kretschmer, V., Kunze, V. P. & Kretzberg, J. OMR-Arena: automated measurement and stimulation system to determine mouse visual thresholds based on optomotor responses. PLoS ONE 8, e78058 (2013).
Miri, A., Daie, K., Burdine, R. D., Aksay, E. & Tank, D. W. Regression-based identification of behavior-encoding neurons during large-scale optical imaging of neural activity at cellular resolution. J. Neurophysiol. 105, 964–980 (2011).
Brysch, C., Leyden, C. & Arrenberg, A. An investigation of the neuronal tuning to horizontal eye movements in the oculomotor system of larval zebrafish. Program no. 150.14. Neuroscience Meeting Planner: Society for Neuroscience (Washington, DC, 2017).
Reiff, D. F., Plett, J., Mank, M., Griesbeck, O. & Borst, A. Visualizing retinotopic half-wave rectified input to the motion detection circuitry of Drosophila. Nat. Neurosci. 13, 973–978 (2010).
Branchek, T. The development of photoreceptors in the zebrafish, Brachydanio rerio. II. Function. J. Comp. Neurol. 224, 116–122 (1984).
Brainard, D. The Psychophysics Toolbox. Spat. Vis. 10, 433–436 (1997).
Pelli, D. The VideoToolbox software for visual psychophysics: transforming numbers into movies. Spat. Vis. 10, 437–442 (1997).
Kleiner, M., Brainard, D. & Pelli, D. What’s new in Psychtoolbox-3? Perception 36 ECVP Abstract Supplement (Arezzo, Italy, 2007).
To, L., Woods, R. L., Goldstein, R. B. & Peli, E. Psychophysical contrast calibration. Vision Res. 90, 15–24 (2013).
Hablitzel, B. Diploma Thesis (Freiburg University, 2012).
Schindelin, J. et al. Fiji: an open-source platform for biological-image analysis. Nat. Methods 9, 676–682 (2012).
Lopes, G. et al. Bonsai: an event-based framework for processing and controlling data streams. Front. Neuroinform. 9, 7 (2015).
Acknowledgements
ZebEyeTrack is based on a precursory application that was developed by A.B.A. in the laboratory of H. Baier (UCSF) and refined in the laboratory of W. Driever (Freiburg University). Parts of the MATLAB Psychtoolbox scripts for running visual stimuli were based on pre-existing scripts provided by M.B. Orger (Baier Lab, UCSF). We thank L. Ziv-Strasser, D. Strasser, and J. Huisken for teaching A.B.A. how to use LabVIEW at UCSF. We thank B. Hablitzel (Freiburg University) for helping to develop the custom LED arena, a pilot benchmarking of eye-tracking performance, and for technical assistance. We also thank G. Lopes (Kampff Lab, UCL) for helpful conversations on software development. At the Arrenberg lab, C. Brysch tested and provided feedback on our software; S. Buss and R. Meier assisted with preparations for software testing and benchmarking. This work was supported by Deutsche Forschungsgemeinschaft (DFG) grants EXC307 (CIN-Interdisciplinary Centre for Integrative Neuroscience) and INST 37/967-1 FUGG, as well as a Juniorprofessor programme grant from the Ministry of Science, Research, and the Arts of the State of Baden-Württemberg (MWK).
Author information
Contributions
A.B.A. conceived the software and experiments and wrote a precursory version of the software. F.A.D., C.L., and A.B.A. wrote the code for ZebEyeTrack. F.A.D. and A.v.D. ensured compatibility across operating systems. A.v.D. set up software repositories. F.A.D. and A.B.A. wrote the manuscript. F.A.D. designed the user interface, performed experiments, and created the figures.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Related links
Functional architecture of an optic flow-responsive area that drives horizontal eye movements in zebrafish: https://doi.org/10.1016/j.neuron.2014.02.043
Optogenetic localization and genetic perturbation of saccade-generating neurons in zebrafish: https://doi.org/10.1523/JNEUROSCI.5193-09.2010
Integrated supplementary information
Supplementary Figure 1 Detection of tail motion events, e.g., to trigger optogenetic stimulation.
Top left: Users define an ROI within live or recorded video, through which they expect the tail to pass. Top right: For each frame, ZebEyeTrack computes the median intensity of pixels within this ROI. When the median intensity or its rate of change crosses a user-defined threshold, as happens when the tail moves through the ROI, an event is detected. Bottom left: Users can choose between types of threshold and set the threshold value. Detected events then trigger the desired response: If the user has disabled analog output, none is generated, but the tail motion event is still recorded to the eye-tracking data file, as nonzero values are written to the columns representing analog output (e.g., columns 2–4 if a single fish is tracked). If the user has enabled analog output, it is generated in the same way as for saccade-triggered events. Both types of triggers are allowed in parallel. Bottom right: If desired, the time course of ROI pixel intensity can be included in an additional, second-to-last column of the main eye-tracking data file. Appropriate regulatory board permission was obtained before our zebrafish experiments (Reagent setup).
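For illustration only, the ROI-median threshold logic described in this legend could be sketched as follows (this is not the LabVIEW implementation; the function name, ROI format, and threshold modes are assumptions):

```python
import numpy as np

def detect_tail_events(frames, roi, thresh, mode="absolute"):
    """Return indices of frames on which the ROI median crosses the threshold.

    frames : iterable of 2-D grayscale images
    roi    : (row_start, row_stop, col_start, col_stop) rectangle the tail crosses
    thresh : threshold on the median ROI intensity ("absolute") or on its
             absolute frame-to-frame change ("rate")
    """
    r0, r1, c0, c1 = roi
    medians = np.array([np.median(f[r0:r1, c0:c1]) for f in frames])
    if mode == "absolute":
        signal = medians
    else:  # "rate": rate of change of the median intensity
        signal = np.abs(np.diff(medians, prepend=medians[0]))
    above = signal > thresh
    # an event is a crossing of the threshold in either direction
    return np.flatnonzero(np.diff(above.astype(int)) != 0) + 1
```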
Supplementary Figure 2 Setup for an oculomotor experiment using two-photon microscopy.
(a) Overview of the setup. Zebrafish larvae are placed under the objective of a moveable-objective microscope, into the center of a stimulus arena made up of two pairs of custom LED arrays. These arrays consist of red LEDs arranged in columns and covered by diffusers to create a more homogeneous bar stimulus. The eyes are detected by a CCD camera mounted below the LED arena. (b) Optical path and data flow. Central column: Excitation light from a two-photon laser is coupled into the microscope and focused onto the fish brain, scanning across a defined area and exciting the fluorescent calcium indicators expressed in the fish. Fluorescence is detected by one photomultiplier each for red and green wavelengths. Images are reconstructed from green light. The fish is further illuminated by infrared (IR) light detected by a CCD camera below. This light is emitted by two light sources: a central IR LED coupled into the optical path of the two-photon laser by a dichroic mirror, and a peripheral IR LED ring with a central aperture for laser light to pass through. Bottom right: columns of individual LEDs make up the stimulus arena. Each column is assigned to one of four groups, and all LEDs in one group are either on or off at the same time, generating a moving-bar stimulus. Left column: fish eyes are detected as the darkest contiguous video pixels recorded by the CCD camera, and their orientation is tracked (Fig. 5) and correlated with the scanning mirror position of the two-photon microscope. Parts of this figure were adapted with permission from Bastian Hablitzel's diploma thesis (ref. 37). Appropriate regulatory board permission was obtained before our zebrafish experiments.
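The darkest-contiguous-pixels idea mentioned in this legend can be illustrated with a short sketch (an assumption-laden illustration rather than ZebEyeTrack's actual detection code; the quantile threshold and blob-moment orientation are placeholder choices):

```python
import numpy as np
from scipy import ndimage

def eye_angles(frame, dark_frac=0.05, n_eyes=2):
    """Return orientation angles (degrees, image coordinates) of the largest dark blobs.

    frame     : 2-D grayscale image with dark eyes on a brighter background
    dark_frac : fraction of darkest pixels treated as candidate eye pixels (assumed)
    """
    thresh = np.quantile(frame, dark_frac)          # keep only the darkest pixels
    labels, n = ndimage.label(frame <= thresh)      # contiguous dark regions
    if n == 0:
        return []
    sizes = np.bincount(labels.ravel())[1:]         # pixel count per labeled blob
    largest = np.argsort(sizes)[::-1][:n_eyes] + 1  # labels of the biggest blobs
    angles = []
    for lab in largest:
        ys, xs = np.nonzero(labels == lab)
        xs, ys = xs - xs.mean(), ys - ys.mean()
        # long-axis orientation from second-order image moments
        theta = 0.5 * np.arctan2(2.0 * (xs * ys).mean(),
                                 (xs ** 2).mean() - (ys ** 2).mean())
        angles.append(np.degrees(theta))
    return angles
```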
Supplementary information
Supplementary Information
Supplementary Figures 1 and 2, Supplementary Tables 1 and 2, and Supplementary Manuals 1–3
Supplementary Data 1
Eye-tracking data file acquired from Supplementary Video 1
Supplementary Data 2
Eye-tracking data file acquired from Supplementary Video 2
Supplementary Data 3
LED line scan stimulus protocol file
Supplementary Data 4
LED line scan stimulus protocol file
Supplementary Data 5
Psychtoolbox stimulus protocol file. This stimulus protocol was used to stimulate the fish tracked for Fig. 8c
Supplementary Data 6
Eye-tracking data file covering multiple stimulus phases, to test post hoc analysis. The stimulus protocol used was similar to the one in Supplementary Data 5. This data file contains accurate tracking data for one fish (fish no. 1), and inaccurate data for five others (e.g., because of ill-defined body axes, faulty embedding, or other experimental caveats)
Supplementary Data 7
Expected main result file from post hoc analysis of Supplementary Data 6 (cf. Table 2)
Supplementary Data 8
Expected single-phase result file from post hoc analysis of Supplementary Data 6 (cf. Table 3)
Supplementary Data 9
Expected single-phase result file from post hoc analysis of Supplementary Data 6 (cf. Table 3)
Supplementary Video 1
Recording of a single embedded larva observing visual stimulation via Psychtoolbox stimulus screens. The fish initially displays two spontaneous binocular saccades. After 20 s, visual stimulation with moving bars is presented, and the fish exhibits OKR. The video is Motion JPEG compressed. The uncompressed video is available via http://www.zebeyetrack.org/videos/video_1.avi
Supplementary Video 2
Recording of a single embedded larva observing visual stimulation via Psychtoolbox stimulus screens. Spontaneous eye movement only. The video is Motion JPEG compressed. The uncompressed video is available via http://www.zebeyetrack.org/videos/video_2.avi
About this article
Cite this article
Dehmelt, F.A., von Daranyi, A., Leyden, C. et al. Evoking and tracking zebrafish eye movement in multiple larvae with ZebEyeTrack. Nat Protoc 13, 1539–1568 (2018). https://doi.org/10.1038/s41596-018-0002-0