Internal models direct dragonfly interception steering


Sensorimotor control in vertebrates relies on internal models. When extending an arm to reach for an object, the brain uses predictive models of both limb dynamics and target properties. Whether invertebrates use such models remains unclear. Here we examine to what extent prey interception by dragonflies (Plathemis lydia), a behaviour analogous to targeted reaching, requires internal models. By simultaneously tracking the position and orientation of a dragonfly’s head and body during flight, we provide evidence that interception steering is driven by forward and inverse models of dragonfly body dynamics and by models of prey motion. Predictive rotations of the dragonfly’s head continuously track the prey’s angular position. The head–body angles established by prey tracking appear to guide systematic rotations of the dragonfly’s body to align it with the prey’s flight path. Model-driven control thus underlies the bulk of interception steering manoeuvres, while vision is used for reactions to unexpected prey movements. These findings illuminate the computational sophistication with which insects construct behaviour.


Figure 1: Parallel navigation and reactive control are insufficient to describe interception steering.
Figure 2: Free-flight measurement of head and body orientation during prey interception.
Figure 3: Body dynamics constrain the interception steering strategy.
Figure 4: Decomposition of prey-image drift into its sources reveals cancellation.
Figure 5: Head movements predictively compensate for disturbances to foveation.





Acknowledgements

We thank I. Siwanowicz for advice on neck joint anatomy, and J. Melfi for assistance with kinematic computations. J. Osborne and J. Jordan provided assistance with the retroreflector assembly and artificial prey delivery systems. M. Barbic provided guidance on retroreflector mirroring. D. Parks and the Janelia vivarium provided dragonfly husbandry. We are grateful to V. Jayaraman, A. Karpova, W. Denk and J. Wang for discussions and comments on the manuscript. This work was supported by the Howard Hughes Medical Institute. Additional support to R.O. from AFOSR FA9550-10-1-0472.

Author information

Contributions



A.L., M.M. and H.-T.L. designed the study. A.L. and R.O. designed the flight arena and Photron system. A.L. and H.-T.L. designed the motion-capture system. E.I. and P.H. collected the Photron data. H.-T.L. instrumented the artificial prey system and collected the motion-capture data. M.M. and H.-T.L. designed the pre-processing algorithms. M.M. analysed the data with input from A.L. A.L. and M.M. wrote the manuscript with input from H.-T.L. and R.O.

Corresponding author

Correspondence to Anthony Leonardo.

Ethics declarations

Competing interests

The authors declare no competing financial interests.

Extended data figures and tables

Extended Data Figure 1 Indoor flight arena enables recording of dragonfly foraging behaviour.

Flight arena is a 5.5 m × 4.3 m × 4.6 m room with controlled lighting (10 mW cm−2, 220 Hz illumination), heating (31 °C) and humidity (55%), comparable to outdoor summer conditions. Naturalistic images on the walls provide dragonflies with the optic flow needed for flight stability. a–c, A raised platform (0.6 m × 0.6 m) (a), used by the dragonflies as a foraging perch, is placed at the centre of the room, within the filming volume of two high-speed cameras (b, Photron SA1, 1,000 fps, 150 μm camera residuals) and 18 infrared motion-capture cameras (c, Motion Analysis Corp. Raptor-4, 200 fps, 150 μm camera residuals). Insect prey (Drosophila virilis) are stocked in the arena and attracted to the platform by fly food dishes (not shown). d, An artificial prey delivery system is installed above the platform; this comprises a computer-controlled, height-adjustable pulley system (dashed yellow line) which moves a 2-mm retroreflective bead on transparent fishing line. A netted enclosure (1.5 m × 1.5 m × 2 m) keeps the instrumented dragonflies in the filming area during experiments.

Extended Data Figure 2 Classical reactive steering versus model-driven interception steering in the dragonfly.

a, Classical reactive steering: prey-image drift directly drives steering commands to maintain a desired strategy, such as parallel navigation (Fig. 1a). Dragonfly motion and prey motion alter prey angular velocity and position, thereby driving the next round of steering. b, Conceptual model summarizing the proposed control architecture used by the dragonfly—a predictive steering pathway (red) and a visually driven reactive steering pathway (blue) are needed to explain our results. Steering of the body is driven by prey position, prey angular velocity and the orientation of the body relative to the zenith (Fig. 3). We conjecture that these variables (the current prey state) are compared to a desired prey state—prey overhead, with the dragonfly body axis aligned to the prey’s flight path—and the state error is used to produce a wing command that yields movement of the dragonfly’s body. An inverse model may be used to transform the desired dragonfly state (in sensory coordinates) into a motor command; future studies will be needed to confirm this. Head steering to stabilize the prey in the fovea (Fig. 4d) is driven by prey angular velocity and an efference copy of the wing command signal. These variables yield a forward-model prediction of expected foveal drift, which is passed through an inverse model to generate compensatory head rotations (Figs 4c and 5). Accurate foveation can be achieved in the absence of unexpected prey motion (for example, blue trial, Fig. 4d), eliminating prey-image drift on the head and hence activity of the visual reactive pathway. In this regime, only the predictive pathway drives dragonfly steering. The predicted prey position can be inferred from the head–body angles (encoded proprioceptively, or estimated via efference copy of neck commands), whereas prey angular velocity must be available as an internal state variable. When prey-image drift on the retina reveals unexpected prey motion, the reactive pathway introduces corrective steering of both body and head (Fig. 1e) and updates the estimate of prey angular velocity.
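The interplay of the two pathways can be illustrated with a toy one-dimensional simulation. The gains, the update rule and the plant below are illustrative assumptions, not quantities fitted in the study; the sketch only shows how a predictive pathway keeps retinal drift at zero while prey motion matches the internal estimate, and how the reactive pathway engages after an unexpected speed change.

```python
def simulate_foveation(steps=300, dt=0.005, prey_rate=2.0,
                       jump_at=150, rate_jump=3.0):
    """Toy 1-D sketch of the predictive/reactive control architecture.

    The prey drifts at a constant angular rate, then jumps in speed.
    The predictive pathway rotates the head from an internal estimate of
    prey angular velocity; the reactive pathway responds only to residual
    retinal drift (the prediction error) and also updates the estimate.
    All gains are illustrative assumptions.
    """
    prey = head = 0.0
    est_rate = prey_rate          # internal state: estimated prey angular rate
    drift_log = []
    for t in range(steps):
        rate = prey_rate + (rate_jump if t >= jump_at else 0.0)
        prey += rate * dt
        # Predictive pathway: a forward model anticipates foveal drift and
        # an inverse model converts it into a compensatory head rotation.
        head += est_rate * dt
        # Reactive pathway: residual retinal drift drives a corrective
        # rotation and updates the estimate of prey angular velocity.
        drift = prey - head
        head += 0.5 * drift
        est_rate += 4.0 * drift
        drift_log.append(abs(drift))
    return drift_log
```

While the internal estimate is exact, retinal drift stays at zero and only the predictive pathway is active; after the speed jump, drift appears and then decays as the reactive pathway corrects the head and re-tunes the estimate.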

Extended Data Figure 3 Definition of anatomically based head and body reference frames.

a, Position of the retroreflective markers is mapped to stereotypical anatomical features, which are used along with the numerically estimated head joint to define anatomical reference frames. b, Anatomical features include three points on the thorax (leftmost and rightmost tips of dorsal carina of episternum, painted in red for ease of identification, and centre of hind wing scutellum) and three points on the back of the head (upper tip of midline and two laterally symmetrical points, for example, the tips of yellow coloured bands underneath the head markers). c, A preliminary body frame is defined based on the triangle formed by the three anatomical points on the thorax, with its origin at the rear vertex of the triangle. From this vertex, the roll axis (xB) is defined as the direction to the midpoint between the other two vertices, the yaw axis (zB) as the direction orthogonal to the triangle, and the pitch axis (yB) as the direction orthogonal to the other two. A preliminary head frame, with origin at the head joint, is defined based on the head anatomical features. From the joint, the yaw axis (zH) is the direction to the top anatomical point. The pitch axis (yH) is the direction pointing from the right to the left anatomical points on the head, orthogonalized relative to the yaw axis. Finally, the roll axis (xH) is orthogonal to the other two. In the schematic, the black solid contour on the head denotes its dorsal part, defined by this head frame. d, The final body frame is obtained by applying a 25° pitch correction to the preliminary one, to compensate for the natural upward slope of the anatomical triangle. The corrected body axis roughly corresponds to the thorax–abdomen line (Fig. 2c). The final (‘foveal’) head frame is instead obtained by applying a pitch correction to the preliminary one so that the roll axis matches the average prey-image elevation during pursuit. This provides an estimate for the centre of the high-acuity region of the eye, and varies slightly across animals (38° average).
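The body-frame construction can be sketched directly from the three thorax points. The function below is a minimal illustration; the example coordinates and the sign of the pitch correction are our assumptions, not values from the paper.

```python
import numpy as np

def body_frame(rear, left_tip, right_tip, pitch_correction_deg=25.0):
    """Build the body reference frame of Extended Data Fig. 3c, d.

    rear: centre of the hindwing scutellum (frame origin).
    left_tip, right_tip: tips of the dorsal carina of the episternum.
    Returns a 3x3 matrix whose rows are the roll (xB), pitch (yB) and
    yaw (zB) unit vectors after the pitch correction.
    """
    rear, left_tip, right_tip = (np.asarray(p, float)
                                 for p in (rear, left_tip, right_tip))
    # Roll axis: from the rear vertex to the midpoint of the front edge.
    x = 0.5 * (left_tip + right_tip) - rear
    x /= np.linalg.norm(x)
    # Yaw axis: normal to the anatomical triangle.
    z = np.cross(right_tip - rear, left_tip - rear)
    z /= np.linalg.norm(z)
    # Pitch axis: orthogonal to the other two (right-handed frame).
    y = np.cross(z, x)
    # Pitch the preliminary frame by 25 deg about the pitch axis to align
    # the roll axis with the thorax-abdomen line (rotation sense assumed).
    a = np.deg2rad(pitch_correction_deg)
    x_corr = np.cos(a) * x - np.sin(a) * z
    z_corr = np.sin(a) * x + np.cos(a) * z
    return np.vstack([x_corr, y, z_corr])
```

Because the correction is a pure rotation about the pitch axis, the returned frame remains orthonormal for any correction angle.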

Extended Data Figure 4 Analytical decomposition of prey-image drift on the head.

The angular position of the prey image on the head is obtained by projecting the range vector (r) on the head reference frame (xH, yH, zH); we define the azimuthal position as θprey, azim and the elevational position as θprey, elev. Prey-image drift (‘drift’, with rates given by the time derivatives of θprey, azim and θprey, elev) can be induced by changes in the orientation of the head, and by changes in the direction of the range vector (caused by translation of the prey relative to the dragonfly). Changes in head orientation can be quantified as the sum of the absolute angular velocity of the body (ωB, abs, ‘body rotation’) and the relative angular velocity of the head with respect to the body (ωH, rel, ‘head rotation’). The contribution of relative translation to the drift rate equals the component of the difference between prey (vprey) and dragonfly (vdf) velocities perpendicular to the range vector, divided by the distance to the prey. The relevant dragonfly velocity for foveation is that at the origin of the head frame; for graphical clarity, we have shown the velocity at the origin of the body reference frame (xB, yB, zB) instead. The notation a_b stands for the projection of vector a in the direction of vector b. Prey movement in the head frame (xH, yH, zH) (Fig. 4) will generally produce motion simultaneously along all three axes. We analysed foveation within a coordinate system aligned with the range vector (Fig. 5) because it decouples the directions functionally relevant for foveation: rotations about one axis transverse to the range vector produce purely elevational drift, whereas rotations about the other transverse axis produce purely azimuthal drift. Translations in the range vector direction and rotations about that axis only change the angular size and orientation of the prey on the eye but cause no drift in position.
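The decomposition can be written compactly with angular-velocity vectors. The sketch below is a minimal illustration; the sign convention (head rotation toward the prey reduces drift) is our assumption.

```python
import numpy as np

def drift_sources(r, v_df, v_prey, w_body_abs, w_head_rel):
    """Three sources of prey-image drift (after Extended Data Fig. 4).

    r: range vector from the head-frame origin to the prey.
    v_df, v_prey: dragonfly and prey translational velocities.
    w_body_abs: absolute angular velocity of the body ('body rotation').
    w_head_rel: angular velocity of the head relative to the body.
    Returns (translation, body-rotation, head-rotation) terms as
    angular-velocity vectors; their sum is the net rotation of the
    line of sight as seen from the head.
    """
    r = np.asarray(r, float)
    v_rel = np.asarray(v_prey, float) - np.asarray(v_df, float)
    # Only the component of v_rel perpendicular to r rotates the line of
    # sight, at a rate |v_rel_perp| / |r|; the cross product captures both
    # the magnitude and the rotation axis.
    w_translation = np.cross(r, v_rel) / (r @ r)
    # Rotating the head (body rotation plus head-on-body rotation) shifts
    # the prey image in the opposite sense on the eye.
    return (w_translation,
            -np.asarray(w_body_abs, float),
            -np.asarray(w_head_rel, float))
```

A head rotation exactly matching the line-of-sight rotation makes the three terms sum to zero, i.e. perfect foveation.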

Extended Data Figure 5 Dragonfly head movements compensate for both dragonfly translation and prey translation.

a–f, Scatter plots of residual head rotation against individual prey and dragonfly translational components, and against their sum (21 constant-speed artificial prey flights). Cancellation of dragonfly translation alone (a, azimuth: slope m = −0.38 ± 0.07, r2 = 0.03; b, elevation: m = −0.21 ± 0.04, r2 = 0.02) and prey translation alone (c, azimuth: m = −0.60 ± 0.04, r2 = 0.14; d, elevation: m = −0.02 ± 0.04, r2 < 0.01) is substantially weaker than the cancellation of the relative prey translation (e, azimuth: m = −1.17 ± 0.05, r2 = 0.38; f, elevation: m = −1.05 ± 0.08, r2 = 0.14).
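The logic of this test can be reproduced with an ordinary least-squares fit: if head rotations cancel the summed relative-translation disturbance, regressing residual head rotation on the disturbance yields a slope near −1. The synthetic numbers below are illustrative stand-ins, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic disturbance (deg/s) and an imperfectly cancelling head
# response; magnitudes are illustrative assumptions.
disturbance = rng.normal(0.0, 100.0, 500)
head_residual = -disturbance + rng.normal(0.0, 30.0, 500)
# First-degree polynomial fit returns (slope, intercept).
slope, intercept = np.polyfit(disturbance, head_residual, 1)
```

With cancellation this tight, the fitted slope lands close to −1 and the intercept near zero; a slope far from −1 (as for the individual components in panels a–d) indicates that the head is not compensating that source alone.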

Supplementary information

Interception flight to Drosophila with successful capture.

Filmed at 1000 fps (1/2000 shutter), replayed at 20x slower speed. (MOV 5528 kb)

Interception flight to Drosophila with unsuccessful capture

Filmed at 1000 fps (1/2000 shutter), replayed at 20x slower speed. (MOV 9170 kb)

Interception flight to artificial prey, side view

Dragonfly is loaded with head and body markers. The pre-takeoff saccadic head motions foveate the prey, and are used to define the neck joint (see Methods). Filmed at 1000 fps, replayed at 20x slower speed. (MOV 10352 kb)

Interception flight to artificial prey, top view

Prey starts at middle top of image and moves right and down before being captured. Filmed at 250 fps and replayed at 40x slower speed. (MP4 17295 kb)

Head and body steering during interception flight with an orienting turn.

Dragonfly is shown in orange, body axis in blue, direction of gaze (head roll axis) in green. The prey is shown in gray. Black lines show the range vector between dragonfly and prey in 50 ms steps and at capture. During the body orienting turn (t = ~200 ms), foveation is maintained despite apparent prey image drift of up to 1000°/s (green and black lines are aligned). At t = ~300 ms, the turn is completed and dragonfly is directly below the prey with its body axis and flight path locked to the prey’s direction of motion. Data from Figure 2d. (MOV 18199 kb)

Head and body steering during interception of prey with a speed jump

The black circle represents the virtual position of the prey had its speed remained constant; the dashed black line is the range vector from the dragonfly to this virtual position. The prey speed changes about 200 ms after takeoff, causing foveation to be lost and then recovered. No sharp course corrections are seen in the dragonfly flight path, suggesting they are subtle gestures integrated into ongoing steering. Data from Figure 2e. (MOV 821 kb)

Interaction of head, body, and relative prey motion on foveation

Sequential animation showing apparent prey image drift induced by each component and then their summation. (5 ms time steps; see Figure 4, blue traces). (MOV 6915 kb)

Head movements predictively compensate for summed foveation disturbances

Actual prey trajectory on the head compared with that from the summed foveation disturbances (body rotation and relative prey translation) and head rotations, for a constant speed artificial prey interception (5 ms time steps; see Figure 4, blue traces). (MOV 516 kb)

Head movements predictively compensate for foveation disturbances from relative prey translation

Actual prey trajectory on the head compared to that from relative prey translation and residual head rotations for a constant speed artificial prey interception (5 ms time steps; see Figure 4, blue traces). (MOV 471 kb)


About this article


Cite this article

Mischiati, M., Lin, HT., Herold, P. et al. Internal models direct dragonfly interception steering. Nature 517, 333–338 (2015).

