Brief Communication

Dexterous robotic manipulation of alert adult Drosophila for high-content experimentation

Nature Methods volume 12, pages 657–660 (2015)

Abstract

We present a robot that enables high-content studies of alert adult Drosophila by combining operations including gentle picking; translations and rotations; characterization of fly phenotypes and behaviors; microdissection; and release. To illustrate, we assessed fly morphology, tracked odor-evoked locomotion, sorted flies by sex, and dissected the cuticle to image neural activity. The robot's tireless capacity for precise manipulations enables a scalable platform for screening flies' complex attributes and behavioral patterns.



Acknowledgements

We thank T.R. Clandinin, L. Liang, L. Luo, G. Dietzl, S. Sinha and T.M. Baer for advice; A. Janjua and J. Li for technical assistance; and D. Petrov and A. Bergland for providing inbred fly lines. We thank the W.M. Keck Foundation, the Stanford Bio-X program, a US National Institutes of Health Director's Pioneer Award for research funding (M.J.S.), and the Stanford-NIBIB Training program in Biomedical Imaging Instrumentation (J.R.M.).

Author information

Author notes

    • Joan Savall
    •  & Eric Tatt Wei Ho

    These authors contributed equally to this work.

Affiliations

  1. James H. Clark Center, Stanford University, Stanford, California, USA.

    • Joan Savall
    • , Eric Tatt Wei Ho
    • , Cheng Huang
    • , Jessica R Maxey
    •  & Mark J Schnitzer
  2. Howard Hughes Medical Institute, Stanford University, Stanford, California, USA.

    • Joan Savall
    •  & Mark J Schnitzer
  3. CNC Program, Stanford University, Stanford, California, USA.

    • Joan Savall
    • , Jessica R Maxey
    •  & Mark J Schnitzer
  4. Centre for Intelligent Signal and Imaging Research, Universiti Teknologi Petronas, Perak, Malaysia.

    • Eric Tatt Wei Ho


Contributions

J.S. and E.T.W.H. contributed equally; they developed all hardware and software for the robotic system and performed all experiments. C.H. performed and analyzed the odor-evoked trackball experiments. J.R.M. developed and performed the machine-vision analyses of head and body areas. M.J.S. conceived of and supervised the research program. All authors wrote the paper.

Competing interests

The authors declare no competing financial interests.

Corresponding authors

Correspondence to Joan Savall or Mark J Schnitzer.


Supplementary information

PDF files

  1.

    Supplementary Text and Figures

    Supplementary Figures 1–14 and Supplementary Tables 1 and 2

Videos

  1.

    The robot tracks and captures a fly under infrared illumination using real-time machine vision guidance.

    Automated tracking and picking of an active fly from the picking platform, played back at real speed. Video 2 shows the same events in slow motion.

  2.

    Slow motion video of the robot tracking and picking a fly.

    The same events as in Video 1, but played back at 20× slower speed. To search for the ring reflection pattern on the fly thorax while tracking the fly, the robot turns on the ring of infrared LEDs before acquiring an image via the onboard camera. The robot picks up the fly by touching the picking effector to the reflected ring target on the thorax (a minimal detection sketch follows this list).

  3.

    Flies emerging onto the picking platform.

    The picking platform is a key tool to enhance the throughput of automated handling. Alert flies that have never been anesthetized can rapidly populate the platform. These flies emerge through an opening in the center of the platform from a standard vial attached to the platform's underside.

  4.

    The robot captures three flies and transfers them to head holders.

    After picking each fly, the robot carried it to an inspection camera and obtained the location of the neck apse. The robot used this information to align and tether the head of the fly to the holder (a coordinate-mapping sketch follows this list). Because the head holders hold flies by suction, we could release all three flies after the experiment.

  5.

    High-magnification inspection of a picked fly.

    The robot rotates a fly over 360° for high-magnification inspection at various yaw angles.

  6.

    Rapid robotic manipulation of alert flies.

    The robot rapidly transfers flies back and forth between the two halves of a divided platform, demonstrating high-speed handling of flies.

  7.

    The robot rapidly picks and releases flies.

    The robot can pick and release individual flies multiple times without harming them.

  8.

    The robot discriminates flies by sex with 99% accuracy.

    To perform the discrimination, the robot picked individual flies and brought them to a high-magnification inspection camera. The system puffed air beneath the picked fly to induce flight, so that the wings did not occlude the abdomen. A machine-vision algorithm examined the fly as it rotated to find the best view of the abdomen and then determined the fly's sex from an image of the abdomen (Supplementary Fig. 5 and Supplementary Table 2; a toy classifier sketch follows this list).

  9.

    The robot holds a fly as it walks on an air-suspended ball for studies of odor-evoked locomotion.

    We recorded the forward, lateral, and rotational components of the locomotor patterns in response to odor stimuli, which we delivered through a pipette directed toward the fly's head (a decomposition sketch follows this list).

  10.

    Mechanical microsurgery of the fly cuticle, to expose the brain for in vivo imaging.

    Using a three-dimensional translation stage, we maneuvered the head-fixed fly to the end mill and made an initial cut to open the cuticle. Thereafter, we commanded the stage to continue cutting along a preprogrammed trajectory (a trajectory-stepping sketch follows this list). Saline immersion kept the brain hydrated and prevented tissue debris from interfering with the surgery.
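
The paper does not include source code, but the tracking step in Videos 1 and 2 (locating the ring of infrared light reflected from the thorax) lends itself to a short illustration. Below is a minimal Python sketch, assuming OpenCV's Hough circle transform as the detector; the function name and every parameter value are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): locating a ring-shaped IR
# reflection on the fly thorax with OpenCV's Hough circle transform.
# All parameter values are illustrative assumptions.
import cv2

def find_ring_target(frame_gray, min_radius=5, max_radius=30):
    """Return (x, y, r) of the most salient ring reflection, or None."""
    # Smooth to suppress sensor noise before edge-based voting.
    blurred = cv2.GaussianBlur(frame_gray, (5, 5), 1.5)
    circles = cv2.HoughCircles(
        blurred,
        cv2.HOUGH_GRADIENT,
        dp=1.2,              # inverse accumulator resolution
        minDist=40,          # assume at most one ring per frame region
        param1=120,          # Canny high threshold
        param2=25,           # accumulator threshold; lower = more permissive
        minRadius=min_radius,
        maxRadius=max_radius,
    )
    if circles is None:
        return None
    x, y, r = circles[0][0]  # strongest candidate first
    return float(x), float(y), float(r)
```

In a closed-loop tracker, the returned (x, y) would drive the picking effector toward the thorax on each frame.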
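For the head-tethering step in Video 4, one plausible way to turn an inspection-camera detection (such as the neck location) into a robot move is a calibrated pixel-to-stage transform. The sketch below fits a least-squares affine map from hypothetical fiducial correspondences; all coordinates and names are assumptions for illustration.

```python
# Minimal sketch (not the authors' code): mapping inspection-camera pixel
# coordinates into stage coordinates via a calibrated affine transform.
import numpy as np

def fit_pixel_to_stage(pixels, stage_xy):
    """Least-squares affine map from Nx2 pixel coords to Nx2 stage coords (mm)."""
    A = np.hstack([pixels, np.ones((len(pixels), 1))])  # rows are [u, v, 1]
    coeffs, *_ = np.linalg.lstsq(A, stage_xy, rcond=None)
    return coeffs                                       # 3x2 affine parameters

def pixel_to_stage(coeffs, uv):
    u, v = uv
    return np.array([u, v, 1.0]) @ coeffs

# Hypothetical calibration: fiducial pixel positions and the measured
# stage coordinates (mm) at which each fiducial is reached.
pixels = np.array([[100.0, 120.0], [520.0, 118.0], [305.0, 400.0]])
stage = np.array([[0.0, 0.0], [4.2, 0.0], [2.1, 2.8]])
coeffs = fit_pixel_to_stage(pixels, stage)
neck_xy_mm = pixel_to_stage(coeffs, (312.0, 250.0))     # detected neck position
```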
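The sex classifier used for Video 8 is specified in Supplementary Fig. 5 and Supplementary Table 2, which are not reproduced here. As a toy stand-in only: male D. melanogaster abdomens are more darkly pigmented toward the tip, so the dark-pixel fraction in the posterior half of an abdomen image gives a crude discriminant. The thresholds below are illustrative assumptions, not the authors' parameters.

```python
# Toy sketch (not the authors' classifier): crude sex call based on the
# darker posterior pigmentation of male abdomens. Thresholds are assumptions.
import numpy as np

def classify_sex(abdomen_gray, dark_thresh=70, male_fraction=0.45):
    """abdomen_gray: 2D uint8 image with the abdomen tip in the bottom rows."""
    posterior = abdomen_gray[abdomen_gray.shape[0] // 2:, :]  # rear half
    dark_fraction = np.mean(posterior < dark_thresh)          # fraction of dark pixels
    return "male" if dark_fraction > male_fraction else "female"
```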
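For the trackball recordings in Video 9, a common way to obtain forward, lateral, and rotational components is to read two optical sensors aimed at the ball's equator 90° apart. The sketch below implements that standard decomposition under an assumed geometry and sensor resolution; the paper's actual acquisition pipeline may differ.

```python
# Minimal sketch (assumed two-sensor geometry, not the authors' code):
# decomposing ball motion into forward, lateral, and yaw components.
import math

BALL_RADIUS_MM = 3.0     # assumed ball radius
COUNTS_PER_MM = 400.0    # assumed optical-sensor resolution

def decompose(dx1, dy1, dx2, dy2, dt):
    """Sensor 1 views the ball from behind the fly, sensor 2 from the side.
    Inputs are per-frame displacement counts; returns
    (forward mm/s, lateral mm/s, yaw deg/s)."""
    forward = (dy1 / COUNTS_PER_MM) / dt   # ball pitch = forward walking
    lateral = (dy2 / COUNTS_PER_MM) / dt   # ball roll = sideways walking
    # When the ball yaws, both sensors see the same horizontal surface
    # motion; average their x readings and convert arc length to an angle.
    arc_mm = ((dx1 + dx2) / 2.0) / COUNTS_PER_MM
    yaw = math.degrees(arc_mm / BALL_RADIUS_MM) / dt
    return forward, lateral, yaw
```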
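Finally, for the preprogrammed cutting trajectory in Video 10, the sketch below steps a three-axis stage through waypoints at an approximately fixed feed rate. The move_to callable is a hypothetical stage interface, and the feed and step sizes are assumed values, not the authors' surgical parameters.

```python
# Minimal sketch (hypothetical stage interface, not the authors' code):
# stepping a 3-axis stage along a preprogrammed cutting trajectory.
import time
import numpy as np

def run_trajectory(move_to, waypoints_mm, feed_mm_s=0.05, step_mm=0.005):
    """move_to: callable (x, y, z) commanding the stage (hypothetical API).
    waypoints_mm: sequence of (x, y, z) positions along the planned cut."""
    pos = np.asarray(waypoints_mm[0], dtype=float)
    move_to(*pos)                                    # go to the first cut point
    for target in waypoints_mm[1:]:
        target = np.asarray(target, dtype=float)
        n_steps = max(1, int(np.ceil(np.linalg.norm(target - pos) / step_mm)))
        for i in range(1, n_steps + 1):
            move_to(*(pos + (target - pos) * (i / n_steps)))
            time.sleep(step_mm / feed_mm_s)          # hold an approximate feed rate
        pos = target
```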

About this article

DOI: https://doi.org/10.1038/nmeth.3410
