Dexterous robotic manipulation of alert adult Drosophila for high-content experimentation

Abstract

We present a robot that enables high-content studies of alert adult Drosophila by combining operations including gentle picking; translations and rotations; characterizations of fly phenotypes and behaviors; microdissection; and release. To illustrate, we assessed fly morphology, tracked odor-evoked locomotion, sorted flies by sex, and dissected the cuticle to image neural activity. The robot's tireless capacity for precise manipulations enables a scalable platform for screening flies' complex attributes and behavioral patterns.

Figure 1: A high-speed robot that uses real-time machine vision guidance to identify, capture and manipulate nonanesthetized adult flies.
Figure 2: Machine vision–based assessments of fly phenotypes.
Figure 3: Automated assessments of sensory-evoked behavioral responses, programmable microsurgery and two-photon imaging of olfactory neural dynamics.


Acknowledgements

We thank T.R. Clandinin, L. Liang, L. Luo, G. Dietzl, S. Sinha and T.M. Baer for advice; A. Janjua and J. Li for technical assistance; and D. Petrov and A. Bergland for providing inbred fly lines. We thank the W.M. Keck Foundation, the Stanford Bio-X program, a US National Institutes of Health Director's Pioneer Award for research funding (M.J.S.), and the Stanford-NIBIB Training program in Biomedical Imaging Instrumentation (J.R.M.).

Author information

Affiliations

Authors

Contributions

J.S. and E.T.W.H. contributed equally and innovated all hardware and software for the robotic system and performed all experiments. C.H. performed and analyzed the odor-evoked trackball experiments. J.R.M. created and performed the machine vision analyses of head and body areas. M.J.S. conceived of and supervised the research program. All authors wrote the paper.

Corresponding authors

Correspondence to Joan Savall or Mark J Schnitzer.

Ethics declarations

Competing interests

The authors declare no competing financial interests.

Integrated supplementary information

Supplementary Figure 1 Photograph of the fly-handling robot.

A photograph of the robot, which has a footprint similar to that of a laptop computer. Scale bar is 2.5 cm.

Supplementary Figure 2 Signal flow for components of the robotic system.

Via USB communications and a microcontroller interface, a single computer controls the robot, four cameras (onboard, localization, inspection and microsurgery), the fly picking platform, and the inspection and microsurgery stations.

Supplementary Figure 3 (a) Software control flow for picking and head-fixing a fly.

Schematics of the main computer algorithm that directs all actions of the robotic system. The software uses information obtained from different readouts, such as pressure readings from the picking effector and the results of image-analysis algorithms, to determine the next step in the fly picking and handling process.

(b) Illustration of the basic steps involved in picking and then tethering a fly.

Supplementary Figure 4 A custom-designed loading chamber facilitates rapid entry of flies onto a picking stage.

(a) Photograph of a fly climbing onto the picking stage, acquired under infrared lighting. The other flies reached the stage just moments before (see also Video 3). Scale bar is 1 mm.

(b) Diagram of how the chamber works. After attachment of a standard vial of flies to the bottom of the platform, flies climb upward through the inner chamber due to negative geotaxis (Step 1). Flies emerge rapidly at the top (Step 2) (Video 3). Upon picking an individual fly, the robot has the option of releasing the fly into one of several sorting chutes (Step 3). These chutes can optionally route to separate chambers, as needed (Step 4). Scale bar is 5 mm.

(c) Photograph showing the robot holding a fly, moments after capturing it from the picking stage. Scale bar is 5 mm.

Supplementary Figure 5 Basic steps of the image analysis algorithm that localizes flies on the picking platform.

This algorithm uses image data from the localization camera and extracts the approximate positions of all flies on the picking platform.

(1) We first acquire a reference image, IR(x, y), of the platform without any flies. Later, with flies on the platform, we acquire another image, IF(x, y). Because the background illumination of the fly arena is non-uniform, we binarize the image data in a way that accounts for the variations in background intensity across the image; the parameter b in the binarization rule is chosen within 0.08–0.12, depending on details of the illumination conditions.

(2) For all pixels with binarized values of one, we transform their position coordinates from those of the camera pixels, (x, y), to coordinates in real space, (X, Y), by using a homographic matrix, H. The matrix H defines a linear transform that maps a two-dimensional plane with coordinate system (x, y) onto a second plane with coordinate system (X, Y). To solve for H, it suffices to obtain five points in the two planes that correspond to each other. We do this in a pre-calibration step in which we find (Xi, Yi) that correspond to (xi, yi) for i = 1, 2, …, 5.

(3) To identify individual flies, we apply a region-growing clustering algorithm to the binarized pixels, using their coordinates in real space. The algorithm examines each pixel's 8-connected neighborhood and grows a cluster only if the pixel being examined is adjacent to ≥2 unity-valued pixels. We designate candidate flies as those clusters with >50 pixels.

(4) The algorithm computes the approximate position of each candidate fly by taking the spatial centroid of all pixels within its cluster.
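Steps 3 and 4 can be sketched in code. The following is a minimal Python illustration, not the authors' implementation: it uses plain breadth-first region growing with 8-connectivity (the paper's variant additionally requires a pixel to adjoin ≥2 unity-valued pixels before growing a cluster), keeps clusters above a size threshold, and reports each cluster's centroid. The function name and grid representation are illustrative.

```python
from collections import deque

def find_fly_candidates(binary, min_cluster=50):
    """Cluster unity-valued pixels with 8-connectivity and return the
    centroid of each cluster larger than min_cluster pixels.
    Illustrative sketch; `binary` is a 2D list of 0/1 values."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r0 in range(rows):
        for c0 in range(cols):
            if binary[r0][c0] != 1 or seen[r0][c0]:
                continue
            # Grow one cluster by breadth-first search over 8-neighbors.
            queue, cluster = deque([(r0, c0)]), []
            seen[r0][c0] = True
            while queue:
                r, c = queue.popleft()
                cluster.append((r, c))
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < rows and 0 <= cc < cols
                                and binary[rr][cc] == 1 and not seen[rr][cc]):
                            seen[rr][cc] = True
                            queue.append((rr, cc))
            # Keep only clusters large enough to be candidate flies.
            if len(cluster) > min_cluster:
                cy = sum(p[0] for p in cluster) / len(cluster)
                cx = sum(p[1] for p in cluster) / len(cluster)
                centroids.append((cy, cx))
    return centroids
```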

Supplementary Figure 6 Basic steps of the algorithm that detects the ring reflection on the fly thorax, to target flies for picking.

This algorithm analyzes images obtained from the onboard camera during fly tracking to determine the centroid of the ring reflection, if present, and to estimate the fly's orientation. The robot first acquires an image with only the stationary infrared illumination coming from beneath the picking platform; in this image the fly is dark. If there is a fly in the first image, the robot acquires a second image with its onboard ring of LEDs turned on, which provides much brighter illumination than the stationary illuminator. From these two images, the algorithm extracts the ring reflection and the fly's orientation:

(1) To the first image the algorithm applies a rank filter, which replaces each pixel's intensity with the maximum intensity found within a local 12 × 12 window of pixels. This removes the grid pattern of the picking platform's wire mesh but retains pixels of the fly. The algorithm then binarizes the filtered image by selecting, as candidate fly pixels, all those with intensity less than a maximum threshold value. These binarized pixels are clustered with a region-growing algorithm similar to that of Step 3 of the localization algorithm in Supplementary Fig. 5. Clusters with >2880 pixels are designated as individual flies. If ≥2 flies are present, the computer selects the fly closest to the image center and computes its centroid (x0q, y0q).

(2) We define (xq, yq) as the pixels belonging to the selected fly; this cluster consists of Nq pixels. The algorithm estimates the orientation of the selected fly from the covariance matrix of these pixel coordinates. The computer computes the eigenvalues, λ1 and λ2, and the eigenvectors, R, of the covariance matrix C. Ordering λ1 as the largest eigenvalue, the algorithm computes the orientation of the fly, θ, from the matrix R, using the fact that R is also a rotation matrix.

(3, 4) To locate the center of the ring reflection, the algorithm compares an annular template (Fig. 1b) of 33 × 33 pixels, W(x, y), to the local image subfield of equal size centered at each of the pixels, (xq, yq), within a fly's cluster of pixels. For each such comparison, we binarize the image subfield by using its 80th-percentile intensity as a threshold. The inner product of the resultant binary window, T(x, y, i, j), with the template W(x, y) yields a comparison score, V(x, y).

(5) We designate all pixels with V(x, y) > 200 as candidate locations for the ring centroid. The algorithm takes the pixel with the highest score as the target on the fly thorax.
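The orientation estimate of Step 2 can be illustrated with the closed-form angle of the major axis of a 2 × 2 covariance matrix, which is equivalent to taking the eigenvector belonging to the largest eigenvalue. A minimal sketch, not the authors' implementation; the function name is illustrative.

```python
import math

def fly_orientation(xs, ys):
    """Body-axis angle (radians) of a pixel cluster, from the
    closed-form major-axis direction of its 2x2 covariance matrix.
    Equivalent to the eigenvector of the largest eigenvalue."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cxx = sum((x - mx) ** 2 for x in xs) / n
    cyy = sum((y - my) ** 2 for y in ys) / n
    cxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    # Standard closed form for the principal-axis angle of a 2x2
    # symmetric matrix [[cxx, cxy], [cxy, cyy]].
    return 0.5 * math.atan2(2 * cxy, cxx - cyy)
```

Note that the sign of an eigenvector is arbitrary, so this angle identifies the body axis only up to a 180° ambiguity; resolving head versus tail requires additional cues.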

Supplementary Figure 7 Algorithm to sort flies by sex.

Flowchart of the robot's protocol for determining a fly's sex:

(1) To induce the fly into flight, the robot brings it into a stream of air that pulses on and off at 2 Hz. The pulsations reliably induce flight, preventing the wings from occluding the inspection camera's view of the abdomen.

(2) An image-analysis algorithm segments the fly abdomen in an image acquired by the inspection camera. The camera position and infrared illumination are such that the picked fly appears darker than the surrounding bright background (Fig. 2a).

(3) To estimate the vectorial orientation of the segmented abdomen, the algorithm computes the slope of the line connecting the center of the picking effector to the posterior edge of the abdomen. The algorithm locates the fly posterior at the segmented pixel furthest from the picking effector, in the vectorial direction of the principal component of all abdomen pixels.

(4) The computer stores the value of the abdomen's orientation and then rotates the fly by 5°.

(5) The algorithm returns to Step 2 until the robot has examined the fly across a full 360°.

(6) The robot rotates the fly to the yaw angle that maximized the apparent vertical extent of the abdomen. This orientation gives the clearest view of the bands and segments on the abdomen.

(7) The algorithm obtains 100 equally spaced samples of the pixel intensity profile along the line that connects the posterior of the abdomen to the center of the picking effector.

(8) To ascertain the sex, the algorithm checks for >2 dark bands (indicative of a fly abdomen) and a dark-colored posterior segment (indicative of a male) (Fig. 2a). The algorithm identifies dark bands as local minima in the intensity profile found in Step 7 that are flanked by regions of sharply decreasing contrast from surrounding local maxima. To identify a male's dark posterior segment, the algorithm computes the integral and median ratios of the profile and applies a classification rule to these values.

Using this algorithm, the classification achieved 99 ± 1% (s.d.) accuracy on N = 108 flies (Supplementary Table 2), within ~20 ms of computation time per fly. The total time to pick and sort each fly was ~20 s.
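The band detection of Steps 7 and 8 can be sketched as follows. This is an illustrative simplification, not the authors' code: a fixed prominence threshold stands in for the paper's "sharply decreasing contrast" criterion, whose exact form is not reproduced here, and the threshold value is an assumption.

```python
def count_dark_bands(profile, prominence=30):
    """Count local minima in a 1D intensity profile that sit at least
    `prominence` grey levels below the highest intensity on each side.
    The prominence rule and its default value are illustrative."""
    bands = 0
    for i in range(1, len(profile) - 1):
        # A dark band candidate is a strict local minimum...
        if profile[i] < profile[i - 1] and profile[i] < profile[i + 1]:
            left = max(profile[:i])        # brightest value to the left
            right = max(profile[i + 1:])   # brightest value to the right
            # ...flanked by sufficiently brighter regions on both sides.
            if (left - profile[i] >= prominence
                    and right - profile[i] >= prominence):
                bands += 1
    return bands
```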

Supplementary Figure 8 Image processing steps for segmentation of the fly head and body.

This analysis yields the cross-sectional areas of the fly's head and body:

Step 1 applies a color threshold to isolate the fly from the background. If the green channel of a pixel is greater than the red channel plus ten, the pixel becomes part of the background. The algorithm also discards any background columns that do not contain useful information about the fly.

Step 2 uses a Hough transform to locate and then remove the straight edges of the picking effector. The Hough transform is useful for locating the effector, as the fly does not contain long, straight lines within its body or head. Because the effector can be slightly rotated, the transform searches for straight lines within an angular range of ±5°. The peak angular value yielded by the Hough transform corresponds to the angle of the two lines that outline the effector. All pixels within the bounds of these two lines then become part of the background.

Step 3 crops off the legs and wings by removing all image regions less than a minimum width of 75 pixels. After binarizing the image, the algorithm uses the distance between the two outermost foreground pixels in each row and column as the metric to compare against the 75-pixel threshold. However, because the vertical extent of the fly's neck can be less than the threshold, the algorithm omits columns likely to contain the fly's head.

Step 4 traces the outline of the fly. The algorithm binarizes the image and performs erosion with a 2 × 2 square structuring element, deleting any image pixels that do not fully overlap with the square as it is moved across the image. We perform two such erosions and then subtract the resulting eroded version from the original binary image. The largest connected object in the resulting image corresponds to the outline of the fly. If internal contours exist, the algorithm keeps them and uses these contours to exclude background pixels from the final area calculations. (For instance, if the fly's head is tilted downward, a gap appears between the bottom of the fly's neck and the point where the head touches the abdomen.)

Step 5 finds the steepest minimum of the head contour to the right of the picking effector, then searches for a similar maximum within a bounded region below the top minimum. These points correspond to the top and bottom of the boundary separating the fly's head from its body.

Step 6 uses these head boundary points to segment and uniformly fill the fly's head. To create a boundary for the flood-fill algorithm (Step 7), a line is drawn between the two points found in Step 5. If one of these points falls on an internal contour, the top and bottom pixels of the internal contour are connected to the nearest fly-outline pixels. This step effectively removes any extraneous background pixels from the head area. The algorithm finds the cross-sectional area of the head by counting the number of non-background pixels in the resulting flood-filled head image. It then subtracts the head region from the binarized image, leaving only the body pixels for inclusion in the cross-sectional body area count.

Step 7 displays the final head (green) and body (blue) segmentations atop the original image.
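The erosion-and-subtraction outline trace of Step 4 can be sketched on a small binary grid. A minimal illustration, not the authors' implementation; note that a 2 × 2 structuring element has no central pixel, so the anchor choice here (top-left) is an arbitrary assumption that shifts the eroded region slightly.

```python
def erode(img):
    """Binary erosion with a 2x2 square structuring element anchored
    at the top-left: a pixel survives only if it and its right, lower
    and lower-right neighbours are all foreground."""
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows - 1):
        for c in range(cols - 1):
            if (img[r][c] and img[r][c + 1]
                    and img[r + 1][c] and img[r + 1][c + 1]):
                out[r][c] = 1
    return out

def outline(img):
    """Erode twice, then subtract the eroded image from the original
    binary image, leaving a thin boundary band (sketch of Step 4)."""
    eroded = erode(erode(img))
    return [[img[r][c] - eroded[r][c] for c in range(len(img[0]))]
            for r in range(len(img))]
```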

Supplementary Figure 9 Robotic protocol to identify the apse of the fly's neck.

Flowchart of the robot's protocol for finding the apse of the fly's neck:

(1) The robot uses the fly's yaw angle, as determined from the picking algorithm (Supplementary Fig. 6), and rotates the fly to the nominal yaw orientation in which its head directly faces the inspection camera.

(2) Under illumination from infrared LEDs facing the fly's anterior, the fly head appears brighter than the rest of the body in the image from the inspection camera. After thresholding to zero all pixels below a minimum intensity, the resulting image shows only the fly head.

(3) An image-analysis algorithm quantifies the symmetry of the fly head in the resulting image. To do this, the algorithm fits an ellipse to the outline of the fly head and uses the ellipse's minor axis as an approximate axis of symmetry. The algorithm then computes the degree of symmetry about this axis as the mean absolute difference between the intensities of pixels at mirror-symmetric locations.

(4) The robot rotates the fly by a 1° yaw increment. If this is the first iteration through Steps 2–5, the algorithm first determines the rotational direction that will increase symmetry by examining which side of the head appears larger in the silhouette image.

(5) The robot returns to Step 2, unless the symmetry of the head has decreased over two consecutive iterations, indicating that the fly has passed the yaw angle of optimal symmetry.

(6) The robot rotates the fly to an orientation 90° from the one found to maximize symmetry in the iterative loop of Steps 2–5. This provides the inspection camera with a side view.

(7) To reliably extract the silhouette of the neck apse, the algorithm acquires and averages 20 consecutive images of the fly from the inspection camera.

(8) The algorithm binarizes the mean image by applying an intensity threshold, thereby obtaining a silhouette of the fly head and thorax.

(9) The algorithm locates the fly neck by identifying the intersection of the head and thorax. This is the intersection between the longest horizontal line and the shortest vertical line that can be fitted inside the silhouette.

(10) The algorithm finds the apse of the neck by looking for the apex of the silhouette below the neck.

For further details regarding Steps 7–10, see Supplementary Fig. 10.
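The symmetry measure of Step 3 can be illustrated as a mean absolute difference between mirrored pixels. In this sketch, which is not the authors' implementation, the mirror axis is a vertical image column for simplicity; the paper mirrors about the minor axis of a fitted ellipse, which is generally not axis-aligned.

```python
def asymmetry_score(img, axis_col):
    """Mean absolute intensity difference between pixel pairs mirrored
    about the vertical line x = axis_col; 0 means perfectly symmetric.
    Simplified sketch: the mirror axis is an image column, not the
    fitted ellipse's minor axis used in the paper."""
    diffs = []
    cols = len(img[0])
    for row in img:
        for offset in range(1, cols):
            left, right = axis_col - offset, axis_col + offset
            if left < 0 or right >= cols:
                break  # reached the image border on one side
            diffs.append(abs(row[left] - row[right]))
    return sum(diffs) / len(diffs)
```

Lower scores mean greater symmetry, so the robot's loop in Steps 2–5 amounts to rotating the fly until this score passes through a minimum.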

Supplementary Figure 10 Algorithm for finding the apse of the fly neck.

The inspection camera acquires 20 consecutive images of the fly from a side view. If the fly is moving its head and forelegs, the view of its neck apse may be obscured in any single image. The algorithm thresholds the average of the 20 images to obtain a silhouette of the fly head and thorax; this reliably reveals the neck apse (red arrows). To locate the neck apse, the algorithm determines the neck's centroid as the intersection of the longest horizontal and shortest vertical lines within the silhouette (Inset, top row). The algorithm then searches for an apex in the silhouette under the centroid (Inset, bottom row).
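The averaging-and-thresholding step can be sketched as follows. This minimal illustration (not the authors' code, and with an arbitrary threshold) shows why averaging helps: pixels that are dark in only some frames, such as moving forelegs, are averaged toward the bright background, while pixels dark in every frame survive the threshold as silhouette.

```python
def silhouette(frames, threshold):
    """Average a stack of equally sized grayscale frames, then binarize:
    pixels whose mean intensity is below `threshold` become silhouette
    (1); all others become background (0). Illustrative sketch."""
    rows, cols = len(frames[0]), len(frames[0][0])
    n = len(frames)
    return [[1 if sum(f[r][c] for f in frames) / n < threshold else 0
             for c in range(cols)] for r in range(rows)]
```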

Supplementary Figure 11 For additional manipulations or streamlined experimentation, the robot can tether alert flies onto several types of holders.

(a) The robot can tether flies to a head restraint, illustrated by a fly attached by the apse of its neck to a 125-μm-diameter glass fiber. Prior to head fixation, the robot automatically aligns the picked fly to the head holder. Scale bar is 0.5 mm.

(b) Array of tethers laser-machined in a flexible acetate film, for brain imaging or phenotyping of many head-fixed flies. Scale bar is 3 mm.

(c) A detachable head holder uses suction to hold the fly by its proboscis. Scale bar is 0.5 mm. Video 4 shows the robot placing flies into the detachable suction holder.

Supplementary Figure 12 Automated determinations of odor-evoked locomotor patterns.

(a) A compact trackball system used by the robot for studies of sensory-evoked locomotor behavior. Two optical sensors track the ball's two rotational degrees of freedom. Scale bar is 20 mm.

(b) The picking robot can precisely position a fly onto the trackball. Scale bar is 20 mm.

(c, d) Delivery of the aversive odor benzaldehyde (red shaded areas, bottom panels) evoked forward, sideways and rotational locomotor responses in manually glued (c) and robotically handled (d) flies. Scale bars are 0.5 mm.

(e) Odor-evoked walking speeds (mean ± s.e.m.). There were no significant speed differences between flies handled manually and flies prepared by the robot for forward (P = 0.12), sideways (P = 0.66) or rotational (P = 0.28) motions (n = 4 flies in each group; n = 10 trials per fly; Wilcoxon rank sum test).

(f) Odor-evoked locomotor speeds (mean ± s.e.m.) for 8 individual flies that were manually glued (gray: flies 1–4) or handled robotically (blue: flies 5–8). Each fly received 10 trials of odor stimulation.

Supplementary Figure 13 The robot has flexible capabilities for automating a variety of manipulations.

(a) Robotic micro-drilling with a 25-μm-diameter end mill creates small holes in the cuticle for electrophysiological recordings, fiber-optic light delivery or microinjections.

(b) Microinjection of fluids into the fly's head capsule, using a borosilicate glass capillary pulled to a 40-μm-diameter tip, through a 50-μm-diameter opening that the robot had milled in the cuticle.

(c) Delivery of blue light, e.g. for optogenetics, from an etched optical fiber (550 μm diameter) through a 100-μm-diameter opening that the robot had created in the cuticle. Scale bars in panels a, b and c are 0.25 mm.

(d) The robotic system holds the fly detachably by suction during brain imaging. Ethyl acetate odor is delivered while the fly's odor-evoked neural responses are imaged by two-photon microscopy through a 150-μm-diameter opening that the robot had created in the cuticle. Scale bar is 3 mm. (See also Fig. 3e.)

Supplementary Figure 14 Future versions of the robot can add translational or rotational degrees of freedom and can include additional picking units.

Supplementary information

Supplementary Text and Figures

Supplementary Figures 1–14 and Supplementary Tables 1 and 2 (PDF 14127 kb)

The robot tracks and captures a fly under infrared illumination using real-time machine vision guidance.

Automated tracking and picking of an active fly from the picking platform, played back at real speed. Video 2 shows the same events in slow motion. (MOV 1206 kb)

Slow motion video of the robot tracking and picking a fly.

The same events as in Video 1, but played back at 20× slower speed. To search for the ring reflection pattern on the fly thorax while tracking the fly, the robot turns on the ring of infrared LEDs before acquiring an image via the onboard camera. The robot picks up the fly by touching the picking effector to the reflected ring target on the thorax. (MOV 14165 kb)

Flies emerging onto the picking platform.

The picking platform is a key tool to enhance the throughput of automated handling. Alert flies that have never been anesthetized can rapidly populate the platform. These flies emerge through an opening in the center of the platform from a standard vial attached to the platform's underside. (MOV 21647 kb)

The robot captures three flies and transfers them to head holders.

After picking each fly, the robot carried it to an inspection camera and obtained the location of the neck apse. The robot used this information to align and tether the head of the fly to the holder. Since the head holders were based on suction, we released all three flies after the experiment. (MOV 33736 kb)

High-magnification inspection of a picked fly.

The robot rotates a fly over 360° for high-magnification inspection at various yaw angles. (MOV 8206 kb)

Rapid robotic manipulation of alert flies.

The robot rapidly transfers flies back and forth between the halves of a divided platform. This demonstration illustrates high-speed handling of flies. (MOV 15784 kb)

The robot rapidly picked and released flies.

The robot can pick and release individual flies multiple times without harming them. (MOV 14020 kb)

The robot discriminated flies by sex with 99% accuracy.

To perform the discrimination, the robot picked individual flies and brought them to a high-magnification inspection camera. The system puffed air beneath the picked fly to induce flight, so that the wings did not occlude the abdomen. A robotic algorithm examined the fly as it was rotated, to find the best view of the abdomen, and then determined the fly's sex using an image of the abdomen (Supplementary Fig. 7 and Supplementary Table 2). (MOV 12149 kb)

The robot holds a fly as it walks on an air-suspended ball for studies of odor-evoked locomotion.

We recorded the forward, lateral, and rotational components of the locomotor patterns in response to odor stimuli, which we delivered through the pipette directed toward the fly's head. (MOV 15092 kb)

Mechanical microsurgery of the fly cuticle, to expose the brain for in vivo imaging.

Using a three-dimensional translation stage, we maneuvered the head-fixed fly to the end mill and made an initial cut to open the cuticle. Thereafter, we commanded the stage to continue cutting along a preprogrammed trajectory. Saline immersion kept the brain hydrated and prevented tissue debris from interfering with the surgery. (MOV 6494 kb)


Cite this article

Savall, J., Ho, E., Huang, C. et al. Dexterous robotic manipulation of alert adult Drosophila for high-content experimentation. Nat Methods 12, 657–660 (2015). https://doi.org/10.1038/nmeth.3410
