Pereira, T. D. et al. Nat. Methods 16, 117–125 (2019).

Need a way to track an animal in motion? Take a LEAP, short for ‘LEAP estimates animal pose,’ a new deep-learning-based method from researchers at Princeton for tracking the movements of individual animals in an experiment, adapted from similar approaches developed for human pose estimation. A graphical user interface lets users label body parts in a subset of raw images to train the algorithm; from there, LEAP estimates pose and tracks body parts in unlabeled data. In flies, it worked better on some body parts than others, but the researchers were able to track 32 labeled reference points with less than 3% error in one example. In additional demonstrations, they also analyzed the gait dynamics of 59 male flies and performed unsupervised classification of behaviors in 42 male and female flies. And it’s not just good for flies: LEAP could also follow mice moving freely in an open field.
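LEAP-style pose estimators output one confidence map per tracked body part, and the peak of each map gives that part’s image coordinates. The snippet below is a minimal conceptual sketch of that readout step only, written in plain NumPy; the function name, array shapes, and toy data are illustrative assumptions, not LEAP’s actual interface.

```python
import numpy as np

def peaks_from_confidence_maps(maps):
    """Recover per-part keypoint locations from confidence maps.

    maps: array of shape (n_parts, height, width), one map per body part,
    as a LEAP-style network might produce (shapes assumed for illustration).
    Returns an (n_parts, 2) array of (row, col) peak positions.
    """
    n_parts, h, w = maps.shape
    flat_idx = maps.reshape(n_parts, -1).argmax(axis=1)  # peak index per map
    rows, cols = np.unravel_index(flat_idx, (h, w))
    return np.stack([rows, cols], axis=1)

# Toy example: three synthetic maps with planted peaks at known locations.
rng = np.random.default_rng(0)
maps = rng.random((3, 64, 64)) * 0.1
for i, (r, c) in enumerate([(10, 20), (32, 32), (50, 5)]):
    maps[i, r, c] = 1.0  # clear maximum at the "true" body-part location

print(peaks_from_confidence_maps(maps))
# [[10 20]
#  [32 32]
#  [50  5]]
```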