Nat. Neurosci. 21, 1281–1289 (2018)

DeeperCut, a pose-detecting machine learning algorithm designed for the human body, has gone to the animals. To make analyzing behavior less tedious, the creators of DeepLabCut adapted a subset of the original algorithm’s deep neural network to track user-defined body parts from video recordings of any moving animal, without requiring markers on the animals themselves.

In their Technical Report published in Nature Neuroscience, the team demonstrates DeepLabCut's ability to track the noses and paws of mice as they sniff an odor and grasp a joystick, and to follow points on a fruit fly moving around a small chamber. Humans still need to label an initial training set to teach the algorithm what it is looking for, but from there DeepLabCut can transfer what it has learned to additional frames and behaviors with minimal further input. The code for DeepLabCut is available on GitHub.
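
For readers curious what that label-then-track workflow looks like in practice, below is a minimal sketch of a project using the DeepLabCut Python toolbox. The function names (create_new_project, extract_frames, label_frames, create_training_dataset, train_network, analyze_videos, create_labeled_video) come from the publicly released toolbox and may differ from the exact interface in the version described in the report; the video path and project name are hypothetical. Treat this as an illustrative outline, not the authors' canonical pipeline.

import deeplabcut

# Create a project around one or more behavior videos (hypothetical path).
config_path = deeplabcut.create_new_project(
    "reaching-task", "experimenter",
    ["videos/mouse_reach.mp4"],
    copy_videos=True,
)

# Extract a small set of frames and hand-label the body parts of interest;
# this is the only step that requires manual annotation.
deeplabcut.extract_frames(config_path, mode="automatic")
deeplabcut.label_frames(config_path)  # opens the labeling GUI

# Train the network on the labeled frames, then apply it to new videos.
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)
deeplabcut.analyze_videos(config_path, ["videos/mouse_reach.mp4"])
deeplabcut.create_labeled_video(config_path, ["videos/mouse_reach.mp4"])

The same project can later be extended to new behaviors or camera views by labeling a handful of additional frames and retraining, which is the "minimal additional input" the report emphasizes.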