
The author file

Jeffrey Mogil

Putting a face on pain in mice should improve our ability to measure it.

In the late 1980s, Jeffrey Mogil intended to become a pharmacologist. He entered John Liebeskind's laboratory at the University of California, Los Angeles, where researchers had access to mouse strains with varying pain thresholds. Soon after Mogil joined the lab, his advisor came across a paper explaining how to identify specific genes through selective breeding. Mogil, the newest graduate student, expressed interest and suddenly found himself in pain genetics, becoming perhaps the first scientist to study the field systematically.

“For a long time no one cared,” Mogil recalls. “Pain researchers had never heard of genetics.” He used to go to the Society for Neuroscience meetings and stand alone by his posters. Then transgenic knockout mice started becoming ubiquitous tools for disease modeling, and researchers realized that understanding the genetic background of a new mouse strain was crucial for interpreting data. Suddenly everyone needed Mogil's data and insights. “Voila,” he says, “career.”

Later, as a professor at McGill University, Mogil showed, surprisingly, that mice have a heightened response to painful stimuli after observing a cagemate in pain. His team established that the way mice conveyed pain to each other was visual, but exactly how the communication occurred was unclear. “To see how mice could observe each other's pain, we had to see if we could observe their pain,” Mogil says. “Then we realized that that's a more important question.” Addressing that question resulted in a new way to measure pain, the mouse grimace scale (MGS; p. 447).

It was not hard to show that mice convey pain through their facial expressions, says Mogil. Faced with a collage of faces of mice in painful and pain-free situations, even a casual human observer can tell which set is in pain; assessing pain from a single photograph, however, proved considerably more difficult.

Mogil and colleagues broke mouse facial expression into parameters scored from observations of the eyes, nose, ears, cheeks and whiskers. Using these parameters, researchers could predict with 72% accuracy whether images depicted mice subjected to painful stimuli. When they switched to higher-resolution video cameras, accuracy rose to as much as 97%. The study would have gone much faster had the researchers begun with a $600 camera rather than a $300 one, says Mogil. “I'm kicking myself for being so cheap.”
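A parameter-based scoring scheme of this kind is straightforward to express in code. The sketch below is illustrative only: the five action-unit names follow the published scale, but the function, the 0–2 rating convention, and the simple averaging are assumptions for demonstration, not the authors' actual scoring procedure.

```python
# Hypothetical sketch of grimace-scale scoring: an observer rates each
# facial "action unit" on a 0-2 scale (0 = absent, 1 = moderate,
# 2 = severe), and the ratings are averaged into one score per image.
ACTION_UNITS = ["orbital_tightening", "nose_bulge", "cheek_bulge",
                "ear_position", "whisker_change"]

def mgs_score(ratings: dict) -> float:
    """Average the per-unit ratings into a single grimace score."""
    for unit in ACTION_UNITS:
        if ratings[unit] not in (0, 1, 2):
            raise ValueError(f"rating for {unit} must be 0, 1 or 2")
    return sum(ratings[unit] for unit in ACTION_UNITS) / len(ACTION_UNITS)

# Example: a mouse showing strong orbital tightening and moderate
# signs in three other units.
example = {"orbital_tightening": 2, "nose_bulge": 1, "cheek_bulge": 1,
           "ear_position": 1, "whisker_change": 0}
print(mgs_score(example))  # 1.0
```

In practice, scores from many still images and several blinded observers would be averaged per animal; the point here is only that the scale reduces a complex expression to a handful of independently scored features.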


Mogil also wanted to know whether mouse facial expression could be used to study the emotional aspects of pain, which clinicians are eager to understand and mitigate. If facial expression conveyed pain-related emotion, Mogil reasoned, then lesions to appropriate brain areas should prevent grimaces after painful stimuli. Mogil's team created lesions in two regions associated with pain emotion in humans. This damage had no effect on grimacing. Next, Mogil came across evidence for a third brain area. Lesions in this area resulted in only a very modest effect. But just after viewing these disappointing data, Mogil received an e-mail explaining that two of the mice had lesions in the wrong brain region. “Suddenly this experiment that had failed actually had worked beautifully.”

Training people to use the MGS is straightforward, says Mogil. Even a novice can be trained to 75% accuracy within an hour. However, the process of rendering video into still images that can be used for assessment is time-consuming. Whereas getting results from a standard behavioral test may take only minutes, getting images appropriate for MGS can take hours or even days per experiment.

Clinicians seem particularly keen to see results using the MGS. “The clinical people love anything that reminds them of humans, and this is directly taken from a human scale,” says Mogil. Basic scientists, however, seem less receptive, perhaps because they have established tests for studying pain and do not want to be reminded that those tests may measure behavioral responses with only an indirect bearing on the clinical outcomes they aim to predict.

Currently Mogil is repeating classic experiments in the pain field to assess whether the MGS predicts efficacy in humans better than other measures, and he is collaborating with software engineers on algorithms that automatically identify suitable video frames. He believes face-based assessment can be sped up, but he is also hopeful that researchers will accept more labor-intensive tests if the resulting data are more relevant to relieving human pain.

References

  1. Langford, D.J. et al. Coding of facial expressions of pain in the laboratory mouse. Nat. Methods 7, 447–449 (2010).



Cite this article

Baker, M. Jeffrey Mogil. Nat Methods 7, 415 (2010). https://doi.org/10.1038/nmeth0610-415
