This year's chemistry Nobel recognizes computer-modelling techniques that show, for instance, how a protein called lysozyme (left) cleaves glycosidic bonds (yellow). The method blends detailed quantum-mechanical calculations (right) with less-intensive computation to model the rest of the protein. Credit: Johan Jarnestad/The Royal Swedish Academy of Sciences

Computer modelling is one of the many scientific fields that Alfred Nobel, understandably, failed to anticipate in his 1895 will. And so, as Michael Levitt points out, “there’s no Nobel prize for computer science”. But computation’s increasing importance in chemistry and biology was recognized last week, when Levitt, of Stanford University in California, was one of three scientists to receive the chemistry Nobel for their work on ways to simulate the activity of large molecules — from cellular enzymes to light-absorbing dyes.


“Computers in biology have not been sufficiently appreciated,” Levitt said at a press conference, joking that a fourth portion of the Nobel might have gone to the chip manufacturers, who have driven up computing power exponentially.

Together with Martin Karplus of the University of Strasbourg in France and Harvard University in Cambridge, Massachusetts, and Arieh Warshel of the University of Southern California in Los Angeles, Levitt was honoured for a specific modelling technique: working out how to stitch together descriptions of molecules at close-up and zoomed-out scales.

The three were trailblazers in the 1970s. At the time, finely detailed quantum-mechanical pictures of bond making and breaking could be calculated only for small clusters of atoms — even today such calculations become intractable beyond a few hundred atoms, and cannot be used to model whole proteins. So Levitt, Warshel and Karplus worked out how to merge these models with simplified simulations that treat molecules as non-reacting, vibrating atomic balls connected by springs. “The art is to find an approximation simple enough to be computable, but not so simple that you lose the useful detail,” Levitt says.
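The partitioning Levitt describes can be sketched in a toy calculation: a few "reactive" atoms get an expensive, detailed energy term, while the rest are treated as balls on harmonic springs. Everything below — the function names, the stand-in "quantum" term, the spring constants — is an illustrative assumption, not the actual scheme from the laureates' papers.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def qm_energy(atoms):
    """Stand-in for a costly quantum-mechanical calculation on a few atoms.

    A real calculation scales steeply with region size; here we just use a
    fake pairwise "bonding" term to mark where the expensive model applies.
    """
    e = 0.0
    for i in range(len(atoms)):
        for j in range(i + 1, len(atoms)):
            e += math.exp(-distance(atoms[i], atoms[j]))
    return e

def mm_energy(bonds, atoms, k=100.0, r0=1.5):
    """Cheap ball-and-spring model: 0.5*k*(r - r0)^2 per bonded pair."""
    return sum(0.5 * k * (distance(atoms[i], atoms[j]) - r0) ** 2
               for i, j in bonds)

def total_energy(atoms, qm_region, mm_bonds):
    """Multi-scale total: detailed model on a small region, springs elsewhere."""
    qm_atoms = [atoms[i] for i in qm_region]
    return qm_energy(qm_atoms) + mm_energy(mm_bonds, atoms)

# Four atoms on a line; atoms 0 and 1 form the "reactive" region,
# while the rest of the chain is held together by springs.
atoms = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0), (4.5, 0.0, 0.0)]
e = total_energy(atoms, qm_region=[0, 1], mm_bonds=[(1, 2), (2, 3)])
print(e)
```

The design point is the one the prize rewards: the expensive term is evaluated only over the small region where bonds actually break and form, so the overall cost stays dominated by the cheap spring model. (A production QM/MM method also needs a careful coupling term at the boundary between the two regions, which this sketch omits.)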


These multi-scale models have proved essential for studying the workings of enzyme reactions, and were pioneered in a 1976 paper in which Warshel and Levitt explained how lysozyme cleaves a glycosidic bond. Multi-scale techniques are not widely used in the drug industry, adds Kenneth Merz, who heads the Institute for Cyber-Enabled Research at Michigan State University in East Lansing. Instead, says theorist Christopher Cramer of the University of Minnesota in Minneapolis, they find uses in, for example, revealing how industrial catalysts work, or examining how light activates dyes on semiconducting nanoparticles.

The award is also being viewed as an acknowledgement of the three scientists’ lifetime work in molecular simulation, researchers told Nature. “They have made theory an equal partner to experiment,” said theoretical chemist Gunnar Karlström of Lund University in Sweden, a member of the Nobel committee.

Still, a question mark remains over whether theorists can make predictions that surprise experimenters. Computer modelling “is really good at helping people understand why things work the way they do, but not so good at predicting new things. We are good at guiding experimentalists,” says Ken Houk, who uses computer programs to design new enzymes at the University of California, Los Angeles. 

Experimenters should be cautious of simulation results, agrees Warshel. But “one day everything will be done by powerful computers”, he predicts.

Cramer adds: “Every year, hazardous-waste disposal gets more expensive, whereas computing power gets cheaper. So the progress curves favour the theoreticians.”