In 1873, German physicist Ernst Abbe proposed that diffraction fundamentally limited the resolution that any microscope could achieve to around half the wavelength of light. And, despite many advances, microscopes didn't threaten to challenge this law of physics for more than a century. Stefan Hell, now a director of the Max Planck Institute for Biophysical Chemistry in Göttingen, Germany, was the first to show that the diffraction limit could be beaten.
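In symbols, Abbe's criterion (given here in its standard textbook form, not as a formula stated in this article) puts the smallest resolvable separation d for light of wavelength λ and an objective of numerical aperture NA at roughly d ≈ λ / (2 NA). For visible light (λ ≈ 400–700 nm) and the best oil-immersion objectives (NA ≈ 1.4), that works out to about 200 nm, the “half the wavelength” figure quoted above.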

Hell, while a postdoc at the University of Turku in Finland in the 1990s, thought that, with the right lasers, he could activate a fluorescent spot and then shrink it by superimposing a larger, hollow beam of light to deplete all the light emission except for that at the centre of the spot. He called the technique stimulated emission depletion (STED) microscopy. Although many physicists were initially sceptical of Hell's ideas, by 2000 he had used STED to produce the first nanoscale fluorescence images10.
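How strongly STED sharpens the spot is often quantified with a simple extension of Abbe's formula associated with the technique (stated here as background, not taken from this article): the effective resolution becomes d ≈ λ / (2 NA √(1 + I/Is)), where I is the peak intensity of the depletion beam and Is is the saturation intensity of the fluorophore. Because d keeps shrinking as the depletion intensity rises, there is in principle no lower bound on the spot size, which is why STED can go below the diffraction barrier.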

Super-resolution microscopy has blossomed since, allowing researchers to see cellular processes unfolding at nanometre scales. “This is something that the field has desired since people first started looking through light microscopes,” says Jan Liphardt, a biophysicist at the Lawrence Berkeley National Laboratory in California.

Since Hell's work, the field has been boosted by other groups, including those of Eric Betzig, a physicist at the Howard Hughes Medical Institute's Janelia Farm Research Campus in Virginia, and Jennifer Lippincott-Schwartz, a cell biologist at the National Institutes of Health in Bethesda, Maryland. In 2006, the groups reported that they had increased resolution by harnessing single-molecule photoactivatable fluorescent proteins and compiling images of thousands to millions of them11. They called the approach photo-activated localization microscopy (PALM).

At Harvard University in Cambridge, Massachusetts, physicist Xiaowei Zhuang has developed three-dimensional (3D) stochastic optical reconstruction microscopy (STORM), which uses photoswitchable probes to temporally separate the overlapping images of individual molecules and so boost resolution to ten times better than the diffraction limit.
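The localization idea shared by PALM, STORM and the related methods below can be illustrated with a minimal numerical sketch (the code and parameter values are illustrative assumptions, not taken from the cited papers): each molecule that is switched on produces a diffraction-limited spot roughly 250 nm across, but if only a sparse subset of molecules emits at a time, the centre of each spot can be located with a precision that improves roughly as the spot width divided by the square root of the number of photons collected.

```python
# Minimal sketch of the single-molecule localization idea behind
# PALM/STORM/FPALM; parameter values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

PSF_SIGMA_NM = 110.0                      # Gaussian width of the spot (~250 nm FWHM)
TRUE_POS_NM = np.array([1000.0, 1000.0])  # true emitter position (hypothetical)

def localize(n_photons):
    """Simulate one switched-on emitter and estimate its position.

    Each detected photon lands at the true position plus a Gaussian
    blur of width PSF_SIGMA_NM (the diffraction-limited point-spread
    function); the emitter's centre is estimated as the photon centroid.
    """
    photons = TRUE_POS_NM + rng.normal(0.0, PSF_SIGMA_NM, size=(n_photons, 2))
    return photons.mean(axis=0)

# Precision improves as ~ sigma / sqrt(N): with enough photons the centre
# is pinned down far below the ~250 nm size of the spot itself.
for n in (100, 1000, 10000):
    devs = np.array([localize(n) - TRUE_POS_NM for _ in range(200)])
    rms = np.sqrt((devs ** 2).mean())     # per-axis RMS localization error
    print(f"{n:>6} photons: RMS error ~ {rms:5.1f} nm "
          f"(expected ~ {PSF_SIGMA_NM / np.sqrt(n):5.1f} nm)")
```

Repeating this over many switching cycles and plotting all of the fitted centres is what builds up the super-resolved image.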

Yet another approach — fluorescence PALM (FPALM) — was developed in 2006 by Samuel Hess, a physicist at the University of Maine in Orono. His group's technique involves looking at thousands of fluorophores at once, and localizing small numbers of them at a time. These methods have already begun to demonstrate their utility. For example, in 2007, Hess's group showed that FPALM could be used to detect proteins clustering in lipid rafts in living cells12. In 2008, Lippincott-Schwartz's group combined PALM with single-particle tracking to detect the movement of membrane proteins in live cells13. And Zhuang's group used 3D STORM to image microtubules and other molecular structures within monkey kidney cells14, later extending the method to multicolour 3D imaging of whole cells15. Hell's group, early in 2008, used the STED method to show the movement of synaptic vesicles inside living neurons at video rate16.

But the field is just warming up. “To people like me who were trained in physics or optics in the 1990s, it's just unbelievable that one can image below the resolution of light,” says Bernardo Sabatini, a neurobiologist at Harvard Medical School in Boston, Massachusetts. “The major revolution for the next 5 or 10 years is getting these advances to answer biological questions.”

K.R.C.