What difference can a breakthrough in science make over two decades? A quick comparison of the respective fates of two discoveries made twenty years ago reaffirms how daft it is to try to predict research outcomes over such a timescale.

Both tales begin at IBM's Zurich research laboratory in the early 1980s. In one corner of the lab, Gerd Binnig, Heinrich Rohrer and others were building an instrument that would come to be known as a scanning tunnelling microscope (STM). In another, Georg Bednorz and Alexander Müller were doing experiments on materials that they thought might hold promise as superconductors.

In early 1986, Bednorz and Müller got the first hints that they were on to something. An oxide of lanthanum, barium and copper seemed to retain the ability to conduct electricity without resistance at temperatures of up to 35 K (−238 °C). This, at the time, was striking: for more than a decade, the ceiling on transition temperatures for superconductors had been stuck at less than 24 K.

Within months of the publication of their first paper (Z. Phys. B 64, 189–193; 1986), the result had been replicated and interest in its ramifications exploded. The buzz over high-temperature superconductivity at the March 1987 meeting of the American Physical Society in New York was such that The New York Times dubbed the event “the Woodstock of physics”.

By the time Bednorz and Müller picked up their Nobel prize in December of the same year, similar ceramic materials that could superconduct at the temperature of liquid nitrogen, 77 K (−196 °C), had been discovered. This, it was thought, would open the door to widespread practical use — and, briefly, industrial and government funding surged in at the speed of the magnetically levitating trains that the field was, according to countless news stories, due to produce in the fullness of time.

By contrast, few members of the public had even heard of the humble scanning tunnelling microscope, which Binnig and Rohrer first described in an internal IBM document in March 1981. They did not receive their Nobel prize until 1986, by which time the STM was already widely used to study materials.

An STM exploits a quantum phenomenon called electron tunnelling: when two conducting materials are held close together and a voltage is applied between them, electrons hop from one to the other, producing a current that is highly sensitive to the separation. An STM scans a sharp tip over a surface, translating the current registered into a topographical map of that surface. The instrument proved sensitive enough to reveal the positions of individual atoms.
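
The reason for that sensitivity is simple to state (the expression below is a standard textbook approximation, not a result quoted from the Zurich papers): the tunnelling current falls off exponentially with the tip–sample separation d,

I(d) \propto V \exp(-2\kappa d), \qquad \kappa = \sqrt{2m\phi}/\hbar,

where m is the electron mass and \phi the work function of the surface. For typical metals \kappa is of the order of 1 per ångström, so changing d by a single ångström changes the current by roughly an order of magnitude, which is what allows atomic-scale corrugations to register.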

But the STM works only on materials that conduct electricity, and many of the samples that interest biologists and materials scientists do not. That limitation was addressed by a paper published twenty years ago this week, on 3 March 1986. The paper, ‘Atomic force microscope’ (Phys. Rev. Lett. 56, 930–933; 1986), introduced the instrument that has fuelled the current explosion of interest in nanotechnology.

The atomic force microscope likewise traces topography by scanning a sharp tip over a surface, but it measures the tiny forces between the tip and the sample via the deflection of a thin cantilever to which the tip is attached. This is an immensely versatile technique (see page 14), and that first paper has clocked up more than 4,500 citations.
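
An equally simple relation underlies the force measurement (again a textbook idealization, not anything taken from the 1986 paper): for small deflections the cantilever behaves as a Hookean spring,

F = k\,\Delta z,

so a cantilever of known spring constant k converts a measured deflection \Delta z directly into a force. With spring constants of the order of 0.01–100 N m⁻¹ and deflections detectable well below a nanometre, forces down to the piconewton range become accessible, and, crucially, the sample need not conduct electricity.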

High-temperature superconductivity, on the other hand, never quite lived up to its commercial hype. The materials in question have been difficult to fabricate, their properties are more constrained than some physicists had hoped, and there is still no agreement on why they work as they do (see papers in this month's Nature Physics). The materials remain rich systems for experimental study and pose intriguing theoretical problems, but their practical use is largely confined to making superconducting quantum interference devices (SQUIDs) to measure magnetic fields, and prototype transmission lines that can carry current at high density.

In terms of technological and economic impact, it looks as if the dark horse made it to the wire first. That should remind the managers of research agencies and industrial laboratories of the folly of trying to predict how much value any particular scientific breakthrough may hold.