Computing technique reconstructs 3D images from single photons reflected from a dimly lit object.
Talk about taking a dim view of things. Researchers have obtained ultrasharp images of weakly illuminated objects using a bare minimum of photons: mathematically stitching together information from single particles of light recorded by each pixel of a solid-state detector.
The achievement is likely to support studies of fragile biological materials, such as the human eye, that could be damaged or destroyed by higher levels of illumination. The development could also have applications for military surveillance, such as in a spy camera that records a scene with a minimum of illumination to elude detection.
To create detailed images using single photons, electrical engineer Ahmed Kirmani of the Massachusetts Institute of Technology in Cambridge and his colleagues developed an algorithm that takes into account correlations between neighbouring parts of an illuminated object as well as the physics of low-light measurements. The researchers describe their work online today in Science [1].
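The paper's own reconstruction method is not reproduced here, but the core idea of exploiting correlations between neighbouring parts of a scene can be illustrated with a generic stand-in: smoothing a noisy per-pixel depth map so that each pixel agrees with its neighbours. The array size, noise level and filter width below are illustrative assumptions, not details from the study.

```python
import numpy as np
from scipy.ndimage import median_filter

# Generic stand-in for a spatial-correlation prior (not the authors' algorithm):
# neighbouring points on a real object tend to sit at similar depths, so a noisy
# per-pixel depth estimate can be cleaned up by enforcing local agreement.
rng = np.random.default_rng(0)
raw_depth = 2.0 + 0.3 * rng.standard_normal((64, 64))  # hypothetical noisy depth map, in metres

smoothed_depth = median_filter(raw_depth, size=5)  # 5x5 neighbourhood; the size is arbitrary
```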
“The amount of information they’ve been able to extract is quite incredible,” comments experimental physicist John Howell of the University of Rochester in New York, who was not part of the study.
“We didn’t invent a new laser or a new detector,” notes Kirmani. Instead, he explains, the team applied a new imaging algorithm that can be used with a standard, off-the-shelf photon detector.
Light from dark
In the team’s setup, low-intensity pulses of visible laser light scan an object of interest. The laser fires pulses at a given location until a single reflected photon is recorded by a detector; each illuminated location corresponds to a pixel in the final image.
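As a rough illustration of that acquisition loop, the sketch below simulates firing pulses at each scanned location until a first photon is detected, under the simple assumption that the per-pulse detection probability scales with surface reflectivity. The detector efficiency, scene maps and function names are hypothetical, not taken from the experiment.

```python
import numpy as np

C = 3.0e8  # speed of light, in metres per second

def acquire_first_photon(reflectivity, depth_m, detect_prob=0.05, seed=0):
    """Simulate first-photon acquisition: for each pixel, count the pulses fired
    until one reflected photon is detected and record that photon's arrival time."""
    rng = np.random.default_rng(seed)
    # Assume the chance of detecting a photon on any single pulse scales with reflectivity.
    p = np.clip(detect_prob * reflectivity, 1e-6, 1.0)
    # The number of pulses until the first detection is then geometrically distributed.
    pulse_counts = rng.geometric(p)
    # The detected photon's arrival time encodes the round trip to the surface and back.
    arrival_times_s = 2.0 * depth_m / C
    return pulse_counts, arrival_times_s

# Hypothetical 64 x 64 scene: varying reflectivity, flat surface two metres away.
refl = np.random.default_rng(1).uniform(0.2, 1.0, size=(64, 64))
depth = np.full((64, 64), 2.0)
counts, times = acquire_first_photon(refl, depth)
```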
Variations in the time it takes for photons from the laser pulses to be reflected back from the object provide depth information about the body — a standard way of revealing three-dimensional structure. However, the algorithm developed by Kirmani and his colleagues provides that information using one-hundredth the number of photons required by existing light detection and ranging (LIDAR) techniques, which are commonly used for remote mapping or for measuring forest biomass, for instance.
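The time-of-flight relationship itself is simple: the detected photon travels to the object and back, so the range is half the round-trip time multiplied by the speed of light. A minimal illustration, with a made-up 13.3-nanosecond return:

```python
C = 3.0e8  # speed of light, in metres per second

def depth_from_round_trip(arrival_time_s):
    # The pulse covers the distance twice (out and back), hence the factor of one half.
    return 0.5 * C * arrival_time_s

print(depth_from_round_trip(13.3e-9))  # a 13.3-ns return implies a range of roughly 2 metres
```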
“The paper illustrates some remarkable examples of this new computational imaging technique and could point a future direction for a number of single-photon depth imaging approaches,” notes photonics expert Gerald Buller of Heriot-Watt University in Edinburgh, UK, who was not involved in the study.
Because the laser produces light of a single wavelength, the technique produces monochromatic pictures, but to some extent it can distinguish different materials by the rate at which they reflect the laser's colour. On average, darker regions must be hit by more pulses before a single photon is reflected back to the detector.
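Under the simple model sketched earlier, in which the per-pulse detection probability is proportional to reflectivity, the pulse count to the first photon is geometrically distributed, so its reciprocal gives a natural reflectivity estimate. The detection-efficiency constant below is again an assumption, not a measured value.

```python
import numpy as np

def estimate_reflectivity(pulse_counts, detect_prob=0.05):
    # A geometric distribution with per-pulse success probability p has mean 1/p,
    # so 1/N is the maximum-likelihood estimate of p from a single count N;
    # dividing out the assumed detector efficiency gives a relative reflectivity.
    return np.clip(1.0 / (detect_prob * np.asarray(pulse_counts, dtype=float)), 0.0, 1.0)

print(estimate_reflectivity([5, 100]))  # a dark patch (100 pulses) scores far lower than a bright one (5)
```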
To simulate real-world conditions, the researchers used an incandescent lamp that created a level of stray background photons roughly equal in number to those reflected from the laser. To eliminate this noise, the team used various algorithms, which enabled them to produce high-resolution, 3D images using a total of about one million photons. By comparison, an image of similar quality taken with a mobile-phone camera under office lighting would require a few hundred trillion photons, Kirmani calculates.
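Those photon budgets differ by roughly eight orders of magnitude, as a back-of-the-envelope comparison shows; "a few hundred trillion" is read nominally as 3 × 10^14, which is an assumption about the order of magnitude only.

```python
first_photon_total = 1e6    # about one million photons for the 3D image described above
phone_camera_total = 3e14   # "a few hundred trillion" photons, taken nominally as 3e14

print(f"roughly {phone_camera_total / first_photon_total:.0e} times fewer photons")
# -> roughly 3e+08 times fewer photons
```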
References
1. Kirmani, A. et al. Science http://dx.doi.org/10.1126/science.1246775 (2013).