Main

Fuzz and speckle patterns befit pajamas but are unwelcome in imaging experiments. Biology thus tends to interfere with imaging because living things often absorb, scatter or locally bend—aberrate—light, says Tom Bifano, who directs Boston University's Photonics Center. This aberration hinders an experimenter's clear view of cells inside a zebrafish embryo or neurons firing action potentials deep in a mouse's brain. A family of corrective optical measures, known as adaptive optics (AO), is one remedy1,2. Although today's commercial microscopes lack fully integrated AO, some manufacturers tell Nature Methods that this is no distant dream and they are adding facets of AO to their instruments. Labs can also assemble their own AO (see Box 1).

As she and her team try to understand neural circuits in mouse visual pathways, biology offers daily puzzles, says Na Ji, who leads a group at the Janelia Research Campus and is moving her lab to the University of California, Berkeley. AO was crucial in her team's ability to accurately characterize in vivo the stimulus selectivity of thalamic boutons—vesicle-filled presynaptic terminals—involved in signaling in the mouse visual cortex3. To image fine structures such as synapses at depth, “AO appears to be essential,” she says.

At some point in the not-too-distant future, AO will be an important part of biological imaging, says Peter Kner, a researcher and imaging methods developer at the University of Georgia. For now, he says, the options for biologists are far from “plug and play.” AO is emerging in microscopy; “I would say it's progressing,” says Nobel laureate Eric Betzig, who is at the Janelia Research Campus and moving to the University of California, Berkeley, in 2018. (He is Ji's spouse.) As is common in the early phases of technology development, he says, many approaches coexist and it's too soon to say which will persist and for which applications.

With adaptive optics, some, but not all, of the fixes in astronomy are transferable to biology. Credit: Philip Rosenberg/Getty

AO advances

Biologists can draw on astronomers' experience. When astronomers capture images of a distant star, the light rays have essentially traveled in parallel. Passing through the Earth's turbulent atmosphere bends them, says Bifano, distorting the wavefront—an imaginary surface perpendicular to the rays that should ideally be flat. AO helps to correct this. To measure the non-flatness, some light is sent to a wavefront sensor; a wavefront modulator such as a deformable mirror (DM) then applies the correction. AO in biology faces numerous challenges: aberrations do not arise in a single, well-defined layer; tissues and media differ in their refractive indices (RIs); illuminated tissue can scatter light in many directions; and the aberrations lie close to the imaged object, which limits the correctable field of view. Astronomers have some fixes and sophisticated approaches, says Betzig. “Some of it is transferable and unfortunately some of it is not and we have to kind of fumble our way forward.”
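
For readers who like to see the moving parts, here is a minimal Python sketch of the closed-loop idea described above. It assumes the wavefront sensor reports the aberration as Zernike-mode coefficients and that the deformable mirror accepts commands in the same modal basis; the function names, gain value and example coefficients are illustrative, not any instrument's API.

```python
import numpy as np

# Minimal sketch of one AO correction cycle, assuming the wavefront sensor
# reports the aberration as Zernike-mode coefficients (radians of phase) and
# the deformable mirror (DM) accepts commands in the same modal basis.
# Names and the gain value are illustrative, not any instrument's API.

def correction_step(measured_zernike, dm_command, gain=0.5):
    """Drive the DM toward cancelling the residual aberration the sensor sees.
    A gain below 1 keeps the closed loop stable when measurements are noisy."""
    return dm_command - gain * measured_zernike

# Example: a flat DM and an aberration dominated by a few low-order modes.
true_aberration = np.array([0.0, 0.0, 0.0, 0.8, -0.3, 0.0, 0.5, 0.2])
dm = np.zeros_like(true_aberration)
for cycle in range(5):
    residual = true_aberration + dm          # what the sensor would report
    dm = correction_step(residual, dm)
    print(f"cycle {cycle}: residual RMS = {np.linalg.norm(residual):.3f} rad")
```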

Biologists typically ask Bifano how AO can help them resolve a larger imaging volume, how to handle strongly scattering tissue and how to capture images quickly. The answers have to include the fact that progress on one front often means compromise on one or both of the others. For example, there are different light paths to contend with: the aberration of light is different at the center and the edge of an objective. Labs have optimized three-photon microscopy techniques and shown how AO improves systems' performance in deep imaging. Yet to obtain the needed illumination, the laser repetition rate has to be reduced, which shrinks either the image field of view or the imaging rate, he says. Some techniques allow single-shot, wide-field, extended-depth-of-field images at kilohertz rates, which could amplify the impact of voltage indicators and let researchers capture neural functions and longer-range interconnections. But these single-shot microscopy techniques are limited to near-surface regions in scattering tissue.

Despite the challenges, AO has immensely improved researchers' ability to image zebrafish embryos, for example, which hints at future possibilities, says Kner. Practically, AO takes collaboration and iteration. When he and his team work with biologists to image fixed Drosophila samples, the group determines the area to image, then corrects the aberration. His lab also adjusts algorithms to optimize the point spread function for given imaging conditions4.
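
Kner's specific algorithms are described in ref. 4; as a generic illustration of what "optimizing the point spread function" can mean in practice, the short Python sketch below scores an image with a simple sharpness metric, the kind of scalar such optimization routines try to maximize. The metric choice and the synthetic spots are assumptions for illustration only.

```python
import numpy as np

# A generic image-quality metric of the kind sensorless AO routines maximize
# when tuning the point spread function: the sum of squared intensities,
# which rises when the same amount of light is more tightly concentrated.
# This is an illustration, not Kner's published algorithm.

def sharpness(image):
    """Higher when the light in the image is more concentrated."""
    return float(np.sum(image.astype(float) ** 2))

# Synthetic example: a tight spot scores higher than a blurred one.
y, x = np.mgrid[-32:32, -32:32]
tight = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
blurred = np.exp(-(x**2 + y**2) / (2 * 8.0**2))
# Give both spots the same total signal so only the concentration differs.
tight /= tight.sum()
blurred /= blurred.sum()
print(sharpness(tight) > sharpness(blurred))  # True
```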

To image deeper with super-resolution microscopy, Kner and colleagues work on QSTORM AO, combining AO with stochastic optical reconstruction microscopy (STORM). Kner's team works on the STORM aspects, while Jessica Winter and her lab at Ohio State University handle the 'Q,' the quantum dots. Quantum dots offer more photons than fluorescent proteins or small-molecule dyes, says Kner, and “more photons give you higher resolution.” QSTORM AO might be used to image inside a Caenorhabditis elegans embryo with a field of view of a single cell and with 50-nanometer resolution. “That's sort of a dream,” he says.

AO correction schemes have to be designed with a particular microscopy approach in mind, says Kner. There is, for example, excitation and emission light to consider. In wide-field microscopy, an experimenter's main concern is the emitted light. But light traveling to and into the biological sample can also scatter, and light may not converge on the expected focal point.

Ji says she has been enamored of the possibilities of fast volumetric brain imaging with a Bessel focus and, when that is combined with AO, of maintaining high-resolution imaging deep inside the brain. She does her own work and collaborates with Betzig: his projects often involve cell biology, hers focus on neurobiology.

“Extraordinary” is how Bifano describes the approach by Ji, Betzig and Daniel Milkie of Coleman Technologies to overcome an important obstacle in wavefront sensing deep in tissue, by using an AO approach to send light to different segments of an objective's rear pupil5. It's an adaptation of coherent optical adaptive techniques from 1970s astronomy, he says.

Even though zebrafish embryos are more or less transparent, their small features appear blurred because of refractive-index differences between their bodies and their surroundings. “Betzig showed how well AO can fix this,” says Bifano. In both instances, he says, the corrected field of view is small, blur is reduced, the illumination is better focused and the signal intensity increases nonlinearly.

Indirect, direct sensing

Whether working in super-resolution or diffraction-limited techniques, AO can address aberrations that arise during imaging inside a developing embryo or other multicellular environment, says Betzig. He combines AO methods with lattice light-sheet microscopy to do so. “You need basically independent adaptive optics correction for the light sheet and the detection,” he says. His lab's prototype is “three complex microscopes stuffed together”: an AO microscope for excitation, an AO microscope for detection and a lattice light sheet in between. It's not yet ready for biologists to use, but he and his team are designing a simpler prototype that might be.

Aberrations differ from one sample location to the next, and a given correction is valid only within the isoplanatic patch. One way to correct beyond a single patch is a multi-conjugate AO approach, says Betzig. It works, but it's highly complex. In many regions of a zebrafish embryo, the isoplanatic patch will span around 50 microns, but it can be as small as 5–10 microns. That can suffice in cell biology but not for long-range neural connectivity questions. Generally, he says, experimenters image to determine which correction they “can get away with.” That correction will be valid for that specimen type and developmental stage. An experiment can involve capturing multiple corrected image volumes and then stitching them together.
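
A hedged sketch of the bookkeeping behind such patch-wise correction: divide the field of view into tiles no larger than the isoplanatic patch, record one wavefront correction per tile and reuse each correction when stitching the tiles into a volume. The tile size, the 12-mode correction and the measure_correction placeholder below are illustrative assumptions, not Betzig's published pipeline.

```python
import numpy as np

# Patch-wise correction bookkeeping: one stored wavefront correction per tile,
# with tiles no larger than the assumed isoplanatic patch. Illustrative only.

FIELD_UM = (200, 200)      # field of view, microns (assumed)
PATCH_UM = 50              # assumed isoplanatic patch size, microns

def tile_origins(field, patch):
    """Yield the (y, x) origin of each correction tile, in microns."""
    for y in range(0, field[0], patch):
        for x in range(0, field[1], patch):
            yield (y, x)

def measure_correction(origin):
    """Placeholder for measuring the aberration at one tile (for example, with
    a guide star parked at the tile center); returns 12 Zernike coefficients."""
    rng = np.random.default_rng(hash(origin) % (2**32))
    return rng.normal(scale=0.3, size=12)

corrections = {origin: measure_correction(origin)
               for origin in tile_origins(FIELD_UM, PATCH_UM)}
print(f"{len(corrections)} tiles, one stored correction each")   # 16 tiles
```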

AO will be an important part of biological imaging in the not-too-distant future, says Peter Kner. Credit: C. Wright

“AO is all about feedback,” says Betzig: a scientist makes a measurement and adjusts the wavefront with a deformable mirror or other wavefront-modulating element. When selecting a correction method, researchers will want to consider how fast they can make many corrections across a field of view and how long it takes to measure the aberration. Too much aberration measuring risks bleaching the sample. Indirect wavefront sensing involves multiple measurements to determine which correction works, and those can “burn up” a specimen's photon budget. Direct wavefront sensing is photon-efficient and fast, he says; “obviously I have a bias.”

Direct wavefront sensing works well in the largely transparent C. elegans and zebrafish embryos that Betzig works with. The approach involves a Shack–Hartmann (SH) wavefront sensor, which is fast and lets an experimenter track aberration as an organism develops, says Ji. But direct wavefront sensing is more technically demanding and pricier to set up. Labs need a sensor, a DM and a laser for generating a guide star, which acts as a reference for the wavefront measurement; together these can add up to over $150,000. Eventually, she says, microscope companies will provide good, integrated direct wavefront-sensing solutions.
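
The core computation behind a Shack–Hartmann sensor can be sketched compactly: each lenslet's spot displacement is proportional to the local wavefront slope, and a least-squares fit against the slopes predicted by each Zernike mode recovers the modal coefficients. In the Python sketch below, the calibration matrix is random stand-in data; a real sensor's matrix comes from calibration, and the mode choices are illustrative.

```python
import numpy as np

# Shack-Hartmann sketch: measured spot displacements give local wavefront
# slopes; a least-squares fit recovers Zernike coefficients. The "calibration"
# matrix here is random stand-in data, not a real sensor's calibration.

n_lenslets, n_modes = 64, 10
rng = np.random.default_rng(0)

# Rows: x- and y-slopes at each lenslet; columns: Zernike modes.
slope_matrix = rng.normal(size=(2 * n_lenslets, n_modes))

true_coeffs = np.zeros(n_modes)
true_coeffs[3], true_coeffs[6] = 0.8, -0.4          # two low-order modes, say

measured_slopes = slope_matrix @ true_coeffs + rng.normal(
    scale=0.05, size=2 * n_lenslets)                 # add measurement noise

# Least-squares wavefront reconstruction from the measured slopes.
recovered, *_ = np.linalg.lstsq(slope_matrix, measured_slopes, rcond=None)
print(np.round(recovered, 2))
```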

Indirect wavefront sensing is typically slower, and it can be less accurate, but it has fewer hardware requirements than direct sensing. Mainly, labs need a wavefront modulator, which costs around $20,000. SH sensors work well in samples that do not scatter much, says Ji. It has been possible to apply direct wavefront sensing with an SH sensor and a near-infrared fluorescent guide star, which is less affected by tissue scattering6. But she and her team try to avoid SH sensors so they can devise methods that work in both transparent and opaque samples.
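
One common flavor of indirect ('sensorless') sensing can be sketched as follows: for each Zernike mode, apply a few trial amplitudes with the wavefront modulator, record an image-quality metric, fit a parabola and keep the amplitude at its peak. The quadratic stand-in metric, the mode count and the trial amplitudes below are assumptions for illustration; the exposure counter makes the photon-budget point concrete.

```python
import numpy as np

# Indirect (sensorless) sensing sketch: per mode, try a few amplitudes with
# the wavefront modulator, score each resulting image, fit a parabola and
# keep its peak. The metric below is a quadratic stand-in for a real image.

TRUE = np.array([0.0, 0.6, -0.4, 0.2])     # unknown aberration, 4 modes

def image_metric(correction):
    """Stand-in for image quality, modeled (as sensorless-AO treatments often
    assume) as quadratic in the residual aberration; peaks when it is cancelled."""
    return 1.0 - np.sum((TRUE + correction) ** 2)

correction = np.zeros_like(TRUE)
exposures = 0
for mode in range(len(TRUE)):
    trials, scores = np.array([-0.5, 0.0, 0.5]), []
    for amp in trials:
        probe = correction.copy()
        probe[mode] += amp
        scores.append(image_metric(probe))
        exposures += 1                        # each trial image costs photons
    a, b, _ = np.polyfit(trials, scores, 2)   # fit a parabola to the scores
    correction[mode] += -b / (2 * a)          # move to the parabola's vertex
print(np.round(correction, 2), f"({exposures} exposures spent)")
```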

With direct sensing, a guide star helps to directly measure sample response7. Beads make good guide stars, but they are hard to deliver, and one cannot be certain where in the organism the bead will go, says Betzig. Some labs use peroxisomes for guide stars. These options “are really limiting,” he says. In his lab, using two-photon imaging to create a guide star “works great,” he says.

The research community has to show that new biology is enabled when AO is applied, says Na Ji. Credit: M. Staley, Janelia Research Campus

Aberrations in biological samples can get complex. Imaging dynamics change over time, so corrections applied to the image tiles need to be updated. Nuclei have a higher refractive index than cytosol, which makes imaging tissue with many nuclei like imaging a bag of marbles, says Betzig. Even in an optically tractable specimen like a zebrafish embryo, light can be scrambled, which confuses the wavefront sensor. Instead of the guide star, the sensor 'sees' a speckle pattern—“a really complex, ugly speckle pattern,” he says. He and colleagues have found possibilities for improvement by moving the guide star spot around, then averaging out the speckles8. “What's left is something the sensor can handle,” says Betzig. The approach does not deliver perfect correction, but it's more useful than leaving the guide star in one spot.
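
The averaging idea can be caricatured in a few lines: acquire wavefront-sensor readings while the guide-star focus is moved to nearby positions, so that the speckle, which changes with every position, averages away while the aberration-induced spot shift survives. The numbers below are synthetic, and for simplicity the sketch averages spot centroids rather than the raw sensor frames used in ref. 8.

```python
import numpy as np

# Speckle averaging sketch: the speckle-induced error changes with every
# guide-star position, so averaging many readings leaves the stable,
# aberration-induced spot shift. Purely synthetic numbers.

rng = np.random.default_rng(1)
true_spot_shift = np.array([1.2, -0.8])     # pixels, set by the aberration

def spot_centroid(guide_star_position):
    """One lenslet's spot centroid for one guide-star position: the true shift
    plus a large speckle error that differs from position to position."""
    speckle_error = rng.normal(scale=2.0, size=2)
    return true_spot_shift + speckle_error

single = spot_centroid(0)
averaged = np.mean([spot_centroid(i) for i in range(100)], axis=0)
print("single reading :", np.round(single, 2))
print("100-reading avg:", np.round(averaged, 2))   # close to (1.2, -0.8)
```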

With highly scattering samples, the image can become “just a fuzz,” says Betzig. That's when labs can try indirect wavefront sensing or serial direct wavefront sensing, which he and Ji co-developed and which is suited for imaging deep in the mouse brain, for example. It can be combined with red fluorescence from fluorescent proteins or injected red dye to reach greater depths6.

There is a continuum between a well-defined guide star and a fuzz. Researchers sometimes talk about the ballistic component of light, says Betzig—the component that focuses, as opposed to the component that scatters into fuzz. How much light is ballistic and how much fuzz develops depend on factors such as imaging depth and wavelength. With confocal microscopy, indirect or direct sensing can work, but the approach is defeated when there is much scattering.
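
The depth and wavelength dependence can be put in rough numbers: the ballistic fraction falls approximately as exp(-depth / scattering mean free path), and that mean free path grows with wavelength. The scattering lengths in the sketch below are illustrative orders of magnitude for brain tissue, not measured values from this article.

```python
import numpy as np

# Back-of-the-envelope: the ballistic (still-focusable) fraction of light
# falls roughly as exp(-depth / scattering mean free path), and the mean free
# path grows with wavelength. Lengths are illustrative orders of magnitude.

scattering_length_um = {"920 nm (2-photon)": 150, "1300 nm (3-photon)": 300}

for label, ls in scattering_length_um.items():
    for depth in (300, 600, 900):                     # imaging depth, microns
        ballistic = np.exp(-depth / ls)
        print(f"{label:>18}: depth {depth} um -> ballistic fraction {ballistic:.3f}")
```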

AO promises to take the researcher's gaze deep into a sample, but how deep, exactly, is a moving target, says Ji. It depends on the labels—green, red or near-infrared—and the label density. It's easier to image deeply in more sparsely labeled samples, and three-photon fluorescence microscopy images more deeply than two-photon approaches. And it also depends on how much power a lab is willing to deposit into a sample. Imaging with AO can give a better and deeper view than imaging without AO. For imaging the mouse brain, she expects AO will offer a view hundreds of microns deeper than conventional imaging does.

Ideally, a microscope converges a planar wavefront (red line in a) into a spherical wavefront (red semicircle). The sample can scatter light rays, distorting the wavefront (b). Optical elements can cancel these aberrations (c). Credit: Ji lab, Janelia Research Campus

Ji applies both indirect and direct methods. In her in vivo imaging experiments, indirect wavefront sensing works because aberration stays constant over the course of an experiment. Calcium imaging experiments in the brain might last for hours with neurons filled with bright, photostable indicators that can be repeatedly imaged without apparent damage or bleaching. “In such samples, if aberration measurement takes a couple of minutes, it's not a big deal at all,” she says. But when labs image a few single molecules in vivo, it becomes important to limit light exposure. “The realist in me is happy to see any examples where image quality improves after AO correction, but the purist in me wants to know if the method used has fully corrected the aberration within the constraint for the correction device. And if not, why not?” She wishes that labs would always test their method by introducing several known, relatively complex aberrations into the system and checking whether their method can correct them.
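
The sanity check Ji wishes for can be sketched simply: command a known, relatively complex aberration onto the correction device, run the correction method under test and report the residual. In the Python sketch below, run_correction_method is a placeholder standing in for whatever routine is being validated, and the 90% recovery figure is an arbitrary stand-in for an imperfect method.

```python
import numpy as np

# Validation sketch: introduce a known aberration on the correction device,
# run the method under test, and report how much of it remains.
# run_correction_method is a placeholder, not a real AO routine.

rng = np.random.default_rng(7)
known_aberration = rng.normal(scale=0.5, size=15)    # Zernike modes, radians

def run_correction_method(introduced):
    """Placeholder for the AO routine being validated; here it recovers 90%
    of each mode, standing in for an imperfect real method."""
    return -0.9 * introduced

correction = run_correction_method(known_aberration)
residual = known_aberration + correction
print(f"introduced RMS {np.linalg.norm(known_aberration):.2f} rad, "
      f"residual RMS {np.linalg.norm(residual):.2f} rad")
```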

As a community, says Ji, researchers have to demonstrate that AO makes a difference such that “new biology can be enabled when and only when AO is applied.” That approach goes beyond just showing that AO leads to nicer-looking images. This matters especially given that brighter and longer-wavelength fluorescent probes, as well as longer-wavelength excitation schemes, are letting scientists do in vivo imaging at increasing depths without using AO. “If we cannot show biology relevance, eventually our work will stay confined to the optical domain,” she says. That would be useful, but disappointing, because she and others want to see potential impact on biology. “Biology is a lot more complex and open-ended than technology development, and there is always so much to learn about the brain,” she says. “I am glad that I am doing both so that I never have a boring day.”

Mitochondria (magenta) and the plasma membrane (green) in a cell 150 μm deep in the zebrafish hindbrain imaged without AO (top) and with AO and deconvolution (bottom). Credit: Betzig lab, Janelia Research Campus

Commercial views

A number of companies, including Imagine Optic and Thorlabs, sell optical components for labs building AO on their own. Thorlabs also has an AO kit, which came together in a partnership with Boston Micromachines, a company Bifano co-founded. With the kit, his company sought to help lower the cost barrier for researchers, says Thorlabs senior research engineer John Taranto. Boston Micromachines' deformable mirrors complemented the wavefront sensors Thorlabs was selling, as well as its optomechanical products and laser sources. “Our kits are primarily educational in nature,” says Taranto, intended for scientists starting out in AO. Users tend to integrate components such as wavefront sensors and deformable mirrors into their own optical systems, such as a multiphoton microscope.

Today's scientists won't have to wait a lifetime for commercial microscopes with integrated AO, says Ingo Kleppe, a researcher in Zeiss's advanced development unit. Future advanced commercial microscopes will have integrated AO. Such instruments need to be reliable for the greatest number of samples, work robustly and deliver significant enhancements. “This is where scattering plays a big role,” says Kleppe. If scattering is the dominant aberration challenge in labs, that can limit the applications and the extent of AO-based enhancements.

Leveraging AO is an exciting tactic across the life sciences, especially in neuroscience, where researchers are increasingly showing what can be gained from deeper in vivo brain imaging, says Kleppe. This is motivation to keep exploring how AO can improve imaging at greater depths and address aberration and scattering. Multiphoton microscopy and tissue clearing have helped address some challenges. Scattering is usually mathematically described in static ways so as to reduce the complexity of a chosen correction, he says. Research on optical phase conjugation shows how to correct for scattering but also reveals its limits in terms of field of view, depth and computational demands.

Top left: a wide-field image of a fruit fly nerve cord bouton without AO correction. Top right: the corresponding STORM image. Lower left: a wide-field image of a fruit fly brain lobe soma before wavefront-corrected STORM-AO; lower right: the reconstructed STORM image. Credit: Kner Lab, Univ. Georgia; reproduced from ref. 4 courtesy of The Optical Society.

Kleppe and his team have been improving the Zeiss LSM 880 laser-scanning microscope with Airyscan. The details are proprietary, but he says that the Airyscan detector in nonlinear optics mode yields higher-quality images at greater tissue depths than was previously possible with the standard mode. Further enhancements await as the developers work out which aberrations AO can handle best.

AO lets scientists reduce aberration and increase accuracy in their imaging experiments, says Hilmar Gugel, who manages optics development for laser-scanning microscopy at Leica Microsystems. But, he says, scientists tend to overrate the possibilities of AO and underestimate the difficulties. Many sample-induced aberrations are not independent of the image field, which means that the aberrations can differ from one image point or pixel to the next. “This means the wavefront needs to be corrected for each individual pixel, which is time consuming and slows down the imaging speed significantly,” he says.

When using AO to reduce the effect of wavefront distortions, one prominent facet of the problem is spherical aberration due to refractive-index mismatch between, say, the specimen and the immersion medium, says Gugel. Such issues can be addressed with correction collars. An increasing number of Leica objectives now have motorized collars, which the company introduced in 2011. These enable automated depth-dependent correction of spherical aberration, he says. Full AO would enhance image quality further by correcting field-dependent wavefront distortions. Implementing those corrections now, however, would hinder fast scanning in Leica's confocal microscopes, such as the fast galvanometer-based scanning with line frequencies up to 12 kHz.

Specimen-induced aberrations matter more than the aberrations of high-end microscope optics, says Gugel, especially when imaging at depth. If those can be reduced and the limitations to implementing AO can be overcome, higher image quality will result.

The scattering properties of brain tissue can cause loss of focus in the stimulation light as well as in the emitted fluorescence, which changes the local power density, says Brendan Brinkman, senior marketing manager for life science microscopy at Olympus America. Multiphoton stimulation in particular benefits from optimized corrections for refractive-index mismatches and scattering, because photon sensitivity and activation are much more sensitive to local power density at multiphoton wavelengths than at visible stimulation wavelengths, he says.

More detail is visible at greater depths when the laser-scanning microscope with Airyscan runs in nonlinear optics mode, says Zeiss's Ingo Kleppe. Here, phalloidin-stained muscle in a cricket embryo. Credit: S. Donoughe, C. Extavour, Harvard Univ.; S. Gliem, Carl Zeiss Microscopy

The Olympus approach is to use correction collars in microscope objectives. In late 2017, the company released an automatic correction collar adjuster for the Olympus FVMPE-RS multiphoton laser-scanning microscope. This adjuster optimizes the correction collar using a contrast method to correct for RI mismatch and scatter in the sample, says Brinkman. Applying the correction at each plane of a z-stack can optimize imaging throughout the sample volume. “This has been tricky in the past since adjustments to correction collars typically shift the focal plane slightly, making it a challenge to do full image reconstruction when the correction collar needs to be changed at multiple depths through the sample,” he says. Now scientists can adjust for RI and scattering in each plane throughout the volume. “Of course full AO allows for RI adjustment across a given plane, not just as an average.”
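
A hedged sketch of what such contrast-driven collar tuning can look like: at each depth, step the collar through a range of settings, score image contrast and keep the best one. The set of collar positions, the fake contrast curve and the depth-proportional optimum below are assumptions for illustration, not the Olympus FVMPE-RS interface.

```python
import numpy as np

# Contrast-driven correction-collar tuning through a z-stack: at each depth,
# step the collar over a range, score contrast, keep the best setting.
# acquire_image_contrast and its depth-proportional optimum are placeholders.

def acquire_image_contrast(depth_um, collar_pos):
    """Placeholder: contrast peaks when the collar matches a depth-dependent
    optimum (here, simply proportional to depth)."""
    optimum = 0.002 * depth_um
    return float(np.exp(-((collar_pos - optimum) ** 2) / 0.01))

def tune_collar(depth_um, positions=np.linspace(0.0, 1.0, 21)):
    """Return the collar setting that maximizes contrast at this depth."""
    scores = [acquire_image_contrast(depth_um, p) for p in positions]
    return positions[int(np.argmax(scores))]

for depth in (50, 150, 300):                               # microns
    print(f"depth {depth} um -> collar setting {tune_collar(depth):.2f}")
```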

The correction collar adjustments maximize signal or contrast in a single plane, says Brinkman. Full AO, with automatic sensing, would add speed and flexibility, especially in sample areas with non-uniform RI. The speeds of laser-scanning microscopes and deformable mirrors certainly lend themselves to this, he says. Automatic feedback, whether through indirect or direct sensing, “could be a real value,” he says. But the complexity of such solutions, especially when fully optimized based on calculations and theory, can be a barrier. “It may still take some time, but the promise of AO would be worth the wait.”