Light microscopy is undergoing a renaissance, with a huge range of tools and techniques for gathering biological data with unprecedented speed and resolution. Michael Eisenstein takes a closer look.
Confocal microscopy is now a well-established technique for the three-dimensional imaging of cellular structures. But despite its success, the technique has its limitations when imaging live cells. The scanning process can greatly reduce imaging speed, for example, and the powerful lasers involved can damage the cells. Some manufacturers have addressed these problems by designing alternative systems such as 'restoration' microscopy (see 'Achieving clarity', below), whereas others have strived to improve confocal technology. Leica Microsystems of Wetzlar, Germany, for instance, still uses standard point-rastering in its high-end TCS SP5 confocal instrument, but it incorporates a resonant scanner for real-time 'true confocal' imaging with little cell damage.
An alternative approach uses a rapidly rotating disk, called a Nipkow spinning disk, which has numerous apertures to illuminate hundreds of spots simultaneously. This allows faster imaging with reduced photobleaching, although it does suffer from some loss in resolution. PerkinElmer Life and Analytical Sciences of Boston, Massachusetts, was among the first companies to develop instruments using this technology. Its UltraVIEW ERS microscope uses parts from leading Nipkow-disk manufacturer Yokogawa in Tokyo, Japan. The system is designed for live-cell confocal imaging and can be fitted with a sensitive electron-multiplying CCD (EMCCD) camera for detection. “This offers frame-rates in excess of 100 frames per second, if you're looking at relatively small areas in your sample,” says product leader Paul Orange. Olympus of Tokyo also makes a spinning-disk system, called the DSU, which features disks with slits instead of holes. Five different disks are available that vary in terms of slit number and spacing. “You can match the slit spacing to the numerical aperture of your objective,” says product manager Nicolas George. The DSU can also incorporate high-resolution EMCCD cameras for imaging live specimens at up to 150 frames per second.
The LiveScan SFC from Nikon Instruments in Melville, New York, uses arrays of pinholes or slits for multipoint imaging. These remain stationary while mirrors sweep the beam spots over the sample — a process known as swept-field confocal imaging, developed by Prairie Technologies of Madison, Wisconsin. Nikon is also introducing controlled light emission microscopy, which uses feedback from the detection process to modulate laser intensity to prevent oversaturation or unnecessary illumination of signal-free regions. “You sacrifice a little bit of temporal speed,” says Stan Schwartz, vice-president of Nikon's microscopy division, “but the casual user can make correct and beautiful images, and you significantly reduce photobleaching and phototoxicity.”
The LSM 5 LIVE from Carl Zeiss MicroImaging in Jena, Germany, uses a line-scanning approach that can image a 512 × 512-pixel area at up to 120 frames per second, or even faster for smaller sections, which means that rapid physiological events can be visualized. According to Bernhard Zimmermann, head of product management at Zeiss, this shift from point- to line-scanning has minimal impact on resolution for most studies. “If you're looking at intracellular vesicles, you will hardly see a difference,” he says.
Adventures in the nth dimension
These days a confocal microscope has to do more than simply track fluorophores. Multidimensional imaging is microscopy's new catchphrase, and the latest instruments gather information that goes beyond basic spatial orientation to provide time-lapse images and detailed fluorescence profiles.
The expansion in the range of available fluorescent labels means far more biological information can be colour-coded, and manufacturers have responded with advanced strategies for enhancing spectral resolution. For example, the LSM 510 META from Zeiss uses a 32-channel photomultiplier array behind an optical grating to generate full spectral signatures for each pixel; this is followed by further software-based separation. “You can even separate fluorescent emissions that peak at the same wavelength,” says Zimmermann, “as long as the individual emission curves are different.” Nikon offers a similar strategy with its C1si confocal instrument, which can achieve spectral resolution as low as two nanometres.
Leica enhances confocal spectral precision with proprietary technologies that include the acousto-optical beam-splitter, which controls excitation illumination in a filter-free manner, and the SP module, which uses a prism-based approach for precise spectral separation. These will be enhanced by its forthcoming introduction of white-light lasers, which provide coherent, continuous-spectrum illumination. “This will give us total freedom in excitation and detection for all available dyes,” says Frank Olschewski, Leica's manager for confocal product development.
Fluorescence lifetime imaging (FLIM), which involves measuring the emission decay rate for an excited fluorophore, is also gaining interest. It can, for example, improve the quality of fluorescence resonance energy transfer (FRET) experiments. “When there is a FRET interaction, the donor lifetime gets shortened — this is an absolute, and it doesn't depend on the quantity or intensity,” says John White of the Laboratory for Optical and Computational Instrumentation (LOCI) at the University of Wisconsin at Madison. “If you're doing other types of FRET measurements there are other reasons why you might get reduction in intensity or increased intensity, such as differential bleaching or compartmentalization.” FLIM requires extremely high-speed photon detectors; Becker & Hickl of Berlin was an early leader in this regard, and Zeiss uses these detectors in its instruments. Leica also offers a FLIM attachment for its confocal instruments, which takes advantage of the SP detection system to provide 'spectral FLIM', with precise wavelength selection.
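White's point about lifetime-based FRET being "absolute" can be made concrete with a back-of-envelope sketch (the lifetime values below are hypothetical examples, not measurements from any instrument described here):

```python
# Illustrative calculation: FRET efficiency from fluorescence lifetimes,
# the quantity that FLIM-based FRET measurements recover. Because it is a
# ratio of lifetimes, it is independent of fluorophore concentration and
# excitation intensity -- the advantage White describes.

def fret_efficiency(tau_donor_alone, tau_donor_with_acceptor):
    """E = 1 - tau_DA / tau_D: the fractional shortening of the donor
    lifetime in the presence of an acceptor gives the energy-transfer
    efficiency."""
    return 1.0 - tau_donor_with_acceptor / tau_donor_alone

# A GFP-like donor with a 2.5 ns unquenched lifetime, shortened to 1.5 ns
# when a FRET acceptor is nearby (hypothetical numbers):
E = fret_efficiency(2.5, 1.5)
print(f"FRET efficiency: {E:.2f}")  # prints "FRET efficiency: 0.40"
```

An intensity-based FRET measurement with the same fluorophores would need careful corrections for bleaching and expression levels; the lifetime ratio sidesteps both.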
Standard fluorescence microscopes produce a focal spot that is ovoid rather than spherical. So although high-end instruments can produce images with x–y resolution of up to 180 nanometres, the resolution is considerably poorer along the z axis — around 500 to 800 nanometres — which reduces the quality of three-dimensional reconstructions.
One way round this uses two objective lenses, where the interference between the two optical wavefronts produces an effectively spherical focal spot. Stefan Hell and his colleagues at the Max Planck Institute for Biophysical Chemistry in Göttingen, Germany, demonstrated the effectiveness of this approach with their confocal 4Pi instrument, which can achieve a resolution of 100 nm in all directions. The original 4Pi design was relatively slow, barring its use in live specimens, but Hell and his team have since developed a multifocal, multiphoton version — MMM-4Pi — that scans samples with 64 foci, and images at a rate that makes high-resolution three-dimensional imaging of live cells possible. A 'user-friendly' commercial version of Hell's 4Pi system, the TCS 4Pi, is now available from Leica.
A related method, I5M, was developed by Mats Gustafsson of the University of California, San Francisco. I5M also achieves axial resolution below 100 nm, but is based on a wide-field configuration, with interference taking place over the entire field of view rather than at a point, and can be faster and potentially brighter than confocal 4Pi. Interference imaging produces artefactual 'lobes', which must be removed by deconvolution; these tend to be more prominent with I5M, and lobe removal is made easier by imaging with lenses that have a high numerical aperture. Such lenses are typically oil-immersion, and use of I5M is presently restricted to fixed specimens. Gustafsson's group has also developed a variant technique, I5S, which incorporates structured illumination to surpass the 'diffraction limit' in the image plane. Together, techniques such as I5S and confocal 4Pi represent important early steps in 'super-resolution' microscopy, an increasingly vibrant area of research and technological development (see 'Thinking big, seeing small').
The real rising star of imaging is multiphoton microscopy (MPM), a nonlinear imaging method. In MPM, target molecules are stimulated by a pair (or more) of low-energy photons virtually simultaneously, providing sufficient energy in combination to induce excitation and photon emission. This approach uses long-wavelength, near-infrared pulsed lasers that can penetrate deep into tissues with heavy scattering properties and that are less likely to produce phototoxic side effects. Additionally, multiphoton excitation results in dramatically reduced background fluorescence. Price remains a major obstacle to adoption, as pulsed lasers are extremely expensive, but those who have tried the method generally walk away impressed. “I think it's the best technique in town for looking deep into solid tissue and figuring out the dynamics of what's going on,” says White.
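The energetics behind multiphoton excitation come down to simple arithmetic: photon energy scales inversely with wavelength, so two near-infrared photons absorbed almost simultaneously deliver the same energy as one photon at half the wavelength. A minimal sketch (the 900 nm/450 nm pairing is an illustrative example, not a spec of any instrument above):

```python
# Back-of-envelope illustration of two-photon excitation: two long-wavelength
# photons combine to supply the energy of a single short-wavelength photon.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s

def photon_energy(wavelength_nm):
    """Energy of one photon, E = h*c / lambda, in joules."""
    return h * c / (wavelength_nm * 1e-9)

# Two 900 nm near-infrared photons, absorbed near-simultaneously...
combined = 2 * photon_energy(900)

# ...carry the energy of one 450 nm photon, the sort that would excite the
# same fluorophore in conventional one-photon microscopy.
ratio = combined / photon_energy(450)
print(f"energy ratio: {ratio:.1f}")  # prints "energy ratio: 1.0"
```

Because excitation requires two photons to arrive within femtoseconds of each other, it occurs appreciably only at the focal spot where photon density is highest, which is why out-of-focus background and phototoxicity are so sharply reduced.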
Zeiss currently controls the patent for femtosecond pulsed-laser MPM, and has made this technology available with its LSM 510 NLO. Several companies have also sublicensed this technology, and Olympus has made such an arrangement for its FV1000-MPE instrument, which was launched earlier this month. Leica also offers an MPM configuration for the TCS SP5, based on a separate patent that makes use of picosecond-pulse lasers.
Current MPM systems offer reasonable speed for live-cell work, but LaVision's TriMScope unit further improves imaging frame-rates with a multifocal approach that simultaneously scans samples with up to 64 foci, enabling the imaging of a 1004 × 1002-pixel field at 31 frames per second. TriMScope can also be adapted to perform FLIM or second-harmonic generation imaging, another nonlinear method gaining interest for its label-free imaging capabilities.
Scratching the surface
Some of the most interesting biological events take place near the cell surface, and for imaging beauty that goes only membrane deep, total internal reflection fluorescence (TIRF) microscopy is a strong option. A sample on a glass coverslip is illuminated either through a prism or an objective lens of high numerical aperture. When light enters the coverslip at an angle greater than the critical angle, as determined by the ratio of the refractive indices of the specimen and the glass, the incident beam does not enter the specimen, but instead produces an evanescent electromagnetic field. This field can excite fluorophores close to the surface (within about 200 nm) of the coverslip, allowing non-destructive, live-cell imaging at the cell membrane with single-molecule resolution.
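The geometry can be sketched with Snell's law: total internal reflection sets in at the critical angle θc = arcsin(n_specimen/n_glass), and the evanescent field decays exponentially with a characteristic depth of roughly λ/(4π)·(n₁²sin²θ − n₂²)^(−1/2). The refractive indices, wavelength and incidence angle below are typical illustrative values, not figures from the article:

```python
# Illustrative TIRF geometry: critical angle and evanescent-field
# penetration depth. All numerical values are assumed typical values.
import math

n_glass = 1.518     # refractive index of a glass coverslip (assumed)
n_cell = 1.37       # approximate refractive index of cytoplasm (assumed)
wavelength = 488.0  # excitation wavelength in nm (assumed)

# Snell's law: beyond this angle the beam is totally internally reflected.
theta_c = math.degrees(math.asin(n_cell / n_glass))

# Penetration depth of the evanescent field for an incidence angle a few
# degrees past critical (70 degrees here, chosen for illustration):
theta = math.radians(70.0)
depth = wavelength / (4 * math.pi) * (
    n_glass**2 * math.sin(theta)**2 - n_cell**2) ** -0.5

print(f"critical angle ~ {theta_c:.1f} deg, penetration depth ~ {depth:.0f} nm")
```

With these assumed values the critical angle comes out near 64–65° and the penetration depth is on the order of 100 nm, consistent with the roughly 200 nm excitation zone described above: only fluorophores hugging the coverslip are lit up.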
Most major microscope manufacturers now offer multi-wavelength TIRF modules for their inverted microscopes. These typically benefit from precise computer control of the angle and position of the excitation beam. Zeiss offers a compact, stand-alone option with its Laser TIRF system. TIRF instruments typically use oil-immersion lenses with a numerical aperture of 1.45, an effective minimum for good imaging. Recently, Nikon broke this barrier with a set of Plan Apo 60× and 100× magnification objectives. “We achieved a numerical aperture of 1.49,” says Schwartz, “and that's just under the theoretical limit for imaging with glass.”
Putting it all together
Modern imaging experiments have become very complex, generating huge amounts of multidimensional data. Although microscope manufacturers will typically provide reasonably effective tools for image processing and analysis, many users also opt for third-party software with more extensive analytical capabilities.
MetaMorph from Molecular Devices of Sunnyvale, California, integrates hardware control with a large toolbox of resources for multidimensional imaging, including a range of 'application modules' for image processing. “These are specific segmentation and analysis modules where, rather than having the user write macros, we've automated the entire process by having one dialogue box that helps segment your image and gives you relevant measurements,” says product manager Magali Tranié.
Imaris from Bitplane in Zurich, Switzerland, also uses a module-based approach for segmentation and analysis of multidimensional images. “You can click on any object that you see and immediately, in the same screen, get the statistics for that particular object,” says vice-president and director of sales Michael Wussow. “The same is true in reverse — we have an interactive sorting tab, where you move a bar on a histogram and things will appear or disappear depending on whether or not they meet your criteria.” Bitplane also allows users to code their own routines, and hosts a number of user-generated plug-ins on its webpage.
Improvision of Coventry, UK, recently released the latest version of its Volocity package, which consists of four products — Acquisition, Visualization, Quantitation and Restoration — that can also be integrated. Among Volocity's strengths are its object detection and tracking capabilities, and powerful rendering techniques. “Every voxel within the image set is computed by the Volocity rendering engine,” says senior marketing specialist Nicky Francis. “So it isn't a surface rendering technique, but a fully interactive volume-rendering technique. This makes it much more true-to-life.”
Numerous other powerful commercial options are available, but the academic community has also stepped up to provide a variety of open-source solutions. Among the most popular is ImageJ, developed at the US National Institutes of Health by Wayne Rasband. ImageJ is a multipurpose imaging tool, maintained by Rasband but powered by a highly active user community. “There are about 400 plug-ins on our webpage, contributed by more than 100 people,” says Rasband. Another popular program is VisBio, developed by the LOCI's Curtis Rueden for working with multidimensional data. According to Kevin Eliceiri, co-director of the LOCI, such tools complement commercial products. “The goal is to fill in the gaps,” he says. “There are needs that often cannot yet be met by the commercial community because they either represent too small a market or are emerging techniques.” Such projects have also meshed with the efforts of the Open Microscopy Environment (OME) to develop tools that aid collection and sharing of complex image data (see 'Tower of Babel').
Software — and hardware — requirements will only grow more severe as scientists attempt to coax increasingly complex data from smaller numbers of photons. Technology aside, the underlying challenges for this field remain clear and simple. “There are three issues in microscopy that never change — I want the best resolution, I want the best sensitivity and I want the best speed,” says Nikon's Schwartz. “And it's really difficult to get all three.”
Eisenstein, M. Something to see. Nature 443, 1017–1019 (2006). https://doi.org/10.1038/4431017a