As a worm embryo nears hatching, its newly formed neurons and muscles start to twitch, sometimes as fast as ten times a second. This jiggling miracle is a nightmare for microscopists, explains Hari Shroff at the US National Institute of Biomedical Imaging and Bioengineering. Visualizing the full volume of the developing embryo requires taking a series of horizontal slices from top to bottom, and unless the camera takes pictures faster than the embryo twitches, cells appear to jerk from one side of the embryo to the other.

Derek Toomre at Yale University is studying much smaller cellular events: how intracellular vesicles arrive at and fuse with the cell membrane, where they release signals to neighboring cells. The entire process takes less than a second, and capturing it requires exposures of just a few milliseconds.

Eric Betzig at Janelia Farm, part of the Howard Hughes Medical Institute, is inventing new kinds of microscopy that resolve details previously inaccessible to fluorescence microscopy. He and colleagues recently described the use of flat sheets of light to make three-dimensional images of chromosomes in dividing cells1, a technique that requires taking over 200 high-resolution images per second. For this to work, cameras have to keep up with molecular movements inside cells, he says. “As your spatial resolution improves, the temporal resolution has to improve at the same rate. If I want 50-nanometer resolution instead of 200-nanometer, and I don't want smears, I need to snap my cameras four times faster.”

These are just three of many scientists turning to cameras equipped with new, faster imaging chips called sCMOS sensors. (CMOS, for complementary metal-oxide semiconductor, refers to a way of manufacturing integrated circuits that has been in use since the 1960s; the term 'scientific CMOS', or sCMOS, was first used around 2009 to refer to advances in design, fabrication and performance.)

sCMOS sensors are making a difference, says Toomre. “Before, you got about a dozen small images, and now you get a hundred huge ones. We can see small movements and local jiggling that wouldn't be possible without the high temporal resolution.” The faster frame rate will also allow Shroff to speed through volumetric imaging on larger zebrafish embryos, which are more than ten times the diameter of worm embryos.

Ruffles at the membrane of a living COS-7 cell in a Bessel beam plane illumination microscope. Credit: L. Gao, Betzig lab

It is not just the faster frame rate that makes new kinds of experiments possible, says Philipp Keller at Janelia Farm, who is developing microscopy to reconstruct cell dynamics as zebrafish and fruit fly embryos grow from one to tens of thousands of cells. sCMOS sensors combine advantages that previously had been considered mutually exclusive: fast frame rates, low noise and a large field of view. “These three features are basically the perfect combination that you would want to use in a light microscope,” says Keller.

sCMOS image showing microtubules in a fixed cell. Scale bar, 10 μm. Credit: A. York, S. Parekh and C. Combs, Shroff lab

Andor Technology, Fairchild Imaging (now owned by BAE Systems), Hamamatsu Photonics and PCO released their cameras late last year, and other companies are getting into the market. Hamamatsu, Photonis and QImaging, for example, have all announced new product launches for the last quarter of this year. Parameters vary (Table 1), so researchers will need to do their homework before getting a new camera. Also, performance in a particular laboratory may not match reported values, says Keller. “The specifications that you find in the white papers are not always representative of what you get out of the box.” Getting these new cameras to function properly takes some work, says Betzig. “I wouldn't recommend them to anyone until the bugs—bad pixels, poor bit depth and practical limits to high-speed operation—are worked out.”

Table 1 sCMOS sensors from selected manufacturers (as of November 2011)

Scientists should also remember that not all microscopy applications will benefit from the new technology. sCMOS sensors are not generally considered sensitive enough for extremely low-light applications or for single-molecule tracking, which currently requires electron-multiplying charge-coupled devices (EMCCDs), a kind of CCD designed for higher sensitivity. For exposures of a second or longer, the benefits of sCMOS sensors are minimal; another, less expensive type of camera, the cooled interline CCD, produces less noise over very long exposures, such as those used in bioluminescence microscopy. Nor are sCMOS sensors applicable to point-scanning techniques such as two-photon confocal microscopy and stimulated emission depletion super-resolution microscopy, which rely on point detectors rather than cameras.

That still leaves many applications for which sCMOS cameras can be useful. These include total internal reflection microscopy, spinning-disk confocal microscopy, light-sheet microscopy, structured illumination microscopy and certain forms of super-resolution microscopy that rely on wide-field detection. In short, sCMOS sensors have the potential to expand the information scientists can get from their microscopes, says Colin Coates, marketing manager for Andor Technology, which makes many types of microscopy cameras. “It's the start of a snowball rolling.”

How sCMOS sensors work

At the most basic level, both CCD and CMOS cameras work the same way: photons strike pixels arrayed on a sensor and are converted into electrons that carry the signal. In CCD sensors, each pixel is read out through a common port, limiting the speed at which signal can be collected. In CMOS sensors, signals are read out by column or by half-column, an arrangement that makes for faster transfer. Until recently, however, CMOS sensors were considered too noisy for many types of microscopy: they registered too many photons that did not exist, and they remain too insensitive for applications that require detecting single photons.
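
The speed difference follows from the readout geometry. The sketch below is a rough, purely illustrative comparison; the clock rates are assumptions chosen to show the scaling, not any vendor's specifications.

```python
# Illustrative back-of-the-envelope comparison of serial (CCD-style) versus
# column-parallel (CMOS-style) readout. All clock rates here are assumptions.

rows, cols = 2160, 2560            # ~5.5-megapixel frame, as cited in the article

# CCD-style: every pixel passes through one output port in series.
ccd_pixel_rate = 20e6              # assumed 20 MHz single-port pixel clock
ccd_frame_time = rows * cols / ccd_pixel_rate

# CMOS-style: each column (or half-column) has its own converter,
# so an entire row is digitized in one row time.
cmos_row_time = 10e-6              # assumed 10 microseconds per row
cmos_frame_time = rows * cmos_row_time

print(f"serial readout:          {ccd_frame_time * 1e3:6.1f} ms per frame")
print(f"column-parallel readout: {cmos_frame_time * 1e3:6.1f} ms per frame")
```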

Vesicles (bright spots) fusing with cell membrane (purple). Fast cameras and a wide field of view are important for studying fusion. Credit: D. Toomre lab

In 2007, Andor, Fairchild and PCO began a collaboration to create a CMOS sensor with very low noise, implementing some innovations in sensor architecture that had been developed at Fairchild. Workers at Andor and PCO contributed their own ideas about how the sensor should operate, as well as market savvy about what kind of sensor chip could be used in the largest variety of cameras. (In addition to microscopy, sCMOS technology is being used for DNA sequencing, machine vision, solar-cell quality control, velocity measurement, surveillance and other applications.) Together, these three companies designed a sensor chip, which each company now installs in its own cameras. The sensors, which are about the size of a thumbnail, can collect 5.5-million-pixel frames as fast as 100 frames per second, with read noise as low as about 1.4 electrons per pixel per exposure. This is a triumph of engineering, says Gerhard Holst of PCO. “I would have said five years ago that this is not possible with a CMOS sensor.”

Light-sheet microscopy recording of a fixed Drosophila melanogaster embryo: the specimen was stained for chromatin (cyan) and for the transcription factor Dorsal (magenta). Specimen provided by members of S. Shvartsman lab. Credit: P. Keller lab

Getting the noise down was a matter of optimization along the entire readout path, says Boyd Fowler at Fairchild Imaging. “We just did careful engineering at every step.” But the chips also use a clever readout scheme that minimizes noise while still allowing a wide range of detection. The dynamic range, or the ratio between the largest and smallest measurable values, is 25,000:1 at 30 frames per second. What makes such a large dynamic range possible is the fact that each pixel can be read out through both a 'low-gain path' and a 'high-gain path'. The former can be used to detect very large numbers of photons, albeit with higher noise. The latter can detect a lower maximum number of photons, but with high sensitivity, explains Fowler. “If you add these together in an interesting way, you can get a high dynamic range with very low noise. That's the trick we did.”
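
One way to picture the combination is to keep the quiet high-gain reading wherever it is valid and fall back to the rescaled low-gain reading for bright pixels. The following is a minimal sketch only; the gain ratio, threshold and function names are illustrative assumptions, not the actual on-chip algorithm.

```python
import numpy as np

GAIN_RATIO = 16                # assumed: low-gain path covers 16x the signal range
HIGH_GAIN_SATURATION = 4000    # assumed digital level at which the high-gain path clips

def combine_dual_gain(high_adu: np.ndarray, low_adu: np.ndarray) -> np.ndarray:
    """Use the low-noise high-gain reading where it is valid; fall back to the
    rescaled low-gain reading for bright pixels."""
    use_low = high_adu >= HIGH_GAIN_SATURATION
    combined = high_adu.astype(float)
    combined[use_low] = low_adu[use_low] * GAIN_RATIO
    return combined

# Example: a dim pixel keeps its high-gain value; a bright one uses low gain.
high = np.array([120, 4095])
low = np.array([8, 250])
print(combine_dual_gain(high, low))   # -> [ 120. 4000.]
```

However the real merge is implemented, the principle is the same: a sensitive, limited-range measurement and a coarse, wide-range measurement of the same pixel are fused into one value.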

A single vesicle from the opening of the fusion pore to full fusion. Scale bar, 1 μm. Credit: D. Toomre lab

Although the sCMOS sensor from Andor, Fairchild and PCO is perhaps the most sophisticated, other manufacturers boast higher sensitivity or faster frame rates, or argue that simpler circuitry and lower price points offer considerable advantages. Meanwhile, investigators are still getting familiar with the cameras, and microscope and high-content imaging manufacturers are learning how to adapt their instruments and software to these new sensors. “It's not an easy technology to tame,” says Coates. “You'll go through a year or two of a lot of demonstrations, and then you'll build up a name for yourself as the technology becomes more known,” he says, describing how companies like his plan to promote the new cameras. Eventually, he says, the demonstrations will give way to repeat sales plus sales based on recommendations from colleagues and microscope manufacturers, and sCMOS cameras will become standard pieces of equipment.

Growing pains

Installing an sCMOS camera requires several adjustments. Indeed, many researchers are not using all the pixels or the fastest possible speeds because doing so would interfere with other aspects of their experiments. “Before, the limit was camera speed,” says Toomre. “Now the problem is: how do I handle the data, how do I get automation and how do I get enough photons?”

The first part of handling the data is dealing with different types of noise in sCMOS sensors. Signal is read through many analog-to-digital converters, and each adds its own source of variability. If a researcher shines a constant white light onto the sensor, differences between the columns mean that the uncorrected image will show stripes. Each pixel also has an inherent amount of noise. To some extent, stripes, noise variations and bad pixels can be compensated for using software, but corrections can be complicated. Separate adjustments are required for different intensities of light, for example. Early adopters at Janelia Farm wrote their own error-correction algorithms; eventually, though, the cameras should be able to make such corrections automatically. Manufacturers have already begun to introduce error-correcting algorithms with the cameras, such as the option to disregard data from insensitive pixels and substitute data from adjacent pixels. However, warns Betzig, this comes at the cost of lower spatial resolution and the introduction of artifacts in the image.
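
To make those correction steps concrete, here is a minimal sketch assuming calibration stacks of dark frames and uniformly illuminated ('flat') frames are available; the function names and thresholds are illustrative, not any manufacturer's algorithm.

```python
import numpy as np
from scipy.ndimage import median_filter

def build_calibration(dark_stack, flat_stack, noisy_sigma=6.0):
    offset = dark_stack.mean(axis=0)           # per-pixel offset, including column stripes
    gain = flat_stack.mean(axis=0) - offset    # per-pixel response to uniform light
    gain /= gain.mean()                        # normalize to a relative gain map
    noise = dark_stack.std(axis=0)             # per-pixel noise estimate
    bad = noise > noise.mean() + noisy_sigma * noise.std()   # flag unusually noisy pixels
    return offset, gain, bad

def correct_frame(raw, offset, gain, bad):
    img = (raw - offset) / gain                # remove stripes and gain variation
    # Replace flagged pixels with a local median; as the article warns, such
    # substitution trades spatial resolution for fewer visible artifacts.
    img[bad] = median_filter(img, size=3)[bad]
    return img
```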

Photon management is another problem. Cameras that operate ten times faster require ten times more light, light that damages cells and developing embryos, explains Shroff. “The question is, 'can your biological sample take that'? If you have a regime where you're almost killing the thing you're looking at in a high-speed EMCCD [sensor], going to an sCMOS [sensor] is not going to buy you much more speed.” An sCMOS camera lets Shroff image two or three times faster than an EMCCD camera does, but the limit is phototoxicity, not camera speed.

This problem is compounded by the fact that sCMOS cameras are less sensitive to photons than are EMCCDs. The most sensitive EMCCD cameras detect roughly 9 of every 10 photons that hit a pixel, a parameter known as quantum efficiency. In contrast, reported quantum efficiency values for sCMOS cameras range from just over 50% to 70%. But quantum efficiency must be considered alongside other factors, says Mark Hobson of Hamamatsu, who believes that sCMOS cameras are likely to outcompete EMCCDs, particularly as the performance of sCMOS sensors improves. “If you have high quantum efficiency and low noise, you can make an image with useful data with less input light.” In addition, he says, researchers should look at the quantum efficiency for the colors of light they use in imaging. Quantum efficiency is traditionally reported at the wavelengths where a sensor performs best, usually in the green part of the spectrum; many researchers, however, prefer to use less-damaging red wavelengths.
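
A rough way to see how quantum efficiency and read noise trade off is a simple per-pixel signal-to-noise model that counts only shot noise and read noise; dark current and the excess noise of electron multiplication are ignored here, and the numbers plugged in are the representative figures quoted above, not measurements of any particular camera.

```python
import math

def snr(photons, qe, read_noise_e):
    # Shot-noise-plus-read-noise model: detected signal over its noise.
    signal = qe * photons
    return signal / math.sqrt(signal + read_noise_e ** 2)

for photons in (10, 100, 1000):
    emccd = snr(photons, qe=0.90, read_noise_e=0.0)   # read noise treated as negligible
    scmos = snr(photons, qe=0.60, read_noise_e=1.4)
    print(f"{photons:5d} photons   EMCCD SNR {emccd:5.1f}   sCMOS SNR {scmos:5.1f}")
```

In such a model the EMCCD-like detector wins most clearly at very low photon counts, and the gap narrows as more photons arrive, which is consistent with the advice that the dimmest applications still favor EMCCDs.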

Several companies now sell sCMOS cameras. These include products from (clockwise from top left) QImaging, Hamamatsu, PHOTONIS, Andor, PCO and Fairchild.

Another complication is data collection. Theoretical acquisition rates are on the order of 1,000 megabytes per second. Although this has not yet been achieved, the actual data stream from most sCMOS sensors is a challenge to handle, and companies have developed different solutions. Andor allows 4 gigabytes of data to be collected on the camera, about 4 or 5 seconds at 100 full-field frames per second, and uncompressed data can then be transferred to computer storage at a slower rate. For certain cameras that use compressed data, about 100 frames per second can be transferred to a computer continuously via a specialized image-acquisition card with a 'Camera Link' interface, also called a framegrabber. Cameras from PCO and Hamamatsu, among others, compress data and supply two ports that can be connected simultaneously to Camera Link interfaces. PHOTONIS, with a camera that boasts a thousand frames a second, is looking into newer interfaces such as CoaXPress; data can also be stored on the PHOTONIS camera.
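
The back-of-the-envelope arithmetic behind these figures is straightforward, assuming 16-bit (2-byte) pixels; tighter bit packing lowers the rate and stretches the on-camera buffer time, which is one reason quoted numbers vary between vendors.

```python
# Rough data-rate arithmetic for a 5.5-megapixel sensor at 100 frames per second.
pixels_per_frame = 5.5e6
bytes_per_pixel = 2          # assumed 16-bit samples; packed 12-bit data would be smaller
fps = 100

rate = pixels_per_frame * bytes_per_pixel * fps          # bytes per second
print(f"raw stream:       {rate / 1e9:.2f} GB/s")        # ~1.1 GB/s, the order quoted above

buffer_bytes = 4e9                                       # 4-gigabyte on-camera buffer
print(f"buffer duration:  {buffer_bytes / rate:.1f} s")  # a few seconds at full speed

print(f"one day, nonstop: {rate * 86400 / 1e12:.0f} TB") # on the order of 100 TB per day
```

Run continuously, such a stream lands in the tens of terabytes per day, which is the scale of storage discussed below.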

Investigators are making accommodations for this amount of data. Shroff, for example, spent about $5,000 on a custom-ordered redundant array of independent disks (RAID) system, which can write to multiple disks at once. Janelia Farm's Keller built customized workstations to handle the 10–100 terabytes of data that could be collected in a day of continuous imaging. Still, scientists should not plan on doing their first experiments the same day they unpack their new computers: getting all the necessary drivers and software installed and running is time-consuming and frustrating, say the researchers. Other electronic components, such as those controlling lasers, shutters and scanners, also have to be synchronized with the camera, says Keller, and limitations in the software control framework of many first-generation sCMOS cameras mean that small deviations from standard workflows can cause the entire system to crash during an experiment.

Different settings of the camera present different glitches. For example, the sCMOS imaging sensor from Andor, Fairchild and PCO operates in two modes: 'global shutter', a snapshot-like setting that records the readings from all pixels at a given instant, and 'rolling shutter', which reads out pixels in consecutive rows. The latter requires 10 milliseconds for the exposure to sweep from the top and bottom of the sensor to its center, so objects that move appreciably within those 10 milliseconds appear distorted or blurred. Global shutter can be used to record the exact timing of events across the field of view, but this mode halves the maximum frame rate and increases noise, and investigators say that its operation has more bugs than rolling-shutter mode. Though researchers are instinctively more comfortable with the idea of global shutter, the better-performing rolling-shutter mode will be suitable for all but the most specialized applications, says PCO's Holst. After all, he says, most researchers already contend with a similar artifact in their current cameras. “If they didn't have any issue with the EMCCD, they won't have a problem [with rolling shutter],” he says.
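
The distortion itself is easy to model. In this toy sketch the roughly 10-millisecond sweep is taken from the text, and the object speed is an arbitrary illustrative value.

```python
import numpy as np

rows = 2160
readout_span_s = 10e-3                                 # time between first and last row read
row_delay = np.linspace(0.0, readout_span_s, rows)     # when each row samples the scene

object_speed_px_per_s = 20000.0                        # assumed: object crosses ~200 px in 10 ms
apparent_shift = object_speed_px_per_s * row_delay     # horizontal offset at each row's read time

# A global shutter would sample every row at the same instant, so this shear
# would be zero; with a rolling shutter the object appears sheared by:
print(f"shear between first and last row: {apparent_shift[-1]:.0f} pixels")
```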

Once data are safely stored on the hard drive, the problem shifts to analysis. For example, says Holst, the high dynamic range sometimes leads researchers to think their cameras are performing poorly. The problem is that many display systems represent a dynamic range of only about 250:1, roughly what the human eye can handle. Researchers need to set their imaging software to nonlinear scaling to see the gradations. “You have to get used to the potential,” Holst says.
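
As a minimal sketch, assuming 16-bit image data and a function that is illustrative rather than part of any vendor's software, a logarithmic mapping compresses the camera's range into something an 8-bit display can show without losing the dim features.

```python
import numpy as np

def to_display(img16: np.ndarray) -> np.ndarray:
    # Shift so the smallest value maps to 1 and the log is defined everywhere.
    img = img16.astype(float) - img16.min() + 1.0
    # Compress the dynamic range nonlinearly onto 0..1 (a gamma curve works too).
    compressed = np.log(img) / np.log(img.max())
    # Return an 8-bit image suitable for a standard monitor.
    return (255 * compressed).astype(np.uint8)
```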

Researchers with experience using early camera models say that effective dynamic range can be considerably reduced by noise and variability, but they agree that the potential is impressive. One example is imaging multiple cells with widely varying levels of fluorescence: in toxicity studies that use cellular staining, dying cells may show a tenfold increase in fluorescent staining. A higher dynamic range means researchers can get more information from each image, even without long exposures, says Pavel Fomitchov at GE Healthcare, which recently incorporated an sCMOS camera into its high-content confocal imaging system, the IN Cell Analyzer 6000. “It allows researchers to image very dim and very bright cells in one shot.”

At least one company is betting that researchers will opt for simpler solutions that supply less data. QImaging launched its Rolera Bolt in October with an sCMOS imaging sensor manufactured by Sony. Rather than maximizing frame rates or numbers of pixels, the company prioritized making low-cost cameras that are easy to install and operate, says Deepak Sharma, marketing director for the company. Pixels are not read out through both high-gain and low-gain paths, so each column has one analog-to-digital converter instead of two. This reduces dynamic range, admits Sharma, but it also reduces signal variability; combined with other design features, it lowers the column-to-column variation often observed in sCMOS images. “You never see the difference [between columns] because variation is within the read noise and thus not apparent,” says Sharma. These cameras also collect a fraction of the data of other sCMOS sensors, a maximum of 30 frames per second at 1.3 million pixels. That is still several times faster than commonly used CCD cameras, yet produces a data stream small enough to be transferred to a computer over a straightforward USB 2.0 connection.

Each camera also has distinct qualities beyond the sensor and software. Andor's cameras can be cooled to −40 °C, which reduces 'dark noise', a spurious current that can be misinterpreted as photons and that can considerably reduce signal-to-noise ratios at exposures longer than about 50 milliseconds. In contrast, cameras without the extreme cooling components are not just cheaper but smaller, which can allow multiple cameras to be mounted on the same microscope to collect data quickly from different parts of a sample.

The best way to pick a camera is for scientists to test it, preferably in their own labs with their own setups. This can help researchers appreciate how much of a problem noise might be and understand how well a camera synchronizes with lasers and other parts of the microscope setup.

Excitement around sCMOS cameras is generating its own hurdles, however. Scheduling a demonstration model can be difficult; camera manufacturers say they are having trouble keeping up with demand, and even investigators with close ties to industry report having to wait weeks. Moreover, with all the camera manufacturers racing to improve and supply instruments, researchers' greatest challenge may just be keeping track of their options. “This whole thing is in flux, and it's not like you can say this is the way it is now and that's it,” says Toomre.