Emerging standards in microscopy are being set up to address many pain points in the field. Credit: TEK Image/Science Photo Library

With a view to transparency and reproducibility in microscopy, scientists are hammering out standards to address, for instance, the surprises of fluctuating illumination power, the jungle of file formats, the mysteries of missing metadata and the diversity of camera outputs. A second story in this issue of Nature Methods, focused on camera standards, can be found here.

“We need standards,” says Roland Nitschke of the University of Freiburg. Developing standards in imaging is a noble deed that can make some eyes glaze over even beyond the glaze arising from long hours at the microscope. Those who feel they lack the time to pitch in on standards might be glad to hear that some not-so-distant developments stand to help microscopy users pull out their hair a bit less. Here’s a peek at how some emerging standards could address real-world pain points.

Standards development is not a task for one individual lab or institution, but needs to be a collaborative effort, says Caterina Strambio-de-Castillia, a researcher at the University of Massachusetts Medical School’s Program in Molecular Medicine and a Chan Zuckerberg Imaging Scientist. Just as biology itself “cannot be cracked by single labs,” standards development requires “bringing people together,” she says. Ultimately, those working on these efforts have to “speak with one voice.”

One spot where many voices combine on the path toward that one voice is in the meetings of the Quality Assessment and Reproducibility for Instruments & Images in Light Microscopy (QUAREP-LiMi) group1. “QUAREP is amazing, I never would have thought it would take only two years to grow it to its current 420 members,” says Nitschke, who co-founded the group. Members hail from 37 countries, from academia and from companies, and people ask to join on a daily basis. Most QUAREP members are in Europe. Around 25% are from industry: Europe is home to microscope manufacturers Leica and Zeiss and to many smaller companies that make imaging instrument components and software, especially in the UK, France and Germany. Nitschke is also involved with standards development in groups such as the International Organization for Standardization (ISO), which involves more US and Japanese companies than QUAREP. The ISO is an international non-governmental organization founded in 1947 with over 160 national standards bodies as members.

Companies want to be at the virtual table where standards are worked out. “Such efforts are of mutual interest,” says Jürgen Reymann, who develops and manages data management and data analysis software in the life sciences at Leica Microsystems. He and his colleagues are part of QUAREP-LiMi and other organizations to stay in touch with customers, comment on technical questions and, internally, prepare new solutions that the research community discusses. Kees van der Oord, Europe support specialist at Nikon Instruments Europe BV, says that QUAREP-LiMi addresses the challenges that researchers face to repeat experiments reproducibly “at all levels of the hierarchy of the scientific world.” That can mean anything from working on instructions for calibrating instruments all the way to considering the best ways of including such information in scientific papers. “Even when only a part of these good practices are implemented, it will already have a positive effect on the progress in science,” says van der Oord.

With standards, many focus on reproducibility, which is important, says University of Dundee researcher Jason Swedlow. “I think the most important thing is ease of data access.” Discovery depends on using data, often for reasons beyond the original intention—AlphaFold is one such example, he says. “Having data in standardized formats accelerates research and makes what was impossible routine.”

Standards are alive

ISO 21073:2019 is the first ISO norm for confocal microscopes. The specifications define the imaging performance of confocal single-point scanners using single-photon excitation. From start to finish, the microscope manufacturers Leica, Nikon, Olympus and Zeiss—sometimes nicknamed the ‘Big Four’—took part in developing this ISO norm. So did academics like himself, says Nitschke. The ISO activities sparked QUAREP-LiMi. The ISO norm is set up as a “living document,” he says, to change as technology changes. It defines limits, “but it does not really define the method, how you have to measure it.” An ISO norm sets the scene for other standards.

If you crave that new-car smell in your vehicle, air fresheners that exude new-car aroma can keep it wafting for years. Over time, a car’s technical specifications remain unchanged, but it may no longer race up the hill as it once did. What has changed, says Nitschke, is its performance. Spend a healthy six-digit sum on a microscope and the same is true. With or without an air freshener that gives off new-microscope smell, the specs stay constant over time. But laser performance, for example, dips. He sees this with the many microscopes in his care at the University of Freiburg’s imaging core facility, the Life Imaging Center, and with instruments in the Microscopy and Image Analysis Platform (MIAP) he set up with colleagues. MIAP is an international network of imaging core facilities across institutions in Freiburg, Germany; Basel, Switzerland; and Strasbourg and Mulhouse, France. Since 2016, they have swapped experience and run courses, which are easier to organize across multiple imaging core facilities. But Nitschke was chagrined to hear recently that MIAP funding will be discontinued. Training can help researchers learn about the changing functionality of microscopes and assess the instruments better.

A new confocal microscope might tell a user the instrument is bathing a sample at 10% illumination output, says Nitschke. That’s a percentage of the instrument’s maximum output. Some years later, when the laser has aged, he says, the same instrument “still tells you 10%,” but that’s 10% of what? Gas lasers, for example, last around three or four years. Laser power in solid-state or diode lasers, too, will generally decrease over time. Power can fluctuate. But microscope users cannot readily keep tabs on a microscope laser’s performance. Over time, the laser’s performance may have been dropping imperceptibly, he says, and “suddenly the power is completely gone.”
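
To see why a relative setpoint is ambiguous, consider a minimal sketch (the function and the numbers are hypothetical, not taken from any manufacturer’s software): converting a percentage into absolute power requires an absolute calibration measurement, and that anchor drifts as the laser ages.

```python
# Hypothetical sketch: a "10%" setting is only meaningful relative to a
# measured maximum output, and that maximum drifts as the laser ages.

def absolute_power_mw(setpoint_percent: float, measured_max_mw: float) -> float:
    """Convert a relative illumination setpoint into absolute power,
    given the laser's maximum output as measured with a power meter."""
    return setpoint_percent / 100.0 * measured_max_mw

# Calibration at installation: the 488 nm line delivers 50 mW at 100%.
print(absolute_power_mw(10, measured_max_mw=50.0))  # 5.0 mW
# The same instrument years later: the aged laser now peaks at 20 mW.
print(absolute_power_mw(10, measured_max_mw=20.0))  # 2.0 mW from the same "10%"
```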

Illumination encompasses aspects such as wavelength, intensity and the power deposited in a sample over time. Illumination is the major source of damage to biological samples during imaging, which is why users want accurate illumination power data. But there are around 10 or 20 ways to measure power, he says. Measurements will vary depending on how close the power meter is placed to the light source and whether the measurement is made with an objective on or off the microscope. Each measuring mode can give a different readout. At a core facility like Nitschke’s, with 19 microscopes, there are not enough personnel to check each instrument for multiple hours a week. And for such a task, a service visit from a microscope manufacturer would be unaffordable.

Earlier this spring, QUAREP-LiMi shared a public protocol for measuring illumination stability and linearity using calibrated external power sensors. Nikon’s van der Oord is working on a macro for NIS-Elements, Nikon’s microscope software package, as a way to measure laser power on a Nikon confocal microscope with a Thorlabs power meter. The method, which could be adapted for other microscopes, is still in the works. Standardizing the way illumination power is measured is a step toward enabling one scientist to reproduce the work of a colleague using a different imaging system. Publications and other communications should include all relevant details needed to reproduce the experiment, says van der Oord. Imaging standards are vital in this process and give “an unambiguous way” of exchanging information.
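
As a rough illustration of what such a protocol quantifies, the sketch below computes two quality-control figures from a series of power-meter readings: a peak-to-peak stability percentage and a least-squares check of linearity against the setpoint. The formulas are generic conventions assumed here for illustration; the QUAREP-LiMi protocol defines its own procedures and acceptance criteria.

```python
# Sketch of two illumination quality-control metrics computed from external
# power-meter readings; the metrics are common conventions, assumed here.
import statistics

def stability_percent(readings_mw: list[float]) -> float:
    """Peak-to-peak fluctuation relative to the mean, in percent."""
    return (max(readings_mw) - min(readings_mw)) / statistics.mean(readings_mw) * 100

def linearity(setpoints_pct: list[float], readings_mw: list[float]) -> tuple[float, float]:
    """Least-squares slope (mW per %) and Pearson r of power vs. setpoint."""
    mx, my = statistics.mean(setpoints_pct), statistics.mean(readings_mw)
    sxy = sum((x - mx) * (y - my) for x, y in zip(setpoints_pct, readings_mw))
    sxx = sum((x - mx) ** 2 for x in setpoints_pct)
    syy = sum((y - my) ** 2 for y in readings_mw)
    return sxy / sxx, sxy / (sxx * syy) ** 0.5

# Invented example: power measured at the objective for four setpoints.
slope, r = linearity([10, 25, 50, 100], [0.9, 2.4, 5.1, 9.8])
print(f"slope {slope:.3f} mW/%, r = {r:.4f}")
print(f"stability over 30 min: {stability_percent([5.02, 4.97, 5.01, 4.95]):.1f}%")
```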

Perhaps, says Nitschke, “clever tools” on a microscope could apply an illumination standard automatically, take self-diagnostic standardized measurements, write them to a database and let a user know when it’s time to call a ‘Big Four’ maintenance engineer. An instrument, he says, might even self-regulate and turn its own knobs to keep laser power constant over time. The user would get consistent power, and a maintenance engineer would have a history of an individual instrument’s functionality in terms of illumination power. But, he says, there are at least 50 optical elements between the laser box and the microscope through which one can alter power. Measurements need to be taken at the point nearest to the sample. Standards development can enable useful technology, he says, but the process needs to address implementation details, too.
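
A hedged sketch of what such a self-diagnostic log might look like, with an invented schema: each automated measurement is written to a local database, building the per-instrument history a maintenance engineer could consult.

```python
# Hypothetical sketch of a self-diagnostic illumination log; the schema and
# values are invented for illustration.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("instrument_log.db")
conn.execute("""CREATE TABLE IF NOT EXISTS power_log (
    measured_at TEXT, laser_nm INTEGER, setpoint_pct REAL, power_mw REAL)""")

def log_measurement(laser_nm: int, setpoint_pct: float, power_mw: float) -> None:
    """Record one automated power measurement with a UTC timestamp."""
    conn.execute("INSERT INTO power_log VALUES (?, ?, ?, ?)",
                 (datetime.now(timezone.utc).isoformat(),
                  laser_nm, setpoint_pct, power_mw))
    conn.commit()

log_measurement(488, 10.0, 4.9)

# The history can then be queried to flag drift before the power is
# "completely gone", e.g. by comparing recent readings with older ones.
rows = conn.execute(
    "SELECT measured_at, power_mw FROM power_log WHERE laser_nm = 488"
).fetchall()
```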

Standards development takes place “by bringing people together,” says Caterina Strambio-de-Castillia. Credit: UMass Chan Medical School

File formats for sharing

As Strambio-de-Castillia wrapped up her time in Günter Blobel’s lab at Rockefeller University and then as a fellow in Michael Rout’s Rockefeller lab, she used confocal microscopes, digital cameras and automated image acquisition, tools that have since become commonplace, to study the eukaryotic cell’s nuclear pore complex. “You could produce tons of images but what do you do with that?” Image analysis was difficult.

Some scientists chose commercial software but, she says, the packages tend to be “black boxes” and don’t let researchers poke, prod or tweak the algorithms. Scientists began building their own tools, for example with Matlab and other approaches. “Everybody was trying to do their own thing,” she says.

The Java-based ImageJ analysis tool emerged, and the Fiji package followed. A lab-mate in the Rout lab developed a sophisticated image analysis tool. When he left, other team members were less adept at using it. Later, when she moved to another lab, she couldn’t open the files she had generated on Rockefeller’s microscopes. “I said,” she recalls, “‘This is not the way to go; we cannot reproduce anything that has been done’.”

Strambio-de-Castillia and others want to use microscopy to explore ever more aspects of cells and cellular substructure and make it easier to capture all steps in an imaging experiment reproducibly and transparently, she says. Scientists want to convince themselves that their findings are accurate, and share results and how they were obtained with others. “You should be able to see the same phenomenon, regardless of microscope,” she says. To do so, however, takes standards. In QUAREP-LiMi, she spearheads working groups in which academics and companies discuss how to standardize microscopy aspects such as image acquisition details, file formats, the many instrument settings, and metadata, the information modern microscopes capture about how an experiment was performed.

When she discovered the Open Microscopy Environment (OME), Strambio-de-Castillia was delighted to find a community-driven initiative for imaging standards. OME develops tools to help with managing microscopy data. It involves academic labs and some companies. Swedlow, who co-founded the organization, describes it as a “committed, dedicated community of scientists and technology developers trying to define and build tools that make bioimaging do real work to accelerate and deliver discovery.” Swedlow and colleagues have suggested guidelines for open image data and tools2.

OMERO is OME’s repository for images. Swedlow recalls when he and colleagues benefitted from OMERO. “I was trying to get a paper out, with revisions for the paper coming back after the postdoc had left,” he recalls. A reviewer queried the make-up of one of the figures. The fact that all the data were in OMERO and the figure had been made with OMERO.figure made it easy to check the data and show all had been done correctly, he says. “Without that, I wouldn’t have known how or where to find the original data that made up the figure.”
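
Programmatic access is part of what makes such provenance checks possible. As a minimal sketch (server address, credentials and the image ID are placeholders), the omero-py BlitzGateway can fetch an image and its pixel data from an OMERO server:

```python
# Minimal sketch of retrieving an image from an OMERO server with the
# omero-py BlitzGateway. Host, credentials and image ID are hypothetical.
from omero.gateway import BlitzGateway

conn = BlitzGateway("username", "password", host="omero.example.org", port=4064)
conn.connect()
try:
    image = conn.getObject("Image", 12345)          # fetch by OMERO image ID
    print(image.getName(), image.getSizeX(), image.getSizeY())
    plane = image.getPrimaryPixels().getPlane(0, 0, 0)  # first z, channel, timepoint
finally:
    conn.close()
```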

After David Grunwald, a colleague at UMass Medical School, asked Strambio-de-Castillia to join the 4D Nucleome Consortium, she, along with others, has led work to extend the OME data model to meet the needs of the 4D Nucleome Initiative Imaging Standards Working Group, BioImaging North America (BINA) and QUAREP-LiMi.

The 4DN-BINA-OME specifications3, now in version 2 and soon to reach version 3, have been advanced to capture hardware specifications, image acquisition settings and quality-control metrics. She dreams of a searchable imaging “data commons” that also links data from within documents. Building such opportunities requires bringing everyone into standards discussions, she says.

A Next-Generation File Format, OME-NGFF, is in the works. What this isn’t, says Swedlow, is “The One File Format to Rule Them All.” What OME-NGFF enables, he says, is, for example, cloud-based data sharing, such as for public data repositories and collaborative data resources. It’s a format that “allows data streaming,” he says, as “data access à la Netflix.”
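
A brief sketch of that streaming pattern, assuming the zarr package with fsspec for remote access (the URL is a hypothetical placeholder): because an OME-NGFF image is stored as a chunked array, a viewer can slice a region remotely and transfer only the chunks it needs.

```python
# Sketch of the streaming access OME-NGFF enables: slicing a remote chunked
# Zarr array fetches only the chunks a view needs, not the whole file.
# The URL is a hypothetical placeholder; assumes zarr plus fsspec.
import zarr

# Open resolution level 0 of a hypothetical public OME-NGFF image.
arr = zarr.open("https://example.org/data/image.ome.zarr/0", mode="r")
print(arr.shape, arr.chunks)        # e.g. (t, c, z, y, x) and chunk sizes

# Only the chunks overlapping this 256x256 crop are transferred.
crop = arr[0, 0, 0, :256, :256]
```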

Back in 2005, says Strambio-de-Castillia, OME developed a model, a way to organize key-value pairs that capture pieces of information and the relationships between them. For example, a key may be ‘numerical aperture’ and a value might be ‘1.4’. Bio-Formats, which OME developed, is a software library with which labs can read and write image data in an open, standardized format. Some microscope manufacturers implemented Bio-Formats and, says Nitschke, all was well for those users.
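
Those key-value pairs travel with the image. As a hedged sketch assuming the Python tifffile package (the file path is a placeholder), the OME-XML embedded in an OME-TIFF can be read back and queried; in the OME model, the objective’s numerical aperture is stored as the Objective element’s LensNA attribute:

```python
# Sketch of reading embedded OME-XML metadata from an OME-TIFF and pulling
# out one key-value pair. Assumes the tifffile package; path is a placeholder.
import xml.etree.ElementTree as ET
import tifffile

with tifffile.TiffFile("experiment.ome.tif") as tif:
    ome_xml = tif.ome_metadata          # the embedded OME-XML document

root = ET.fromstring(ome_xml)
for elem in root.iter():
    if elem.tag.endswith("Objective"):  # ignore the schema namespace prefix
        print("numerical aperture:", elem.get("LensNA"))  # e.g. '1.4'
```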

Bio-Formats changes, however, says Nitschke, and so do proprietary formats of microscope manufacturers. Usually, systems are backward compatible and not forward compatible, so users of older software cannot open or write files in the new format. “You are always stuck with running after things,” he says. Microscopy has never had a universally readable file format. Radiology faces similar issues but the files, he says, have fewer parameters.

Microscopists also wrestle with the fact that different file formats might not save all the settings from an imaging experiment, says Nitschke. Image analysis can begin with one software package that lets a researcher read a file from a different microscope manufacturer. “Already at that step, it will not take all the information which is included in the original file,” he says. Next, data analysis with the second software package can again create a loss of information, and it becomes hard to keep track of image-processing steps. Bio-Formats promises a way to avoid this, “but they are always one or two steps behind,” says Nitschke. The types of experimental metadata that can go missing in this software journey, he says, include the numerical aperture used, the frame rate at which the image was captured, or the offset, the user-defined background level of a given image.
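
The loss is easy to picture as a set difference. A toy sketch with invented metadata dictionaries, one before and one after a format conversion:

```python
# Illustrative sketch of the information loss Nitschke describes: compare
# the metadata a file carried before and after passing through a converter,
# and report which keys were dropped. The dictionaries are invented examples.
original = {"NumericalAperture": 1.4, "FrameRate_Hz": 20.0,
            "Offset": 112, "PixelSize_um": 0.065}
after_conversion = {"NumericalAperture": 1.4, "PixelSize_um": 0.065}

lost = set(original) - set(after_conversion)
print("metadata lost in conversion:", sorted(lost))
# -> ['FrameRate_Hz', 'Offset']
```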

QUAREP-LiMi addresses the fact that researchers need to repeat experiments reproducibly “at all levels of the hierarchy of the scientific world,” says Nikon Europe’s Kees van der Oord. Credit: R. van Maanen

Standards have begun to clear some of the file format jungle, and companies are on board with this. Van der Oord says that he and his colleagues know scientists want to share images and metadata between different image analysis platforms. The Nikon software NIS-Elements lets users save all images in the OME-TIFF format. Keeping NIS-Elements compatible with OME-TIFF is a commitment his company maintains, he says, as “a continuous effort.”
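
NIS-Elements handles that export itself; as a vendor-neutral sketch of the same idea, assuming the Python tifffile package and invented values, an image stack can be written as OME-TIFF with its metadata embedded as OME-XML:

```python
# Hedged sketch: write a placeholder z-stack as OME-TIFF with tifffile,
# embedding acquisition metadata as OME-XML. All values are invented.
import numpy as np
import tifffile

stack = np.zeros((5, 512, 512), dtype=np.uint16)   # placeholder z-stack

tifffile.imwrite(
    "cells.ome.tif", stack, ome=True,
    metadata={"axes": "ZYX",
              "PhysicalSizeX": 0.065, "PhysicalSizeXUnit": "µm"},
)
```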

Find the metadata

In addition to approaches such as the OME Data Model and the Bio-Formats image file format conversion library, Strambio-de-Castillia and others seek community-mandated standards for imaging data and specifications for metadata. Having the metadata helps scientists who are capturing images or time-lapse videos to reap more quantitative information from microscopy experiments at ever lower light levels. But, says Strambio-de-Castillia, the ways to record metadata well have not kept pace with technical advances.

The ideal would be to not make scientists work harder on data collection and management or leave them on their own in doing so. Despite ever-more-sophisticated and complex instruments, she and her colleagues point out3, “practices to faithfully and reproducibly record quantitative image data and metadata have not kept up,” which has exacerbated the already existing challenges of quality control and reproducibility.

“This is true,” says Petra Haas, who leads a team at Leica Microsystems focused on applications in confocal microscopy. Especially within the constraints of imaging dynamic processes in vivo, state-of-the-art microscopes need to be designed to efficiently collect all available photons. The way to do this, she says, is by taking into account pixel dwell times and photon counts per pixel. “Having a photon counting detection scheme in place with a high dynamic range renders these types of experiments quantitative and reproducible,” she says.

Haas says that she and her colleagues at Leica follow “with great interest” discussions around requirements to ease access to calibration information and benchmarks. In some labs, scientists modify their instruments, and it can be hard to collect and document everything that these modifications change about data capture. This can matter more for papers or projects that highlight a method as opposed to ones focused on a biological finding. Modifications tend not to be in line with instrument warranties or maintenance contracts but are usually tolerated in microscopy. They are a “source of understanding in which directions future developments could go,” says Haas.

Organizations that work on standards are of mutual interest to Leica Microsystems, say Petra Haas and Jürgen Reymann. Credit: Leica Microsystems

Data, and metadata in particular, says her Leica colleague Reymann, are the backbone of meaningful analysis and interpretation in the life sciences. They matter for reproducible research either with the same modality or for comparisons between imaging modalities. To work on solutions related to data analysis and handling, including data formats, storage, data analysis and more, he and his colleagues reach out to data experts beyond the world of light microscopy, he says.

“From a technological point of view, metadata in the sense of hardware specifications can sometimes only be generated with an extremely high amount of effort,” says Reymann. But he knows these data are essential to scientists for mapping workflow and process and for analyzing image data. It’s why he likes community initiatives that address this topic from different technological perspectives, “which we are following with the greatest interest.”

One central question is “the cost-benefit aspect of the effort involved in generating high-precision hardware parameters compared to the biological variability of the sample or sample preparation,” says Reymann. For targeted analyses, metadata must therefore be viewed “holistically” and included in the workflow from a scientific project’s start. A microscope needs to be a fully integrated, embedded module within the life science research cycle. “Commercial providers want and need sustainable, supported standards to be able to build on them for the longer-term,” he says.

Over the last 20 years, Zeiss systems have increased the amount of metadata that imaging systems automatically capture and store, says Brüne Venus. Credit: Zeiss

Imaging standards are key to professional, reproducible imaging and documentation because they enable serious, unbiased data analysis, says Brüne Venus, a developmental biologist and senior product manager with Zeiss. His tasks include assessing researchers’ needs, translating those into technical features for internal development teams, involving external scientists in testing new equipment, and supporting users and company sales teams in equipping a lab. Zeiss follows standards needs in biomedical and also industrial applications. Imaging is used, for example, to analyze steel and other materials.

In the last 20 years, he says, Zeiss systems have increased the amount of metadata that they automatically capture and store with the image data. The ZEN software supports open file formats, and scientists can use it to import, open and analyze these files. “The support for more file formats and metadata will grow in the future, along with applicational demands and necessity,” he says.

Metadata have specific importance in some applications, says Venus. Each individual metadata parameter has to be evaluated in light of how it supports a result. But metadata cannot just keep increasing, he says, because users will have to determine which of the many parameters matter in their particular experimental context. One risks missing the forest for the trees.

Metadata let scientists control quality and reproduce imaging conditions, which are shaped by the specific system setup and sample conditions. This is where biomedical applications and industrial applications differ dramatically. With materials, there were long, intense discussions about the “right” way to image samples and document and analyze data. After agreement between manufacturers and component suppliers on this “right way,” says Venus, it became possible to minimize deviations and tolerances in measurements. Zeiss offers support for GxP, a family of ‘good practice’ manufacturing guidelines and regulations, for industrial imaging systems. Specific setups are GxP-qualified, and error tolerances in imaging results have to fall within a certain range. But biological samples in microscopy are a different story.

“We need standards,” says Roland Nitschke of the University of Freiburg, shown here with visiting microscopists-to-be. Credit: University of Freiburg, Life Imaging Center

“To achieve statistically valid results,” he says, and be able to reproduce those results in different locations and on different imaging systems, specific metadata “would most likely not do the job alone.” Also needed is information about environmental conditions, for example, and much else.

In fact, says Venus, when considering the metadata needed to reproduce results, a researcher would have to describe the influence that each metadata parameter has on the result. “From my own experience, I can comment that it is an open secret that the variability of the living sample has a much bigger influence on the result of an imaging experiment than imaging conditions themselves,” he says. This doesn’t, however, obviate the need for quality control of imaging systems, for example when a facility manager needs to know if a system is running within a manufacturer’s specification.

To support facilities such as Euro-Bioimaging, a cross-European research infrastructure in imaging, and other such projects large and small takes an understanding of the user community. The more heavily an imaging system is used, perhaps even 24/7, “the likelier it is that parts need to be exchanged because of wear,” says Venus. “Metadata, automatic component recognition, logging tools and remote service are options on different tiers to support the uptime of a system.”

Working together on quality control and metadata in QUAREP, for example, is, says Venus, “a valuable start to gather a common understanding for valuable information needed to reproduce scientific data.”

Undoubtedly, developing standards is a massive, time- and energy-consuming undertaking for many people. There have been naysayers, but “I’m still very optimistic,” says Nitschke. At times, he observes, scientists focus only on their side of issues, their ‘field of view’. He is glad that companies take an active part in the discussions and send multiple people to different QUAREP-LiMi working groups. He knows it costs the companies to participate in these efforts.

Discussions about standards have to stay open and interactive, he says. There’s always a back and forth, says Strambio-de-Castillia. Company scientists and engineers sometimes openly say, “That’s too far, we cannot follow you,” says Nitschke. Plenty of times, he acts as a go-between for companies and academics. When the group is weighing whether a certain aspect should become part of a standard, he mentions how much companies would likely have to invest to fulfill the requirement, and he poses food-for-thought questions to the group, such as “Should we really ask for that?”