In designing microscopy software to take advantage of better hardware, developers are facing challenges of accessibility, functionality and usability.
Last year Bridget Carragher, who runs the Automated Molecular Imaging Group at The Scripps Research Institute in La Jolla, California, USA, started gathering information on the latest electron microscopy software packages for structural biologists. At the time, she was writing an editorial for the Journal of Structural Biology and planned to accompany the paper with a web page providing a comprehensive list of tools.
But Carragher found the task of cataloging more than 40 software applications and maintaining a website over the long term somewhat overwhelming. It was clear that, with the steady development of microscopy hardware and increasing computing power, software packages would keep improving and more applications would be added.
“Then it occurred to me that if we had a wiki, then people who own the stuff can update it,” Carragher recalls. The entry on Wikipedia, created last year, has been a tremendous way to gather software information, Carragher says. “I think it really is an excellent way of centralizing the resources. It is very easy to make quite a comprehensive list that is fresh,” she adds.
And electron microscopy software packages are only a small portion of the available and rapidly evolving software for microscopy applications. In particular, software platforms for acquiring, analyzing and organizing fluorescence microscopy data have been developed substantially in recent years, resulting in a daunting combination of different functions, according to Claire Brown, director of a core imaging facility at the McGill University Life Sciences Complex in Montreal. “In general, I find that there are just so many options to sort through. It's hard for the average researcher to know what options they have to begin with,” she says.
Brown says many existing software programs have a steep learning curve for newcomers entering core facilities. But commercial and publicly funded developers alike are striving to make these packages more intuitive.
The scope of integration
With advances in computer and microscope hardware, three-dimensional (3D) live-cell imaging has become faster and easier in recent years, says David Spector, a cell biologist at Cold Spring Harbor Laboratory (New York, USA) who uses several different microscopy systems to visualize the spatial and temporal dynamics of RNA in cells. “My students used to collect a 3D dataset of one cell overnight. They would come in in the morning and have one dataset,” he says. “Now with these microscope systems, the stage movements are so precise that we can visit a dozen cells in a particular experiment and collect these 3D data stacks overnight. So that's spectacular.” As a result, Spector's lab has evaluated far more parameters in their experiments and sped up the research in general, he says.
Software must constantly catch up to hardware to take advantage of faster and higher-resolution imaging. Companies are striving to integrate different imaging steps into single packages to meet the needs of most users. But they must account for many hardware components, the range of possible functions and the design of the user interface.
Andor Technology, a company headquartered in Belfast, Northern Ireland, is focusing on faster image acquisition in its integrated packages. Its iQ software for live-cell confocal imaging is compatible with hardware from multiple third parties, including the new Nikon Eclipse Ti microscope with Perfect Focus and a total internal reflection fluorescence illuminator, according to Andor regional sales engineer Scott Phillips. One researcher Phillips worked with needed to capture images more frequently during a live-cell development experiment. Capturing a set of 3D images had been taking five minutes; with iQ, that interval dropped to less than a minute. “It totally changed their impression of what was going on in the cells,” Phillips recalls.
Olympus America, Inc., headquartered in Center Valley, Pennsylvania, USA, has also focused on faster image acquisition with its digital microscopy package, SlideBook 4.2, developed in partnership with Intelligent Imaging Innovations of Denver. According to Kimberly Wicklund, research imaging product manager at Olympus America, recent software development has focused on building integrated systems in which third-party hardware components run as efficiently as possible. “We try to address as many different hardware pieces as possible, and this can be challenging,” Wicklund adds.
SlideBook 4.2, which was released in February 2007, addresses a broad range of applications, from 3D deconvolution to ratiometric imaging. A particle-tracking interface allows users to alter features to track and to choose measurements of interest such as particle displacement or speed. And graphing can be done in real time, allowing users to change parameters of the experiment as it progresses. Wicklund says the next version of SlideBook will be released this year, taking advantage of the speed and memory handling of 64-bit processors and introducing automated colocalization tools along with a new scripting language.
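As a rough illustration of the kinds of measurements such a particle-tracking interface reports, the sketch below computes displacement and speed from a list of tracked coordinates. It is a generic Python example, not SlideBook's actual code; the function name, coordinate list and units are assumptions.

```python
import math

def track_metrics(positions, dt):
    """Compute displacement and speed for one tracked particle.

    positions: list of (x, y) coordinates in micrometers, one per frame
    dt: time between frames in seconds
    """
    # Distance moved between each pair of consecutive frames
    steps = [
        math.dist(positions[i], positions[i + 1])
        for i in range(len(positions) - 1)
    ]
    # Straight-line distance from first to last position
    net_displacement = math.dist(positions[0], positions[-1])
    # Total distance traveled along the track
    total_path = sum(steps)
    # Mean speed along the path, in micrometers per second
    mean_speed = total_path / (dt * len(steps))
    return net_displacement, total_path, mean_speed

# Example track: a particle drifting rightward with a small vertical wobble
track = [(0.0, 0.0), (1.0, 0.5), (2.0, 0.0), (3.0, 0.5)]
net, path, speed = track_metrics(track, dt=2.0)
```

Comparing net displacement with total path length is one simple way such tools distinguish directed motion from random wandering.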
Though companies are concentrating on integration, some offer several different packages catering to different research communities. Nikon's NIS-Elements C, for example, is an integrated confocal package launched in late January 2008, joining three existing versions of NIS-Elements. “We don't want people who don't need the features to have to pay for them,” says Laura Sysko, software product manager at Nikon Instruments. “We put options in. There is a core batch of functions that you get.”
The number of software functions is increasing all the time, says Duncan McMillan, a product marketing manager at Carl Zeiss Microimaging in Thornwood, New York, USA. “At the same time, researchers expect the systems to be simpler to use,” he adds. The newest version of Zeiss's ZEN software for laser scanning microscopes incorporates biologists' feedback on usability. “One of the criticisms of the earlier version was that people would end up with an enormous number of windows open,” he says. Now users can customize their workspace to show only the controls they need and rearrange the interface controls on the screen. The software also contains a new feature called “smart setup” that automatically configures experimental parameters, such as exposure time and imaging resolution, based on the user's choice of fluorophore. Although it accompanies Zeiss's newest point-scanning, spectral confocal microscope system, the LSM 710, it is compatible with all Zeiss laser scanning microscopes.
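A feature like “smart setup” can be imagined as a lookup from the chosen fluorophore to suggested acquisition settings. The sketch below is purely illustrative: the preset table, field names and values are invented for this example and are not Zeiss's actual parameters.

```python
# Hypothetical fluorophore presets (values illustrative only)
FLUOROPHORE_PRESETS = {
    "DAPI":    {"excitation_nm": 405, "emission_band_nm": (420, 480), "exposure_ms": 50},
    "GFP":     {"excitation_nm": 488, "emission_band_nm": (500, 550), "exposure_ms": 100},
    "mCherry": {"excitation_nm": 561, "emission_band_nm": (580, 630), "exposure_ms": 150},
}

def smart_setup(fluorophores):
    """Return one suggested-settings dict per requested fluorophore."""
    return [FLUOROPHORE_PRESETS[name] for name in fluorophores]

# A user picks two labels; the software proposes laser lines,
# detection bands and exposure times for each.
settings = smart_setup(["GFP", "mCherry"])
```

In a real system the presets would come from a curated spectral database and would be refined against the attached hardware, but the lookup idea is the same.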
Usability, as well as consistency of interfaces, has been a focus for Leica Microsystems, headquartered in Wetzlar, Germany. Its software developers have created very similar platforms across different widefield and confocal microscopy systems, according to Joanne Fallowfield, leader of product management in live-cell widefield microscopy at Leica Microsystems. “We thought the common platforms would be a smart move. Widefield or confocal users often are using the systems for the same types of experiments,” she says, adding that there is commonality in the way users want to access, arrange and analyze their data. Leica sought input from a company that specializes in user interfaces. “They came back to us with a bible of software-interface features that we stuck to,” she says. For example, Fallowfield says Leica designed its interfaces to mimic the workflow at the microscope, and where imaging steps are common between platforms, the same buttons appear in the same places throughout the process.
Many commercial developers agree that a single system cannot solve every biologist's problem, but they say their products cater to 80–90% of the market. Spector was in the remaining group. So when he needed an algorithm to track the dynamics of promyelocytic nuclear bodies as cells went through drastic morphological changes during mitosis, he called on a long-time collaborator, Roland Eils at the Center for Quantitative Analysis of Molecular and Cellular Biosystems at the University of Heidelberg in Germany, who specializes in computational biology. “Mathematical approaches are becoming more integrated in biology, which is happening at an amazing pace, allowing us to mine critical spatial and temporal information from our datasets,” Spector says.
For those without a colleague in computational biology, freeware developers are creating new plug-ins and programs and improving existing ones. Brown says it is theoretically possible, though not easy, to perform all microscopy functions with freeware. Freeware developers are working on advancing automation and hardware compatibility in image acquisition and analysis programs and, in some cases, on integrating several functions in the same program.
μManager, a free, open-source image acquisition program, was publicly launched last year and initially supported about eight hardware components from Ron Vale's lab at the University of California, San Francisco, where developers Nico Stuurman and Nenad Amodaj initiated the project. “We started with those hardware components we have in our own lab, but there are many different microscopes, different pieces of hardware,” Stuurman says. “Slowly over time we've been adding support for many of those.” Now more than 50 devices, including cameras, microscopes, filter wheels, shutters and motorized stages, are supported by the program, though this number is hard to track because end users have also started adding code, Stuurman says. The software has about 1,300 registered users and uses the US National Institutes of Health (NIH) freeware package ImageJ for analysis. Smaller hardware and camera companies especially are now adding support for μManager. “They really liked the idea,” Stuurman says. “Now they can make their hardware work in this software package.”
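One way a program can support dozens of devices is to hide each vendor's control code behind a common adapter interface, so that acquisition code never deals with vendor specifics directly. The pure-Python sketch below illustrates that general pattern; the class and method names are hypothetical and do not reflect μManager's actual API.

```python
class ShutterAdapter:
    """Common interface that every vendor-specific shutter driver implements."""
    def open(self):
        raise NotImplementedError
    def close(self):
        raise NotImplementedError

class DemoShutter(ShutterAdapter):
    """A stand-in for one vendor's shutter driver."""
    def __init__(self):
        self.is_open = False
    def open(self):
        self.is_open = True
    def close(self):
        self.is_open = False

class Core:
    """Registry that lets acquisition code address any device by label."""
    def __init__(self):
        self._devices = {}
    def load_device(self, label, device):
        self._devices[label] = device
    def device(self, label):
        return self._devices[label]

core = Core()
core.load_device("Shutter", DemoShutter())

# Acquisition code opens the shutter by label, would snap an image,
# then closes it, without knowing which vendor's shutter is attached.
core.device("Shutter").open()
# ... camera exposure would happen here ...
core.device("Shutter").close()
```

Swapping in a different vendor's shutter then only requires registering a different adapter, which is why end users and small companies can add device support themselves.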
Established freeware packages such as CellProfiler, the Open Microscopy Environment and NIH ImageJ, an open-source Java-based image-processing program, have been improving compatibility and functionality. In June 2008, the Open Microscopy Environment, a publicly funded, open-source project led by Jason Swedlow of the University of Dundee in Scotland and others, released the Beta3 version of OMERO, a freely available program that helps researchers visualize, manage and annotate data about the conditions under which images were gathered. “Once researchers have metadata and the metadata model, they can start to build software to use those data and do useful things,” Swedlow says. The latest version adds support for larger images, complex annotations and tagging, and shared analyses. NIH ImageJ, developed and maintained by Wayne Rasband at the Research Services Branch of the National Institute of Mental Health, now has more than 300 macros and 500 plug-ins available, a substantial increase from the 70 macros and 100 plug-ins available just four years ago.
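To see why structured acquisition metadata is useful, consider a minimal record of imaging conditions that downstream analysis code can query, loosely in the spirit of the metadata model Swedlow describes. The field names and values below are illustrative assumptions, not OMERO's actual schema.

```python
# A minimal, hypothetical metadata record for one acquired image stack
image_metadata = {
    "objective_magnification": 60,
    "pixel_size_um": 0.11,   # physical size of one camera pixel, micrometers
    "z_step_um": 0.3,        # spacing between optical sections
    "channels": [
        {"fluorophore": "GFP", "excitation_nm": 488, "exposure_ms": 100},
        {"fluorophore": "mCherry", "excitation_nm": 561, "exposure_ms": 200},
    ],
}

def physical_width_um(metadata, pixels_wide):
    """Convert an image width in pixels to micrometers using the metadata."""
    return pixels_wide * metadata["pixel_size_um"]

# With the pixel size stored alongside the image, any analysis tool
# can report measurements in physical units rather than pixels.
width = physical_width_um(image_metadata, 512)
```

Without such a record, the pixel size lives only in a lab notebook; with it, quantitative comparisons across instruments and labs become possible.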
The challenges ahead
Commercial and freeware developers are facing many challenges, including development of new image-analysis options and management of large volumes of data, sometimes reaching the level of terabytes. And live-cell and fluorescence imaging continue to be hotspots of software development, according to commercial software product managers.
But adding new functions to any software is not something companies such as Leica take lightly; the company has laid out a software architecture that it tries to keep consistent. “I think that one of the most challenging things is to make very feature-rich software but keep the complexity of the interface down,” says Fallowfield. “One of the worst things you could do is add too many functions at the cost of increasing the complexity to the point where the customer is confused.” Sysko echoes these sentiments, adding that scientists using their products have vastly different needs. “The downside to that is that you get a really long list of what you put in your software,” she says.
Many developers are focusing on improving image analysis with larger datasets, Sysko says. “There's the hardware challenge of making sure we have the right memory, fast servers [and] data storage. Because of the sophistication of the devices that make all [these] data possible, on the back end we need the structure to manage, deal with it and make sense of it,” Sysko says. Recognizing patterns and structures using software will be an ongoing challenge and area of focus, Wicklund says. “Just with the expansion of computer speeds, we're able to do it more efficiently.”
And for end users, image analysis remains a challenge, especially at core facilities like Brown's that are trying to add more support for analysis. Users bring their data to the facility, where Brown teaches them one-on-one how to perform analyses; a lesson can last several hours. “We don't have enough staff to meet the demands. There's so much functionality in the programs,” she says. “Typically people just want the answers to their questions.” See Table