Image of a blood stem cell dividing in real time. Credit: Reya lab, Duke University

Stem cells are defined by their remarkable ability to self-renew and differentiate into specialized cells. But even after careful sorting, a single population of stem cells is dynamic: some divide rapidly and others more slowly; some differentiate, others self-renew; some can give rise to more lineages than others. Because of this variation, population studies of stem cells are unable to accurately address essential questions, such as defining discrete steps from a single stem cell to a complex population of cells.

“There are very few people who pay attention to the advantages and importance of studying single cells,” says Ron McKay, chief of the Laboratory of Molecular Biology at the National Institute of Neurological Disorders and Stroke in Bethesda, Maryland. “They talk as if they do. They use a FACS machine and act as if they have single-cell data. But they don't. They have data on a population, and that's a completely different thing.”

Although single-cell analysis is still too new to have generated a wealth of literature about the lineages a single stem cell can follow, the genes it can express and the way it behaves, voices from many different fields are emphasizing its importance.

Taking a look


Imaging individual cells is one of the most sought-after achievements in cell biology, but it is perhaps the most challenging. To image single stem cells, researchers use time-lapse photography to take high-resolution pictures every 2–3 minutes, often for days at a time. Not only must the cells' movements be constantly tracked to keep them in frame, but the thousands of resulting images must also be processed, manually scrutinized and statistically analyzed.
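
A rough, back-of-the-envelope calculation illustrates the scale of data this produces; the imaging interval, run length and file size in the sketch below are assumed for illustration rather than taken from any particular system.

# Rough estimate of frames and raw storage for a single imaging position.
# All parameters are illustrative assumptions, not values from a real rig.
interval_min = 2.5    # one image every 2-3 minutes
run_days = 5          # runs "often for days at a time"
frame_mb = 8.0        # assumed size of one high-resolution frame, in megabytes

frames = int(run_days * 24 * 60 / interval_min)
storage_gb = frames * frame_mb / 1024
print(f"{frames} frames per position, roughly {storage_gb:.0f} GB of raw images")
# prints: 2880 frames per position, roughly 22 GB of raw images

Multiply that by many imaging positions and multi-generation lineage experiments, and the storage and data-handling demands described below follow quickly.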

At the University of Waterloo in Ontario, Canada, chemical engineer Eric Jervis generates lineage trees of many generations of stem cells and makes movies of individual cells differentiating or self-renewing. But the research requires robotic microscopes, canyons of hard-drive space and custom software. Even with the help of talented students, the equipment design took a year to complete [1]. “It's technically challenging. It requires a renaissance-type researcher to be able to span robotic imaging, database and data mining, and stem cell biology. If you don't have all three, you can't do anything,” says Jervis.

Despite his willingness to share the details of his system, Jervis is more likely to receive cell samples in the mail for imaging than to see other labs adopt the technique, he says. At the long-term live-cell imaging core facility for the Canadian Stem Cell Network, Jervis now receives samples each week from labs across Canada in need of his tools.

In the United States, even with the support of the National Institutes of Health infrastructure, it took McKay more than six years to construct his stem cell imaging system: building a live-cell chamber, constructing a microscope and camera to track and image the cells, and developing software to handle the data. “We spent an enormous amount of time figuring out how to do these experiments. But in the end, I think it was worth it,” says McKay. Using the system, he recently published an analysis of how central nervous system stem cells transition from one state to another, including the cells' response to ciliary neurotrophic factor, a cytokine thought to direct neural stem cells toward an astrocytic fate [2].

It was also worth it for Tannishtha Reya, who studies stem cell fate and cancer at Duke University Medical Center in Durham, North Carolina. “Imaging is just so powerful, I can't get enough of it,” she laughs. It took Reya more than a year to develop an imaging system able to track individual haematopoietic stem cells (HSCs); her greatest challenge was getting the cells to stay put long enough to photograph them. (The solution? A layer of stromal cells to slow their movement.) Once the apparatus was completed, Reya used it to compile videos of symmetric and asymmetric divisions in HSCs, showing for the first time that HSCs undergo both types of division and that the balance between the two can be influenced by microenvironmental cues and subverted by oncogenes [3].

For now, single-cell imaging capabilities remain sequestered in labs lucky enough to have the know-how, stamina and funding to develop and maintain them. “Technically, it's extremely hard to do,” says McKay. For the technology to spread, he says, the process will need to be simplified. Fortunately, another single-cell technology has proven more portable and attainable — it can be bought from a local sales representative.

Tracking expression

The quantitative real-time polymerase chain reaction (qRT-PCR) remains the experiment of choice for analyzing cellular gene expression. But for single cells or small populations of cells, the extensive DNA amplification needed in conventional RT-PCR techniques often introduces noise and bias. In addition, genes with abundant transcripts are more likely to be detected, and typically, only a few genes can be analyzed at a time.
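
The source of the bias follows from the basic arithmetic of PCR: each cycle multiplies the template by roughly (1 + E), where E is the amplification efficiency, so small transcript-to-transcript differences in efficiency compound exponentially. The sketch below uses assumed efficiencies, copy numbers and cycle counts purely for illustration.

# Illustrative only: small differences in per-cycle efficiency compound into
# large biases when amplifying from single-cell amounts of template.
start_copies = 10    # an assumed low-abundance transcript in one cell
cycles = 35

for efficiency in (0.99, 0.90, 0.80):    # fraction of molecules copied per cycle
    final = start_copies * (1 + efficiency) ** cycles
    print(f"efficiency {efficiency:.2f}: ~{final:.2e} copies after {cycles} cycles")

Dropping the assumed efficiency from 0.99 to 0.90 changes the final yield roughly five-fold, which is why two transcripts present at similar levels in a single cell can emerge from amplification looking very different.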

Enter the microfluidics chip.

In 1999, Stephen Quake, now at Stanford University in California, and colleagues founded the company Fluidigm, based in San Francisco, to commercialize the integrated fluidic circuit technology developed in his laboratory [4]. Often referred to as a 'lab on a chip', the bottle cap–sized devices, laced with silicone fluidic circuits, require only minute amounts of samples and reagents to carry out a variety of experiments, including the measurement of gene expression. One of Fluidigm's products, a microfluidics chip developed by the company's chief scientific officer Marc Unger and his team, has an extensive miniature plumbing system in which samples from up to 96 single cells can be tested against 96 genes, giving 9,216 individual qPCR reactions in 75 minutes. “The volume requirements are so low that we've done as many as 1,000 genes off an individual cell,” says Gajus Worthington, cofounder and chief executive of the company. “Most researchers using the tool are typically looking at between 50 and 100 genes,” he notes, citing such scientists as Irving Weissman at Stanford, Shinya Yamanaka at Kyoto University in Japan and Toshio Suda at Keio University in Japan.
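
The reaction count follows directly from the layout described above: each sample is paired with each gene assay, so the number of reactions is simply the product of the two. The small sketch below assumes a 96 × 96 format and adds a comparison with conventional 384-well plates purely for scale.

# The reaction count is just the product of sample and assay inlets (assumed 96 x 96).
samples = 96
assays = 96
reactions = samples * assays
print(f"{samples} samples x {assays} assays = {reactions} parallel qPCR reactions")
# prints: 96 samples x 96 assays = 9216 parallel qPCR reactions

# For comparison: the same number of reactions would fill 24 conventional 384-well plates.
print(f"equivalent to {reactions // 384} full 384-well plates")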

Single-cell analysis, including imaging and gene expression, can reveal information that population studies cannot. Credit: Fluidigm Corporation

Suda, who admired the technology while visiting a colleague's lab at Stanford in 2006, uses the Fluidigm chip to examine subpopulations of HSCs, which, although highly purified, can vary widely in function and gene expression. He recently used the chip to analyze key genes involved in the expression of the cell-adhesion molecule N-cadherin in HSCs, a controversial area of research, and his results are currently in press.

Mylene Yao, who studies early mammalian embryos at the Stanford School of Medicine and is a past consultant for Fluidigm, heard of the chip from colleagues. With collaborators, her team used it to investigate the role of Oct4, a pluripotency regulator in embryonic stem cells, and found it to be a critical regulator of the early embryo's gene network [5]. Because of the time and cost involved in isolating enough two-cell mouse embryos for conventional RT-PCR, “we wouldn't even have attempted what we did”, says Yao. “The chip allowed us to validate a large number of genes in a very short time with a limited amount of biological materials.” During the review process for their paper, reviewers expressed curiosity about the new technology rather than scepticism or resistance, Yao recalls, perhaps because of its extensive use in other Stanford labs and the highly reproducible quality of the data.

But for all its positives, the chip is still restricted to gene-expression studies and cannot do all the work for you. “Microfluidics allows you to multiplex an experiment and make it more efficient,” says Peter Zandstra, chair of stem cell bioengineering at the University of Toronto, who doesn't currently use the technology, “but you still need to know what genes to look for.”

Such technologies might be on the horizon. In the upcoming issue of Nature Methods, Applied Biosystems and collaborators, including M. Azim Surani of the University of Cambridge, describe a technique for high-throughput whole-transcriptome analysis of single cells. Using a single mouse blastomere, the researchers detected the expression of 75% more genes than they could with microarray techniques and pinpointed 1,753 previously unknown splice junctions [6]. The protocol, which uses current off-the-shelf reagents, will be made available to customers shortly after publication, says Kaiqin Lao, principal scientist in Applied Biosystems' molecular cell biology unit and a coauthor of the paper.

The technology is an interesting and valuable step forward, says Zandstra. However, he notes, “we still do not know, of all the genes and splice variants that this technology will output, which are functionally predictive of a particular stem cell state.” It is possible, he adds, that by comparing results from stem cells with those from closely related non-stem cells, one might pinpoint the gene candidates worth further investigation.
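A minimal sketch of the comparison Zandstra suggests might look like the following; the gene names and expression values are invented for illustration, and any candidate emerging from such a ranking would still need the functional validation he describes.

# Purely illustrative: rank genes by how differently they behave in single
# stem cells versus closely related non-stem cells. Names and values are made up.
from statistics import mean

stem_cells = {"GeneA": [8.1, 7.9, 8.4], "GeneB": [2.0, 2.3, 1.8], "GeneC": [5.0, 5.2, 4.9]}
non_stem   = {"GeneA": [3.2, 2.9, 3.5], "GeneB": [2.1, 2.2, 1.9], "GeneC": [5.1, 4.8, 5.0]}

ranked = sorted(stem_cells, key=lambda g: abs(mean(stem_cells[g]) - mean(non_stem[g])),
                reverse=True)
for gene in ranked:
    difference = mean(stem_cells[gene]) - mean(non_stem[gene])
    print(f"{gene}: mean difference {difference:+.2f}")
# GeneA stands out here; it would be the candidate worth functional follow-up.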

The next players

Beyond cell imaging and gene expression, other applications of single stem cell analysis are hardly thick on the ground. No one carrying out proteomic or metabolomic studies on individual stem cells could be located for this article. Such research is typically done on whole organisms, although recent studies in 'micrometabolomics' and 'microproteomics' suggest that technologies are advancing in these fields [7,8]. But for now, applying proteomics to a single cell is “totally impossible”, says Howard Gutstein, a proteomics researcher at the University of Texas M.D. Anderson Cancer Center in Houston. Although current techniques do allow researchers to tag known proteins inside a cell, it is not possible to analyze a whole proteome or to determine differences in protein expression between individual cells, he says.

Fluidigm continues to develop chips with single-cell applications and, with a grant from the California Institute for Regenerative Medicine, is currently developing a cell-culture chip for automated differentiation of stem cells (or, in the case of induced pluripotent stem cells, dedifferentiation). The research is based on previous work in Quake's lab, and Unger predicts that the chip will be available in two years [9].

“The challenge of developing a new technology that will be widely accepted for single-cell stem cell measurements is quite high,” says Zandstra, especially for validation studies to prove that a candidate is indeed a stem cell. For that, functional assays remain the gold standard: generating a mouse by tetraploid complementation in the case of suspected mouse embryonic stem cells, or long-term reconstitution of a mouse haematopoietic system for potential HSCs. “Ideally, what we'd like to do is ask what markers are associated with that cell so we don't have to do this ridiculous assay,” he says.

A major concern

In spite of their limited application so far, technologies for examining single stem cells are generating a great deal of enthusiasm, and proponents point out that they are more than just exciting: they are absolutely essential to the field. One cannot determine what pathways are directing cell specification without looking at a single cell, says McKay. “If you have a population of cells, you're lost... If you can't actually look at the given cell and ask what state is it in, then you can't conclude that you know when fate has been specified.” Worthington agrees: “Each stem cell needs to be treated as an individual because they're not necessarily in the same state at every point in time. You have to look at them as individuals.”

What's more, single-cell analysis may be a prerequisite in some circumstances. For therapeutic applications, stem cells will have to undergo detailed in vitro characterization before grafting. “Say I have a therapy where I need 10⁹ cells,” says Jervis. “As an engineer, I need to manufacture 10⁹ perfect things, because if one of those cells is transformed and is going to give rise to a tumour, I now have a serious problem.” He pauses. “It is a single-cell problem. I think this is one aspect with which the field hasn't come to grips. Single-cell analysis is going to be where it's at.”
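
Jervis's worry is easy to put in numbers: at a dose of 10⁹ cells, even vanishingly small per-cell failure rates leave many aberrant cells. The rates below are hypothetical, chosen only to make the arithmetic concrete.

# Hypothetical per-cell transformation rates, used only to illustrate the scale.
dose = 10**9    # cells per therapeutic dose, as in Jervis's example

for rate in (1e-6, 1e-7, 1e-8):
    expected = round(dose * rate)
    print(f"failure rate {rate:.0e}: about {expected} transformed cells per dose")
# prints roughly 1000, 100 and 10 transformed cells, respectively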
