There are events of the year, persons of the year, images of the year.... We could not resist: why not a Method of the Year?
Methods are a driving force of scientific progress. We think they should be celebrated as such. So the editors set out, a couple of months ago, to select the most notable method of 2007—not a method just off the inventor's bench but rather one that came into its own in 2007 and had a wide-ranging impact. But our discussion was quick: we soon had a clear winner in next-generation sequencing.
It is actually not one single method but several, developed in parallel. The first two next-generation sequencing methods made their official entry on the scene in 2005, with publications by the groups of Jonathan Rothberg and George Church, and two related platforms became commercially available within two years. A feature on page 11 recounts the development of these methods and some of the events in 2007 that helped firmly establish them in the community.
The two concepts at the core of next-generation sequencing, which permitted its dramatic increase in throughput over traditional Sanger sequencing, are DNA amplification without bacterial cloning and DNA sequencing without chain termination. First, each fragment of DNA is PCR-amplified independently, in a way that keeps the amplification products spatially clustered. Then, Sanger's terminator technique is replaced by sequencing either by synthesis or by ligation. (As a reference for the nonspecialist, a Primer on page 15 offers a glimpse into the inner workings of the main sequencing techniques.)
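For readers who like to see a principle in executable form, here is a purely illustrative toy sketch of the sequencing-by-synthesis idea behind pyrosequencing, not any vendor's actual pipeline; the function names (`pyrosequence`, `call_read`) and the flow order are our own invention. Bases are flowed in a fixed cycle, and the signal at each flow is proportional to the length of the homopolymer run incorporated, which is also why long homopolymers are hard to call accurately in practice.

```python
def pyrosequence(template, flow_order="TACG", n_flows=32):
    """Toy flowgram: for each flowed base, the recorded signal equals the
    length of the homopolymer run of that base at the current position."""
    signals, pos = [], 0
    for i in range(n_flows):
        base = flow_order[i % len(flow_order)]
        run = 0
        # count how many consecutive template bases match the flowed base
        while pos + run < len(template) and template[pos + run] == base:
            run += 1
        signals.append((base, run))
        pos += run
    return signals

def call_read(signals):
    """Reconstruct the read from the flowgram signals."""
    return "".join(base * n for base, n in signals)

read = call_read(pyrosequence("TTAACG"))  # recovers "TTAACG"
```

In a real instrument the per-flow signal is a noisy light intensity rather than an exact integer, so a run of, say, seven identical bases is easily miscalled as six or eight; the short loop above makes that failure mode easy to see.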
What impressed us when thinking about next-generation sequencing as the Method of the Year was the broad range of its applications and the fact that it has already inspired novel uses beyond its original purpose. Stephan Schuster, on page 16, illustrates this broad impact while focusing on applications directly related to genome sequencing, the original drive for these methods' development. He shows how—more than just extending the capacity of DNA sequencing—the new technologies have enabled projects that were unthinkable a few years ago.
Barbara Wold and Rick Myers on page 19 describe how the new technical capacity led scientists to rethink assays that were not directly related to genome sequencing. Indeed, applications have sprung up in which short DNA sequences are not the object of research per se, but are simply tags for identification and counting. Any assay that can be adapted to a DNA readout can now be thought of in terms of massively parallel interrogation. In many ways, next-generation sequencing has begun to supplant microarrays.
Remarkably, the development of the technologies occurred largely within the realm of commercial organizations. A notable exception is the technique from the laboratory of George Church, who is now pursuing his platform development in an open-source model, aiming to distribute 'at cost' an instrument that uses off-the-shelf reagents. Although it is likely that commercial drive and a competitive environment have accelerated the development and dissemination of the commercial techniques, these come with a steep price tag. According to Schuster (p. 16), the set-up cost, combined with skepticism about fundamentally changing experimental ways, led to an initially slow response from investigators and funding agencies.
Although the achievements of 2007 have likely convinced the skeptics, the price tag remains an issue for many. Though the new technologies offer a dramatic cost reduction compared to Sanger sequencing, some of the new possibilities they have unleashed, such as individual genome sequencing and studying functional regulation of gene expression, call for further cost cutting. The other challenge is that scientists have only started to understand these methods' performance and limitations. Tools are needed to overcome specific limitations such as short read lengths, uneven confidence along the length of reads and poor performance with homopolymers.
Although the choice of next-generation sequencing as Method of the Year was uncontroversial among our team, we did have other ideas and enthusiastic discussions. To share that excitement, we included a shortlist of Methods to Watch. It is an incomplete and subjective selection, established by Nature Methods with the input of other editors at Nature, Nature Reviews and Nature Research journals. Some of these Methods to Watch are, thanks to recent developments, on the cusp of transforming fields of research. Others, by contrast, do not yet have a technical solution but rather represent areas in which methodological developments are sorely needed.
We welcome your comments on our choices as well as your suggestions of other methods to keep an eye on. (To share your thoughts, please visit Methagora.) We firmly intend this event to become an end-of-the-year tradition, and we hope for your participation in next year's nominations!
Method of the Year. Nat Methods 5, 1 (2008). https://doi.org/10.1038/nmeth1153