
As it strives to maximize profits, industry devotes considerable effort to improving the efficiency of its processes. Henry Ford's assembly line revolutionized automobile manufacturing, and subsequent developments in workplace efficiency throughout industry provided some companies with substantial competitive advantages over less efficient rivals.

Although competition in academic research is alive and well, academics compete mostly on gross output—that is, on their discoveries. Analysis of how to reduce the resources and effort required to produce that output is typically secondary or nonexistent. Obviously, turning all academic research into an efficient assembly line is neither possible nor desirable and would be inimical to some of its most cherished values, yet academics may have something to learn from industry in this regard.

Researchers often increase spending on personnel, kits and instruments in an effort to increase the scientific output from their laboratory, but this is difficult in the face of stagnant or decreasing research funding. Increased efficiency can help mitigate this resource problem.

One way of improving overall research efficiency is by analyzing and optimizing research methods and tools. Although published methods all presumably work at some level, many have not been fully optimized. A researcher developing a method only needs to optimize it sufficiently for his or her own use, and there is typically little incentive to go further. Although we strive to avoid this in the work we publish, many methods remain under-characterized and under-optimized.

Unfortunately, this situation can lead to gross inefficiencies when methods and tools are widely adopted. It is all too common for researchers to waste considerable effort trying without success to implement an under-developed method or to use the wrong tool—or the right one in the wrong way—owing to insufficient performance data.

Because commercial kit and instrument suppliers expend substantial effort optimizing the performance of their products, customers expect them to be reliable. But researchers must still decide between competing products, not to mention lower-cost noncommercial alternatives.

If scientists in large labs with the necessary resources devoted some of them to comparing or optimizing research methods and published the results for the rest of the community, research overall would be more efficient. Two examples of such efforts appear in this issue. On page 424, Thomson and colleagues describe an impressive pairwise analysis of the components of defined stem cell culture media. This allowed them to cut the number of components added to basal medium by more than half while increasing its reliability. The result is a simpler, lower-cost and better-performing medium for stem cell researchers.

Often, performance data alone will guide researchers to better choices. On page 393, Drobizhev and colleagues describe the results of their efforts to characterize the two-photon excitation characteristics of fluorescent proteins. Before these data were available, researchers had little guidance on what fluorescent protein and excitation wavelength to choose for a particular two-photon imaging application. Now researchers can more easily avoid wasted effort and non-optimal choices.

But what about funding for this kind of work? Because publications analyzing method performance are uncommon, funders find them difficult to assess, and there is consequently no ongoing commitment to supporting such analyses. The US National Institutes of Health bravely supported the fluorescent protein analysis through their “Development of High Resolution Probes for Cellular Imaging” program, but similar examples are scarce.

One mechanism by which such studies are funded is through dedicated consortia that have method assessment or optimization as a goal. The MicroArray Quality Control consortium spearheaded by the US Food and Drug Administration assessed the reliability of microarray-based measurements. Similarly, the International Stem Cell Initiative is testing stem cell culture methods, and structural genomics consortia such as the Protein Structure Initiative are optimizing protein expression, purification and crystallization pipelines that other researchers can take advantage of.

Efforts to conduct comprehensive analyses are laudable but difficult, and they can cut into the time researchers have for their primary investigations. Research will always be grueling, and the scientific establishment should not be built on the backs of martyrs. But rather than relying on educated guesses, popularity or testimonials, researchers can make more informed choices when high-quality, quantitative side-by-side analyses of tools, or performance comparisons between alternative methods, are available. Such work can and should result in high-impact papers and community accolades for the authors.

We intend to publish additional examples of such analyses from both individual labs and multilab collaborations. If we identify an area in need of such an analysis, we will attempt to recruit researchers in a suitable lab to conduct the study, with the goal of publishing the results in Nature Methods through our established peer-review process. We encourage input and inquiries from the community in this endeavor.