Washington

The US National Institutes of Health (NIH) faces a tough balancing act between funding large institutes and supporting individual laboratories in the still-emerging discipline of proteomics, said scientists at a workshop on the topic.

Twenty-four senior researchers were invited to the NIH campus in Bethesda, Maryland, late last month to tell its managers how best to invest in proteomics, which seeks to establish the structure and function of all human proteins. But the sheer scope of such an enterprise clearly perplexed some of the participants, many of whom voiced suspicion of any large-scale approach akin to the Human Genome Project.

In theory, proteomics could encompass the study of all known human proteins, complexes of proteins, and interactions of proteins with small molecules. But unravelling such a vast array of information in a systematic way would require new and expensive technologies. And although some researchers at the workshop argued that the NIH should invest in large centres that would develop and exploit such technologies, most said that they favour approaches that would encourage individual investigators to study small parts of large, complex protein networks.

Several delegates voiced concern that a high-throughput approach to finding proteins would be hugely expensive but yield little insight. John Yates, a mass-spectrometry specialist at the Scripps Research Institute in La Jolla, California, dismissed some such efforts as “low-IQ proteomics”.

Other researchers argued that without uniform standards, such as the ones that have been adopted for gene microarrays, even existing data on proteins will be difficult to replicate between laboratories. Still others said that large data sets will prove useless without individual investigators who are motivated to wade through them to gain new insights into their favourite systems.

“We need a benevolent champion for individual molecules,” said Steve Clarke, a biochemist at the University of California, Los Angeles (UCLA). “We have tremendous amounts of information already and so much of it goes unused.”

But others declared support for a 'big-science' approach to the subject. Ruedi Aebersold, a co-founder of the Institute for Systems Biology in Seattle, Washington, joined Peer Bork of the European Molecular Biology Laboratory in Heidelberg, Germany, in advocating large-scale protein profiling. They said that the NIH should set up pilot centres stocked with mass spectrometers that would profile the proteins expressed in healthy and diseased individuals.

Other participants questioned whether individual researchers will be able to use the data generated by high-throughput technologies. “How are we going to connect all these data sets to the laboratories of the people asking the questions?” asked William Studier, a structural-genomics expert at the Brookhaven National Laboratory in New York state.

David Eisenberg, director of the US Department of Energy's Laboratory of Structural Biology and Molecular Medicine at UCLA, said: “I think there was consensus that it's probably not going to be possible to have a scaled-up human proteomics project in the next couple of years.”

NIH managers attending the workshop included Francis Collins, director of the National Human Genome Research Institute, who said he would use its guidance to help frame his institute's next five-year plan, which is due out next spring.