The battle over the long-awaited reform of Italian research policy1 is taking place behind closed doors, and the scientific community is in the dark. There is no indication that a transparent system of peer review2 will be introduced in the near future. Indeed, any discussion of reform will remain abstract and ineffective so long as there are no objective indicators of scientific performance.
In the past 15 years, panels and committees have been set up to evaluate the scientific performance of government-funded research organizations, but neither detailed reports nor action have followed. Even the recent working group of the Ministry of Universities and Research3 has provided no more than a laudable declaration of intent and a comprehensive catalogue of scientific centres, research activities and numbers of employees.
The 71 pages of graphs and comments provide no data comparing cash flow with scientific output per research centre or project, nor any basic information on how, where and with what results the money is spent.
This is unfortunate, because a fast, reasonably objective and cost-effective way of measuring the scientific performance of research institutions is to use bibliometric indicators4. Statistics such as the number of publications and their relative impact factors can yield quite surprising results. For instance, in a pilot study5 we scrutinized the publication output of four main categories of scientific institution in the field of environmental research: universities, CNR (National Research Council), ENEA (National Agency for New Technologies, Energy and the Environment), and all the other smaller organizations.
From the results reported in Table 1, Alison Abbott's claim that ENEA should “have less to fear” from scientific reform is disputable1. In fact, we are still waiting to see the “thousand flowers” bloom as was claimed by ENEA's president, Nicola Cabibbo — the quality and productivity of ENEA research in the environmental field are among the lowest to be found in Italian science.
The Ministry of Universities and Research could easily extend this study to other scientific fields and compare the research performances of different scientific institutions and research fields both within Italy and perhaps with those of other European countries. We are aware that bibliometric indicators are not exempt from criticism6,7 and that they should be used with care in conjunction with strategic, cultural and political considerations. But it cannot be denied that they can bring objective and valuable information to bear on the waste of resources within the system.
Yet, after years of discussion, the bogeyman of peer review based on bibliometric indicators still meets ingrained fears and strenuous resistance, because of its potential to reveal structural inefficiencies and, consequently, to trigger disturbing conflicts and bring the discussion under the spotlight of public attention.
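The kind of institutional comparison described above can be sketched in a few lines of code. The snippet below is a minimal illustration, not the method of the pilot study: the institutions, paper counts and impact factors are hypothetical example data, and the summary statistic (paper count plus mean journal impact factor per institution) is only one of many possible bibliometric indicators.

```python
# Minimal sketch of a per-institution bibliometric summary.
# All data below are hypothetical illustration values, not
# figures from the pilot study cited in the text.
from collections import defaultdict

# Each record: (institution, impact factor of the publishing journal)
publications = [
    ("Universities", 2.1), ("Universities", 3.4), ("Universities", 1.8),
    ("CNR", 2.5), ("CNR", 1.2),
    ("ENEA", 0.6),
]

def bibliometric_summary(records):
    """Return {institution: (paper count, mean journal impact factor)}."""
    grouped = defaultdict(list)
    for institution, impact in records:
        grouped[institution].append(impact)
    return {
        inst: (len(factors), sum(factors) / len(factors))
        for inst, factors in grouped.items()
    }

if __name__ == "__main__":
    for inst, (count, mean_if) in sorted(bibliometric_summary(publications).items()):
        print(f"{inst}: {count} papers, mean impact factor {mean_if:.2f}")
```

With real data, the same grouping could be extended to normalize output by funding received, which is precisely the cash-flow comparison the ministry report omits.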
Abbott, A. Nature 388, 609–610 (1997).
Abbott, A. Nature 383, 567 (1996).
Berlinguer, L. Linee per il riordino del sistema nazionale della ricerca scientifica e tecnologica. Università e Ricerca 3 (1997).
Gatto, M., De Leo, G., Menozzi, P. & Paris, G. Fifteen years of environmental research in Italy: an analysis of international publications produced between 1981 and 1995. Proc. S. It. Ecol. 8th Congress (1997).
Miflin, B. Nature 390, 12 (1997).
Motta, G. Nature 376, 720 (1995).
Metcalfe, N. B. Nature 376, 720 (1995).
Institute for Scientific Information (ISI) Topical Citation Report 1981-95 (ISI, Philadelphia, PA, 1996).