A renewed decree on research funding last year by the government of Flanders in northern Belgium advised that 40% of research evaluation should be based on bibliometric data. This involves a complex calculation that includes the number of publications and citations, and the impact factors for the journals of publication (see go.nature.com/mt9srg; in Dutch). We question the merits of this strategy, given the debatable value of impact factors in gauging research quality (see, for example, B. Alberts Science 340, 787; 2013).
The Flemish Interuniversity Council in fact advised Flemish universities in 2010 to reduce the importance of bibliometric data when weighing up researchers for appointment or promotion. In practice, however, such data have become even more important.
It is time to heed the widely recognized risk of overrating the quantity of scientific publications, which can compromise research quality and integrity.
Stroobants, K., Godecharle, S. & Brouwers, S. Flanders overrates impact factors. Nature 500, 29 (2013). https://doi.org/10.1038/500029b