A renewed decree on research funding last year by the government of Flanders in northern Belgium advised that 40% of research evaluation should be based on bibliometric data. This involves a complex calculation that includes the number of publications and citations, and the impact factors for the journals of publication (see go.nature.com/mt9srg; in Dutch). We question the merits of this strategy, given the debatable value of impact factors in gauging research quality (see, for example, B. Alberts Science 340, 787; 2013).
The Flemish Interuniversity Council in fact advised Flemish universities in 2010 to reduce the importance of bibliometric data when weighing up researchers for appointment or promotion. In practice, however, such data have become even more important.
It is time to heed the widely recognized risk of overrating the quantity of scientific publications, which can compromise research quality and integrity.
Stroobants, K., Godecharle, S. & Brouwers, S. Flanders overrates impact factors. Nature 500, 29 (2013). https://doi.org/10.1038/500029b