Nature and the Nature journals are diversifying their presentation of performance indicators.
Metrics are intrinsically reductive and, as such, can be dangerous. Relying on them as a yardstick of performance, rather than as a pointer to underlying achievements and challenges, usually leads to pathological behaviour. The journal impact factor is just such a metric.
During a talk just over a decade ago, its co-creator, Eugene Garfield, compared his invention to nuclear energy. “I expected it to be used constructively while recognizing that in the wrong hands it might be abused,” he said. “It did not occur to me that ‘impact’ would one day become so controversial.”
As readers of Nature probably know, a journal's impact factor for a given year is the mean number of citations received in that year by articles the journal published in the preceding two years. Journals do not calculate their impact factors themselves — the figure is calculated and published by Thomson Reuters.
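As a rough sketch of that arithmetic — with invented citation counts, and glossing over Thomson Reuters' exact rules about which items count as citable — the calculation looks like this:

```python
# Hypothetical 2015 citation counts for ten articles a journal
# published in 2013-14. The numbers are invented for illustration.
citations_2015 = [2, 3, 0, 5, 1, 4, 2, 200, 3, 0]

# Two-year impact factor: total citations received in 2015,
# divided by the number of articles published in 2013-14.
impact_factor = sum(citations_2015) / len(citations_2015)
print(impact_factor)  # → 22.0
```

Note how a single heavily cited paper (200 citations) lifts the figure far above what a typical article in this invented set receives.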
Publishers have long celebrated strong impact factors. It is, after all, one of the measures of their output’s significance — as far as it goes.
But the impact factor is crude and also misleading. It effectively undervalues papers in disciplines that are slow-burning or have lower characteristic citation rates. Being an arithmetic mean, it gives disproportionate significance to a few very highly cited papers, and it falsely implies that papers with only a few citations are relatively unimportant.
These shortcomings are well known, but that has not prevented scientists, funders and universities from relying too heavily on impact factors, or publishers (Nature’s included, in the past) from promoting them too enthusiastically. As a result, researchers use the impact factor to help them decide which journals to submit to — to an extent that is undermining good science. The resulting pressures and disappointments are deeply demoralizing, and in badly run labs they can encourage sloppy research that, for example, fails to test assumptions thoroughly or to take all the data into account before big claims are submitted.
The most pernicious aspect of this culture, as Nature has pointed out in the past, has been a practice of using journal impact factors as a basis for assessment of individual researchers’ achievements. For example, when compiling a shortlist from several hundred job applicants, how easy it is to rule out anyone without a high-impact-factor journal in their CV.
How to militate against such a metrics-obsessed culture?
First, an approach that some have applied in the past and whose time has surely come. Applicants for any job, promotion or funding should be asked to include a short summary of what they consider their achievements to be, rather than just to list their publications. This may sound simplistic, but some who have tried it find that it properly focuses attention on the candidate rather than on journals.
Second, journals need to be more diverse in how they display their performance. Accordingly, Nature has updated its online journal metrics page to include an array of additional bibliometric data.
As a part of this update, for Nature, the Nature journals and Scientific Reports, we have calculated the two-year median — the median number of citations that articles published in 2013 and 2014 received in 2015. Unlike the mean, the median is not distorted by a few highly cited outliers. (The two-year median is markedly lower than the two-year impact factor: for Nature, 24 versus 38, for example.) For details, see go.nature.com/2arq7om.
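The contrast between the two statistics can be seen in a small sketch (again with invented citation counts, chosen only to show how one outlier skews the mean):

```python
import statistics

# Invented 2015 citation counts for articles published in 2013-14;
# one runaway paper (150 citations) sits among modestly cited ones.
citations = [0, 1, 1, 2, 2, 3, 3, 4, 5, 150]

mean_citations = statistics.mean(citations)      # impact-factor-style mean
median_citations = statistics.median(citations)  # two-year median

print(mean_citations, median_citations)  # → 17.1 2.5
```

The mean suggests a typical article collects about 17 citations; the median shows that half the articles received fewer than 3, which is a far better description of this distribution.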
Providing these extra metrics will not address the problem, noted above, of differing citation characteristics between disciplines. Nor will it make much of a dent in impact-factor obsessions. But we hope that it will at least provide a better means of assessing our output, and put the impact factor in proper perspective.
However, whether you are assessing journals or researchers, nothing beats reading the papers and forming your own opinion.
Related links in Nature Research
Beat it, impact factor! Publishing elite turns against controversial metric 2016-Jul-08
We need a measured approach to metrics 2015-Jul-08
Cite this article
Time to remodel the journal impact factor. Nature 535, 466 (2016). https://doi.org/10.1038/535466a