Some researchers love to hate the journal impact factor, because it can be used inappropriately to judge the influence of individual articles. As part of a push for better article-level metrics, researchers at the US National Institutes of Health (NIH) have developed the Relative Citation Ratio (RCR), which they describe in a paper posted on the preprint website bioRxiv [1]. The RCR normalizes citation numbers across disciplines, so that articles from biomedical fields with different citation rates can be compared. The NIH says that it is using the metric to assess the influence of work that it has funded. But some scientists have debated on social media how accurately the metric reflects an article’s impact.
Researchers have developed a variety of impact measures that focus on an individual researcher or paper. The RCR is calculated by dividing the number of citations a paper has received by the average number of citations that articles in its field receive. That ratio is then benchmarked against the median RCR for all NIH-funded papers. This allows articles to be assessed on the basis of their relevance within their own field, so highly influential articles will be recognized even if they are published in an obscure journal, says George Santangelo, director of the NIH Office of Portfolio Analysis in Bethesda, Maryland, and a co-author of the RCR paper. “We want to identify influential papers wherever they might be published,” he says. “We can provide a measure of how each paper has influenced the field.”
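The two-step calculation described above can be sketched in a few lines. This is a minimal illustration of the idea, not the NIH's actual algorithm, and all of the numbers below are hypothetical: citations are first normalized by a field citation rate, then benchmarked against the median normalized ratio of a reference set of papers, so that the median reference paper scores 1.0.

```python
# Minimal sketch of the RCR idea (not the NIH's actual algorithm).
from statistics import median

def field_normalized_ratio(citations: int, field_citation_rate: float) -> float:
    """A paper's citations divided by the expected citation rate for its field."""
    return citations / field_citation_rate

def relative_citation_ratio(citations: int,
                            field_citation_rate: float,
                            benchmark_ratios: list[float]) -> float:
    """Benchmark a paper's field-normalized ratio against the median ratio
    of a reference set, so the median reference paper scores 1.0."""
    return field_normalized_ratio(citations, field_citation_rate) / median(benchmark_ratios)

# Hypothetical numbers: a paper with 30 citations in a field whose articles
# average 10 citations each, benchmarked against three reference papers.
benchmark = [0.5, 1.0, 2.0]  # field-normalized ratios of the reference set
print(relative_citation_ratio(30, 10.0, benchmark))  # 3.0
```

Because the score is a ratio of ratios, a paper that outperforms its own field stands out even if that field accumulates few citations overall, which is the point Santangelo makes about obscure journals.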
According to Santangelo, the NIH is now using the metric as one of many tools for strategic planning. For example, the RCR can identify influential areas as well as ones that are important but need more funding to have a greater impact. Santangelo stresses that the algorithm is still at an early stage and needs improvement. He says that the RCR will not replace experts when it comes to making decisions about funding. “Data-driven methods need to be tempered by subject-matter expertise,” he says. “The bottom line is that human administrators will continue to make decisions.”
Stefano Bertuzzi, executive director of the American Society for Cell Biology in Bethesda, praised the new metric in a blog post, calling it “stunning”. Bertuzzi applauds the NIH for moving away from the journal impact factor (JIF). He wrote that the metric “evaluates science by putting discoveries into a meaningful context. I believe that the RCR is a road out of the JIF swamp.”
But Ludo Waltman, a bibliometrics researcher at Leiden University in the Netherlands, criticized the metric in a blog post that was shared by many researchers on Twitter. He wrote that “the RCR metric doesn’t live up to expectations,” adding that it is complex and not very transparent. In an interview, Waltman said that this might stop researchers from adopting it. “In general, metrics that are relatively straightforward, simple and transparent will be easily accepted by researchers,” he says.
Another limitation is that the RCR has been applied only to papers in the biomedical field, which prevents other disciplines from making use of the metric, says Phil Davis, a publishing consultant in Ithaca, New York, who analyses citation and publication data. This could also skew the scores of certain papers. The RCR calculation includes the number of citations received by papers that have been cited by the article in question. If that article cites papers from fields outside biomedicine, that could lead to a lower RCR score, meaning that interdisciplinary articles could be unfairly penalized, says Davis.
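One way to read Davis's concern is with a toy calculation. The simplifying assumption here is ours, not the paper's: suppose the normalizing field rate were simply the mean citation count of the papers an article cites. A single highly cited out-of-field reference would then inflate that denominator and drag the score down, even though the article's own citation count is unchanged.

```python
# Toy illustration of how out-of-field references could depress an
# RCR-like score, under the simplifying assumption (ours, not the
# paper's) that the field rate is the mean citation count of the
# papers an article cites.
from statistics import mean

def rcr_like_score(citations: int, cited_papers_citation_counts: list[int]) -> float:
    """Citations normalized by the average citation count of the cited papers."""
    return citations / mean(cited_papers_citation_counts)

biomedical_refs = [10, 12, 8, 10]          # citation counts typical of one field
with_outside_field = [10, 12, 8, 10, 60]   # plus one highly cited out-of-field paper

print(rcr_like_score(20, biomedical_refs))     # 2.0
print(rcr_like_score(20, with_outside_field))  # 1.0 -- the score halves
```

The actual RCR uses a more elaborate co-citation calculation, but the toy captures the direction of the effect Davis describes.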
In an interview, Santangelo said that the RCR does accurately account for interdisciplinary citations within biomedicine, adding that the metric was designed for biomedical papers because that is the NIH’s focus. He would be happy for other researchers to improve the algorithm so that they can use it in their own fields, he says. “We don’t suggest [the RCR] is the final answer to measuring. This is just a tool. No one metric is ever going to be used in isolation by the NIH.”
Bertuzzi said in an interview that the RCR is a good start. “Maybe it won’t work, but we’re starting a conversation,” he says. “We’re seeing different ways of looking at impact.”