A survey of British institutions reveals that few have taken concrete steps to stop the much-criticized misuse of research metrics in the evaluation of academics’ work. The results offer an early insight into global efforts to clamp down on such practices.
More than three-quarters of the 96 research organizations that responded to the survey said they did not have a research-metrics policy, according to data presented at a London meeting on metrics on 8 February. The same number — 75 — had not signed up to the Declaration on Research Assessment (DORA), an international accord that aims to eliminate the misuse of research metrics, which was developed in San Francisco in December 2012.
“It was disappointing to learn that so many institutions have no metrics policy at all, but I think the survey will come as a wake-up call,” says Stephen Curry, a structural biologist at Imperial College London who chairs the DORA steering group. On 7 February, Curry announced that the group had received funding and support that would allow it to start practical work beyond mere campaigning.
Leading the change
DORA calls for panels responsible for academic promotion and hiring to stop misusing metrics such as the journal impact factor — which measures the average number of citations accumulated by papers in a given journal over two years — as a way to assess individual researchers. It urges panels to assess the content of papers and quality of research instead.
But that call is not necessarily being heard, the survey showed. “The Dean of my school is metric and spreadsheet crazy,” read one anonymous response to the survey, which was conducted by the Forum for Responsible Research Metrics, a partnership supported by five UK research agencies that organized the London meeting. The survey found 52 institutions had implemented some measures to promote responsible-metrics principles, but only four had taken what the forum considers to be comprehensive action.
Worldwide, some 450 organizations — including universities, funders and journals — and 12,000 individuals have now signed DORA. On 7 February, seven UK research-funding agencies that together disburse about £3 billion (US$4.1 billion) a year announced that they, too, had signed up. But only 16 UK universities have signed the declaration, says Curry.
Institutions that have not signed may nevertheless be committed to change, says Elizabeth Gadd, a research-policy manager at Loughborough University, which the forum found has taken comprehensive action. She says that Loughborough considered signing DORA, but that staff felt that the declaration didn’t go far enough. Instead, the university built on a different set of guidelines for using metrics responsibly, called the Leiden Manifesto, which was published in 2015. DORA “is too focused on the misuse of journal metrics in general and the journal impact factor in particular”, says Gadd. “There are many other ways in which metrics can be used irresponsibly.”
Several policies attempt to recognize how publication and citation practices vary across disciplines. “Social scientists do different kinds of work even in the same field and use different kinds of citations,” says Penny Andrews, a research fellow in politics at the University of Leeds.
Loughborough’s policy, says Gadd, allows academic departments to choose the publication indicators they feel are most appropriate. And Glasgow University’s policy states that it will use metrics that are normalized by subject and career stage. The policy also commits the university to asking job candidates to describe their best papers and their contributions to them, so that interviewers don’t have to rely on proxy indicators.
Although the survey focused on the United Kingdom, researchers around the world are concerned about the abuse of metrics. Curry says he hopes that the survey will prompt renewed interest in DORA and that this will ripple out to other countries. “There are still far too many researchers around the world who have not heard of it,” he says — and some institutes in nations including China reward researchers with financial bonuses if they publish in high-impact-factor journals. “With our new resources we have already started to think about developing and supporting ‘nodes’ in South America, Africa, and Asia,” Curry says.
Many delegates at the meeting say they’d still like to see certain metrics used alongside academic judgment. “I can’t go with the fact that there are not responsible metrics,” says David Sweeney, executive-chair-designate of Research England, which will oversee the block funding of research in English universities from April. He told the meeting that he thinks each institution needs to find a system that works for it.