An American flag above the US Capitol building in Washington, D.C.

The US Government Accountability Office has recommended ways for three federal science funders to improve scientific reproducibility. Credit: Ting Shen/Bloomberg via Getty

The three largest basic-research funders in the United States are under the spotlight as the federal government aims to make the science they fund more reproducible. Two of the agencies — the US National Institutes of Health (NIH) and the US National Science Foundation (NSF) — are firming up plans to foster more rigour and transparency in their processes; NASA, meanwhile, has drawn criticism for being less proactive.

The move by the NIH and NSF follows a July 2022 report by the US Government Accountability Office (GAO) on the reliability of federally funded research, and how the NIH, NSF and NASA assess the work that they fund. It found that, although the scientific community has developed many practices for collecting, sharing and storing research data, the three agencies do not widely promote these approaches in their assessments.

According to the report, the agencies “do not collect indicators of rigorous study design and transparency of research results such as study sample size, adherence to research plans, or the extent to which research data are findable, accessible, and usable”. This means that they don’t have the information needed to improve how grants are awarded or to identify which funding areas should be prioritized.

The report makes six recommendations — two for each agency — for how the funders can bring more rigour and transparency to the work they fund. The proposals focus on collecting information on rigour — the “soundness and precision of study design, execution, data collection, and analysis” — and on transparency, ensuring that this information is documented and shared freely. The NIH and NSF accept the recommendations, but NASA disagrees with one of its two recommendations and only partially agrees with the other.

“For me, transparency and open research are a critical part of how we can drive quality,” says Marcus Munafò, a biological psychologist at the University of Bristol, UK. He leads the UK Reproducibility Network, a consortium that aims to investigate the factors that contribute to robust research. It also provides training and disseminates best practice in this area. “Funder mandates go some way to achieving this, but they need to be enforced to really drive change,” he says.

Funders react

In e-mail responses to questions from Nature Index, the NIH and NSF say they are drawing up plans for new processes in light of the GAO report. A NASA spokesperson instead referred Nature Index to the report, which reproduces a June 2022 letter that lists the agency’s responses to the recommendations.

The letter is from NASA deputy chief scientist David Draper to Candice Wright, director of the GAO’s Science, Technology Assessment, and Analytics team. In it, Draper argued that NASA does not need to collect information from researchers on scientific rigour. “NASA believes that the best way to ensure research reliability is the peer review process, which has long been the gold standard for scientific credibility,” Draper wrote. “Accordingly, NASA relies on the peer review process in the scientific community to assess research rigour, quality, transparency, and relevance of scientific proposals submitted to NASA, as well as the scientific journal publications arising from NASA-funded research.”

Wright, who authored the GAO report, says peer review of grant proposals is not sufficient to weed out irreproducible or irreplicable science. She says the GAO will continue to engage with NASA about the benefits of implementing the report’s recommendations.

Richard de Grijs, an astronomer at Macquarie University in Sydney, Australia, agrees. He says that, in his view, NASA hasn’t adequately responded to the fact that the peer-review process isn’t a cast-iron guarantee against subpar science being published. “It might be useful to implement an additional check,” he says.

As for the report’s second recommendation for NASA, which advises examining and, if necessary, revising policies on data transparency, Draper wrote that NASA only partially agrees. That’s because the agency is already revising its scientific integrity policy and adopting open-science practices more widely, he noted. “NASA’s Science Mission Directorate has initiated its Year of Open Science, and the goals of that effort include, but are not limited to, enhancing ready access to NASA-funded research publications by applying the FAIR rubric: Findability, Accessibility, Interoperability, and Reuse,” he wrote.

Agency actions

The NIH, meanwhile, says it is considering “several different activities” in response to the GAO report, according to a spokesperson for the NIH Office of Extramural Research. Some are already being rolled out: a policy that went into effect on 25 January, for example, requires researchers to plan how their data will be preserved and shared. And when grant applicants describe their strategies for rigorous study design, the NIH tells them they should strive to ensure that experimental design, methodology, analysis, interpretation and reporting of results are all robust and unbiased, the spokesperson says.

An NSF spokesperson notes that the funder is “committed to enhancing the reliability of research it funds”, and agrees with the GAO’s recommendations. The NSF is “currently formulating our plans for actions that respond to those recommendations”, they add.

The GAO plans to check in with the agencies at least once a year to assess progress, Wright says, and will give the funders up to four years to roll out any changes in full. The GAO will also keep an eye on related legislation being passed by the US Congress; for now, it will examine only whether the legislation is in line with the report’s recommendations.

Munafò thinks a proportion of every funder’s budget should be set aside for this ‘research on research’. “We need to better understand the factors that influence research quality, and in particular the impacts — both intended and unintended — of funder policies and mandates,” says Munafò, who thinks it is right that funders have been challenged to respond to these issues. “This is happening, to varying degrees, across countries and funders, but should be an integral part of how funding agencies work.”