Funding-agency policies mandating that scientific papers and data be made publicly available have helped to drive the adoption of preprints, open-access publishing and data repositories. But agencies often struggle to measure how closely grant recipients comply with those policies. Awardees, and the institutions that employ them, can struggle to ensure they are following the rules. Now, digital tools are cropping up to help both sides of the funding equation stick to the regulations.
The Bill & Melinda Gates Foundation in Seattle, Washington, has invested US$1.8 million to support the development of OA.Report, a tool that helps funders to track awardee compliance with foundation open-access policies. Developed by OA.Works in London, OA.Report uses text-mining techniques to match articles with the funder that supported the work, by sifting through academic papers and open-access metadata. The software also tracks article-processing charges, as well as the subsequent reports that summarize the outcomes of grants.
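The core of this kind of metadata matching can be sketched in a few lines. The snippet below is an illustration only, not OA.Report's actual implementation: the record fields and funder strings are assumptions, standing in for the funder assertions found in open-access metadata sources such as Crossref.

```python
# Illustrative sketch: match article metadata records to a funder by
# scanning their funder assertions. Field names ("doi", "funders")
# are assumptions, not OA.Report's real schema.

def match_articles_to_funder(articles, funder_name):
    """Return the article records whose funder list mentions funder_name."""
    matched = []
    for article in articles:
        funders = article.get("funders", [])
        if any(funder_name.lower() in f.lower() for f in funders):
            matched.append(article)
    return matched

# Hypothetical records, as might be assembled from open-access metadata.
articles = [
    {"doi": "10.1000/a1", "funders": ["Bill & Melinda Gates Foundation"]},
    {"doi": "10.1000/a2", "funders": ["Wellcome Trust"]},
    {"doi": "10.1000/a3", "funders": []},
]

hits = match_articles_to_funder(articles, "Gates Foundation")
```

In practice a tool like OA.Report must also handle papers whose funding is acknowledged only in free text, which is where the text-mining comes in; the substring match above captures only the structured-metadata case.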
According to an unpublished analysis using OA.Report, of the 3,959 papers published last year that stem from research funded by the Bill & Melinda Gates Foundation, 95% are available to read for free online, 62% have data availability statements and 84% comply fully with the funder’s policy. The foundation’s open-access policy, introduced for all grants awarded from 2015 onwards, requires research papers stemming from the foundation’s funding, and their underlying source data, to be freely available for others to access and reuse.
Joe McArthur, director and cofounder of OA.Works, acknowledges that the OA.Report tool probably misses content when collecting data on what studies stem from a funder’s grants. However, he notes that OA.Report found 40% more articles that used funding from the Bill & Melinda Gates Foundation than the organization found before it started using the tool.
The foundation is happy with the current rate of compliance, says Ashley Farley, the programme officer for knowledge and research services at the Bill & Melinda Gates Foundation.
“It’s important to be able to have tools to track compliance,” Farley adds, saying that grant funders’ systems don’t usually track compliance effectively. “We have a strong ethos that if [research is] open access and openly licensed to reuse, that’ll further [its] impact.”
Funders pay an annual fee of between $5,000 and $20,000, depending on the size of their organization, to have OA.Works generate a report on researchers’ compliance with the funders’ policies, McArthur says. A free version of the tool will be available this year, but how it will differ from the paid version is yet to be determined, he adds.
Other users of OA.Report include the Robert Wood Johnson Foundation in Princeton, New Jersey; Aligning Science Across Parkinson’s in Washington DC; and the Templeton World Charity Foundation in Nassau, the Bahamas.
One of the tools OA.Report uses is DataSeer. Tim Vines, DataSeer’s director and founder, says that the organization, based in Vancouver, Canada, charges a fee to help journals run checks on the availability of data and code underlying each article they publish. “We’re helping the journals with a key component of peer review, which is compliance,” he says.
Last September, DataSeer partnered with the open-access publisher Public Library of Science (PLOS) to develop a range of open-science indicators. These indicators use artificial intelligence to document how well PLOS authors adhere to the journals’ policies surrounding sharing of code, data and preprints. Released in December, the first batch of data covers some 61,000 articles published between January 2019 and June 2022.
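One of the simplest signals such an indicator can extract is whether an article contains a data-availability statement. The sketch below is a crude, hypothetical stand-in for that check; the trigger phrases are assumptions, and PLOS and DataSeer's actual indicators use trained models rather than keyword lists.

```python
# Illustrative sketch: flag whether an article's text appears to contain
# a data-availability statement. The phrase list is an assumption; real
# open-science indicators use machine-learning models, not keywords.

DATA_PHRASES = (
    "data availability",
    "data are available",
    "data is available",
    "available in the supplementary",
)

def has_data_statement(article_text):
    """Return True if any data-availability phrase occurs in the text."""
    text = article_text.lower()
    return any(phrase in text for phrase in DATA_PHRASES)
```

A keyword check like this over-counts (a phrase can appear without a usable statement) and under-counts (statements can be worded differently), which is one reason the PLOS indicators lean on AI rather than pattern matching.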
Taxpayers worldwide contribute hundreds of billions of dollars each year to support research that is meant to become publicly available, Vines says. “When that research does become public, it’s typically just the article and all of the other stuff that went into that article is never really available.” Reproducibility and trust in science are more likely to increase if the materials accompanying research papers are available to those attempting to repeat experiments.
In 2013, the US National Institutes of Health (NIH) introduced a policy that halts the future funding of researchers whose papers are not indexed on PubMed Central, a free digital repository that archives full-text articles that have been published in biomedical and life sciences journals.
To help researchers and institutions avoid such sanctions, a team at the University of Kentucky in Lexington have developed Academic Tracker. The free tool — described in a PLOS ONE paper1 in November 2022 — searches the PubMed, Google Scholar, Crossref and ORCID databases to ensure, for instance, that all papers supported by the NIH have been submitted to PubMed Central. Hunter Moseley, a bioinformatician at the University of Kentucky who co-created the tool, says the software can be tweaked to meet the needs of different funding agencies.
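The check at the heart of such a tool can be summarized simply: find funder-supported papers that lack a PubMed Central ID (PMCID). The sketch below is a minimal illustration under assumed record fields, not Academic Tracker's real schema or logic.

```python
# Illustrative sketch of a compliance check in the spirit of Academic
# Tracker: flag papers that credit a funder but have no PMCID, meaning
# they have not been indexed in PubMed Central. Record fields are
# assumptions, not the tool's actual data model.

def find_noncompliant(publications, funder="NIH"):
    """Return DOIs of papers citing the funder but missing a PMCID."""
    return [
        pub["doi"]
        for pub in publications
        if funder in pub.get("funders", []) and not pub.get("pmcid")
    ]
```

The real tool does the harder part upstream, reconciling an author's publication list across PubMed, Google Scholar, Crossref and ORCID before any record ever reaches a check like this one.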
“We believe Academic Tracker can significantly reduce the stress and hassle of reporting publications to federal funding agencies, reducing the chance for accidental non-compliance and resulting delay in funding,” the authors write.