
Open-access index delists thousands of journals

Many publications did not reapply after leading directory tightened its quality criteria.

A leading index of open-access journals is set to shrink by more than one-quarter after delisting around 3,300 titles as part of an effort to exclude questionable and inactive publishers.

The Directory of Open Access Journals (DOAJ), which at the beginning of the year listed more than 11,000 open-access academic journals, announced two years ago that it would tighten its standards for inclusion. It asked all journals in its index to provide more details about their operations so that it could ensure they adhere to basic publishing standards.

[Chart omitted. Data: Heather Morrison, Dramatic Growth of Open Access, V11/DOAJ]

The crackdown came after the index was criticized for including ‘predatory publishers’ — journals that profess to publish articles openly, often after charging fees, but that either are outright scams or fail to provide expected services, such as a minimal standard of peer review or archiving.

The DOAJ is now striking off journals that did not submit the necessary paperwork in time, says Lars Bjørnshauge, the directory’s managing director. He says that he is “absolutely sure” that the majority of the journals that did not reapply are not publications with poor ethics; rather, he thinks, they are small outfits that are unfamiliar with providing the information required for reapplication. Some 6,700 journals have reapplied, and Bjørnshauge thinks that most of them will pass.

Many of the journals that failed to reapply claimed to be based in the United States. But after investigating publishers’ claims about where they are based, the DOAJ suspects that some in fact operate from other countries, says Dominic Mitchell, community manager for the directory.

Improving trust

The tightened policy is a “huge improvement”, says Walt Crawford, a retired library-systems analyst in Livermore, California, who is currently analysing DOAJ-listed journals. “I believe DOAJ will be a reasonably trustworthy source of journals that at least intend to do honest, serious open access.” If anything, he says, he worries that legitimate journals may be omitted.

But Jeffrey Beall, a librarian at the University of Colorado, Denver, who assembled a blacklist of open-access journals that he terms "potential, possible or probable" predatory publishers, fears that the index still contains weak publications. A problem with ‘whitelists’ such as the DOAJ’s, he says, is that they rely on data supplied by publishers, which might exaggerate or misstate information to make their journals look more attractive.

Bjørnshauge agrees that there are publishers that cheat authors; the DOAJ has supported an educational campaign and plans to hire almost a dozen ‘ambassadors’ to identify questionable publishers and to promote good publishing practices across the developing world, he says. Assessing journals on a case-by-case basis should be straightforward, says Crawford. “My guess is that most scholars can figure out whether a journal is an appropriate outlet with 10–15 minutes’ work.”

The number of open-access journals is soaring: the DOAJ gets about 80 applications every week, Bjørnshauge says, and over the past two years it has rejected about 5,400 applications, most of them from new applicants. Gaps in information about licensing rights, publication permissions and editorial transparency are among the most common reasons for rejection, he says.

Whitelists such as the DOAJ’s and blacklists such as Beall’s are not the only tools for making sense of open-access journals. Some crowdsourced efforts — such as Journalysis.org — ask academics to review journals. The most recent of these, a website called QOAM (Quality Open Access Market), asks volunteers with academic credentials to evaluate a journal’s policies and to describe their experience if they have published with the journal, and then sorts journal titles into one of four categories, including “threat to authors”.

“We go beyond what the DOAJ does to allow the stakeholders to rank the journals,” says Jelte Wicherts, who studies publication bias and data sharing at Tilburg University in the Netherlands and who helped to develop the QOAM scoring system. He thinks that there are reliable proxies — notably the transparency of a peer-review system — that can be used to gauge the quality of a journal (ref. 1).

Such crowdsourced systems can fall prey to extreme minority views, and to being gamed by pro-journal reviewers, unless many disinterested authors can be persuaded to take part, points out Bjørnshauge. Beall, too, is sceptical that a crowdsourced effort can take off. But it should be given a chance to succeed, he says. “It's great that there’s experimentation going on.”

Journal name: Nature
DOI: 10.1038/nature.2016.19871

References

  1. Wicherts, J. M. PLoS ONE 11, e0147913 (2016).
