Hundreds of scientists who post their peer-review activity on the website Publons say they’ve reviewed papers for journals termed ‘predatory’ — although they might not know it. An analysis of the site has found that it hosts at least 6,000 records of reviews for more than 1,000 predatory journals. The researchers who review most for these titles tend to be young, inexperienced and affiliated with institutions in low-income nations in Africa and the Middle East, according to the study, posted to the bioRxiv preprint server on 11 March.

The study is the largest yet to examine claims that scientists review for predatory journals. A popular conception of these journals is that they will publish any manuscript for a fee and don't conduct peer review at all. In fact, journals can be defined as predatory while still providing peer review, because they might be deceptive in other ways. But the peer review that these journals conduct might not be to the standard most researchers recognize, says Matt Hodgkinson, head of research integrity at the publisher Hindawi in London. “They are likely going through the motions and using these reviewers as a fig leaf,” he says.

The reviews, if genuine, might be “a waste of valuable time and effort” by researchers, the study says. Its authors suggest that funders and research institutions should warn against reviewing for predatory titles.

Predatory or under-resourced?

Researchers use the Publons platform to list manuscript reviews they have conducted. But the organization, based in London, noticed several years ago that some users were claiming to review for potentially predatory journals. Staff there joined with researchers at the Swiss National Science Foundation (SNSF) in Bern and the University of Bern to conduct a large-scale analysis. Starting with a proprietary blacklist of titles deemed predatory by Cabells, a publishing analytics company in Beaumont, Texas, the team built algorithms to spot reviews for these titles on Publons. They chose not to identify any individual journal or reviewer in their analysis.

At least 10% of the journals on Cabells’ blacklist have reviews claimed on Publons, the study found. That doesn’t concern Cabells, says its technical director Lucas Toutloff, because the firm flags journals as predatory for many kinds of deceptive practices, such as misleading readers about editorial-board qualifications or physical office addresses, or having no policies to digitally preserve papers. “It might be that journals flagged as predatory are doing good-quality peer review but are flagged as predatory for other reasons,” says Anna Severin, a researcher at the SNSF and the University of Bern, who co-authored the study. It could also just be difficult to distinguish between predatory titles and journals that are legitimate but under-resourced.

Publons says it thinks the reviews are real records: scientists verify that they have done the reviews they list by forwarding e-mails of acknowledgment to Publons, or by having a journal editor independently verify the review.

Mixed motives

Although some scientists might not realize that the journals they are reviewing for are predatory, it's possible that others see the chance to expand the list of journals they review for as a way to showcase their academic productivity — no matter the title. And Publons itself could be “unwittingly gamifying peer review”, Hodgkinson suggests. The site has leaderboards for top reviewers, which might further reward indiscriminate reviewing. A spokesperson for Publons notes that the site provides guidance on spotting robust and ethical journals, and that in July 2019 it changed its leaderboards so that, by default, they rank reviewers by their reviews for journals indexed in the Web of Science database.

Nature spoke to some scientists who list reviews on Publons for journals that appear on Cabells' predatory list. Ian Burgess, an entomologist who directs Insect Research and Development, a contract research company in Cambridge, UK, said he had reviewed eight manuscripts for four predatory journals from a publisher called Academic Journals, based in Nigeria. Burgess said he had hoped to provide guidance for authors, but that the publisher completely ignored his comments. “Clearly I was naively deluded in thinking if you had proper reviews the quality of publications would rise,” says Burgess, who says he won't do any more reviews for those journals, although he is still being asked to. (The publisher didn't respond to Nature's request for comment.)

And a researcher in Germany who didn't want to be identified told Nature that he had reviewed for a predatory journal, and that his suggestions were likewise ignored. He pointed out that his review suggestions had sometimes been disregarded at established journals as well.

The study authors are now planning to examine the text of reviews in both predatory and legitimate journals — in some cases, Publons can access the text of private review reports — to see whether there are any measurable differences in quality, Severin says.