A torrent of low-quality meta-analyses and systematic reviews in biomedicine might be hiding valuable research and misleading scientists.
A gold standard of scientific analysis is fast becoming tarnished, according to a report by a leading meta-researcher.
Systematic reviews and meta-analyses distil scientific articles on similar questions into what is meant to be an authoritative take on a particular topic — often how well a particular treatment works across medical settings — and they are key tools in evidence-based medicine.
But valuable reports are getting diluted by “a massive production of unnecessary, misleading and conflicted systematic reviews and meta-analyses”, according to John Ioannidis at Stanford University in California, who has published a report in The Milbank Quarterly [1] looking at trends in the publication of these articles.
He decided to try to quantify the problem after noticing “an epidemic” of poor and obviously flawed articles, he says. “When the most influential literature is wrong, the harm that is done is worse than when unimportant studies are wrong.”
Ioannidis counted the number of articles that had been tagged as ‘systematic reviews’ and ‘meta-analyses’ in PubMed, a database of biomedical and life sciences publications. From 1991 to 2014, the numbers of these articles published annually increased by more than 2,600%, to 28,959 for systematic reviews and 9,135 for meta-analyses. Over the same time period, the total number of articles appearing in PubMed each year increased by 153%.
The most prolific source of meta-analyses — 63% of the total in 2014 — was genetic association studies from authors in China, many of which did not account for the high likelihood of finding false positives, Ioannidis concluded.
One reason that systematic reviews are increasing is that more people around the world are doing research and are eager to get publications, says Christopher Schmid, a biostatistician at Brown University School of Public Health in Providence, Rhode Island. Schmid, who handles meta-analyses at the American Journal of Kidney Diseases, says that around ten years ago he began noticing that many more submissions were coming from Asia. “At first they were not good quality, but they have really improved a lot.” Still, he says, access to data is likely to expand faster than education about what it takes to do a good systematic review.
Ioannidis also thinks that much of the overall increase stems from articles intended mainly to increase citations and publications — or to serve as marketing tools for industry groups. One topic — the use of drugs called statins to prevent a common heart arrhythmia after heart surgery — had been covered by 21 meta-analyses in seven years; another 185 had been written about antidepressants in a similar period, and about one-third of these had co-authors employed by a drug manufacturer.
The prestige of such publications in biomedicine contributes to the problem, says Ioannidis. They generally run in respectable journals, he says, and are cited more often than any other study design.
Another study published this year [2] examined characteristics of a random sample of 300 systematic reviews and found that problems were common; few sought out unpublished data or incorporated sources of bias into the analysis. Many failed to report how they identified or selected the articles to include.
Hopefully, such analyses will bring attention to a serious problem, says Kay Dickersin, director of the Center for Clinical Trials and Evidence Synthesis at Johns Hopkins University in Baltimore, Maryland. There are good reasons for these types of articles to increase, she says. People appreciate that such studies are important in making community health decisions, and more instruction is available to teach scientists to do them well.
But demand is still greater than the amount of expertise out there, says Dickersin, who is part of an effort to install trained experts as journal editors. “We just have to acknowledge that specialty journals can’t find enough methodologists to vet reviews,” she says. “It is terrible and scary, since systematic reviews and meta-analyses are considered the highest level of evidence.”
1. Ioannidis, J. P. A. Milbank Q. 94, 485–514 (2016).
2. Page, M. J. et al. PLoS Med. 13, e1002028 (2016).
Cite this article
Baker, M. Mass production of review articles is cause for concern. Nature (2016). https://doi.org/10.1038/nature.2016.20617