Two studies have revealed widespread flaws in the reporting of animal experiments, the latest in a series of papers to criticize shoddy biomedical research.

Whereas reports of clinical trials in major medical journals routinely state how many patients die or drop out of analysis during the course of a study, animal studies generally fail to report this figure — or drop animals without saying why, according to a team led by Ulrich Dirnagl at the Charité Medical University in Berlin. That lapse could significantly bias results, the team reports in the journal PLoS Biology [1].

In a second study in the same journal [2], a team led by John Ioannidis, an epidemiologist at Stanford University in California who has repeatedly called for more reproducible and transparent research, criticizes the lack of data availability and detailed protocols in biomedical papers.

Missing mice

Dirnagl’s team reviewed 100 reports, published between 2000 and 2013, describing 522 experiments that used rodents to test cancer and stroke treatments, and compared the numbers of animals given in each paper’s methods and results sections. About two-thirds of the experiments did not state whether any animals had been dropped from the final analysis. Of those that did address attrition, around 30% (53 experiments) reported dropping rodents from the analysis, but only 14 of those explained why.

The researchers used computer simulations to show that attrition at the levels they observed could seriously skew study results. If biomedical scientists dropped animals in a biased way (excluding outliers that gave extreme data values, for instance), studies would be fourfold more likely to report a statistically significant result that was in fact due to chance, and could overstate the true effectiveness of treatments by as much as 175%, the team says.
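
The mechanism is easy to reproduce in a toy simulation. The Python sketch below is an illustration of the general idea only, not the team’s actual simulation code: it draws control and treated groups from the same distribution, so no real effect exists and every “significant” result is a false positive, then compares random attrition against attrition that quietly removes the animals least favourable to the hypothesis. The group size, dropout count, dropping rule and significance test are all assumptions chosen for illustration.

```python
# Minimal Monte Carlo sketch of attrition bias (illustration only; not the
# simulation code used by Dirnagl's team). Both groups come from the SAME
# distribution, so any p < 0.05 is a false positive by construction.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def false_positive_rate(n_animals=10, n_dropped=2, n_sims=20_000, biased=True):
    hits = 0
    for _ in range(n_sims):
        control = rng.normal(0.0, 1.0, n_animals)
        treated = rng.normal(0.0, 1.0, n_animals)  # no true treatment effect
        if biased:
            # Biased attrition: quietly drop the animals least favourable to
            # the hypothesis -- the strongest responders in the control group
            # and the weakest responders in the treated group.
            control = np.sort(control)[:n_animals - n_dropped]
            treated = np.sort(treated)[n_dropped:]
        else:
            # Unbiased attrition: drop the same number of animals at random.
            control = rng.choice(control, n_animals - n_dropped, replace=False)
            treated = rng.choice(treated, n_animals - n_dropped, replace=False)
        _, p = stats.ttest_ind(treated, control)
        if p < 0.05:
            hits += 1
    return hits / n_sims

print(f"random attrition: {false_positive_rate(biased=False):.3f}")  # ~0.05
print(f"biased attrition: {false_positive_rate(biased=True):.3f}")   # well above 0.05
```

Under these made-up settings, the biased rule pushes the false-positive rate far above the nominal 5%; the exact inflation depends on how many animals are dropped and by what rule.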

Ioannidis’s team, meanwhile, examined a random sample of 268 biomedical papers indexed in PubMed and published between 2000 and 2014. None of the papers made its full data available, and all but one lacked the details needed for other researchers to replicate the work. In 2000, more than 90% of the papers examined lacked conflict-of-interest statements; by 2014, the figure had fallen to about one-third.

Animal studies under fire

“I have to say I am distressed but not surprised,” says Malcolm Macleod, a stroke researcher and trial-design expert at the University of Edinburgh, UK. “These important findings are further evidence of the challenges we face in improving the quality of biomedical research.”

Both papers reinforce earlier studies that have criticized poorly designed and reported animal experiments. A study led by Macleod last year [3], for example, looked at over 2,500 preclinical research papers and found that many of the studies described were poorly designed; the majority did not report using recommended methods to avoid bias.

Many research journals have endorsed voluntary reporting guidelines for animal studies, such as the ARRIVE guidelines, but adoption is spotty.

The PLoS Biology papers come as part of a special section on 'meta-research': research about how research is done. The number of meta-research studies is increasing, says Marcus Munafò, a biological psychologist at the University of Bristol, UK, who writes about ways to improve scientific rigour. “These studies illustrate what the scope of the problem is,” he says, adding that they should be welcomed as an opportunity to boost scientific quality.