Science depends on widely shared behavioural norms of honesty and accountability. The scientific literature is a storehouse of insights concerning how nature works, necessarily incomplete, of course, and subject to doubts and criticisms. It contains errors, misinterpretations and contradictions, and yet, overwhelmingly, reflects an honest collective effort to move humanity closer to something we hope may resemble the truth, insofar as that is possible.

“Science is a way,” as the American physicist Richard Feynman once put it, “of trying not to fool yourself,” or anyone else.

Not everyone shares this ideal. In recent decades, tobacco companies and the pharmaceutical and fossil fuel industries have often seen information and understanding issuing from science as a threat to their commercial interests. Their principal strategy to undermine science has been disinformation, with well-funded campaigns to spread uncertainty — over the true health risks from smoking, for example, or the evidence for global warming. They’ve infiltrated the scientific process by hiring researchers to publish papers that seem scientific on the surface, yet actually push industry propaganda.

What can scientists do about it? In a recent commentary in Nature Climate Change, Justin Farrell, Kathryn McConnell and Robert Brulle consider various possible countermeasures (Nat. Clim. Change 9, 191–195; 2019). Some research suggests an approach inspired by vaccination: exposing people to mild doses of disinformation on important issues can help to ‘inoculate’ the public, preparing their minds to resist unfounded arguments. More directly, recent legal cases against fossil fuel companies and chemical manufacturers have hit those companies financially, winning over juries by using the companies’ own internal communications to expose their duplicity.

Salvaging at least a tolerable purity for science will be an uphill battle. But companies ultimately need passable reputations, and remain vulnerable to the consequences of moral disgrace. In the past, they largely managed to keep their efforts out of view, but that may be changing.

I happened upon the consequences of scientific disinformation recently when researching an article on the biological effects of glyphosate, a chemical that is the active ingredient in a range of widely used agricultural herbicides. In 2015, the International Agency for Research on Cancer (IARC), a body of the World Health Organization, reviewed publicly available scientific studies on glyphosate and concluded that the compound is “probably” carcinogenic. This finding triggered huge controversy, and has ultimately led to several recent successful lawsuits against the agrochemical company Monsanto, the original patent holder of glyphosate. Monsanto is now owned by Bayer AG, a German pharmaceutical company.

When examining studies around this controversy, I came upon five papers published just after the IARC study, all highly critical of the methods the IARC used. Appearing as a supplement in the journal Critical Reviews in Toxicology, the papers argued that the IARC panel had been inappropriately selective in the studies it considered. As these papers noted, both the US Environmental Protection Agency and the European Food Safety Authority concluded that glyphosate does not cause cancer. Those agencies considered many studies carried out by industry and not publicly available, studies the IARC did not take into account.

These papers all seemed more polemical than a typical scientific paper. Even so, on my first reading, I took them at face value. Maybe, I wondered, the IARC’s methods really were problematic.

Only later did I discover that these five papers weren’t actually written by independent scientists interested in clarifying the debate, but by researchers who had been paid by Monsanto to push the view that glyphosate is safe. At least two of the ‘consultants’ listed as authors on these papers were paid directly by Monsanto for their work. E-mails released during litigation also revealed that at least one Monsanto employee had edited the summary manuscript, pushing hard to make the tone more critical of the IARC.

Biologist Pete Myers has been one of the leaders in exploring the biological effects of glyphosate and other commonly used compounds. These effects include disruption of hormone signalling pathways even at very low doses, with serious consequences for development and reproduction, sometimes appearing decades after the first exposure, or only in subsequent generations of offspring. As Myers told me by e-mail, “Monsanto knew the IARC decision was coming and organized a massive pre-emptive counter attack.”

Unfortunately, the five papers in question still lurk in the literature, and it’s easy to read them without seeing the notes the publisher later added to disclose the authors’ obvious conflicts of interest. The journal advised readers “to take this context into account” when reading the articles, but those who don’t see the notes won’t. As a result, Monsanto still benefits from its past efforts to poison the well of scientific information.

In their commentary, Farrell and colleagues suggest that broad public exposure of such efforts may be one of the most effective countermeasures. Disinformation works best when it seems to be real information coming from a variety of independent sources. Politically motivated corporations and think tanks have managed to create this impression by widely broadcasting coherent talking points, often through hired individuals with scientific credentials. Unfortunately, they’ve been winning the battle to keep this kind of coordination hidden.

In Monsanto’s case, the recent lawsuits have helped to change that. A flurry of similar lawsuits brought by cities and counties in the United States and the UK now targets fossil fuel companies, using the companies’ own internal documents to show how they deliberately downplayed climate problems they understood perfectly well. Lawsuits like this illustrate the long-term risks of disinformation strategies. Companies need investors, and scientists pushing back against industry propaganda can help those investors learn an important lesson: disinformation may bring in profits in the short run, but it can lead to catastrophic losses later on.