CAREER FEATURE

Coronavirus misinformation, and how scientists can help to fight it

Bogus remedies, myths and fake news about COVID-19 can cost lives. Here’s how some scientists are fighting back.
Nic Fleming is a science writer based in Bristol, UK.

A poster warning against the spread of 'fake news' online

As the coronavirus has spread around the world, so has misinformation. Credit: Nguyen Huy Kham/Reuters

Eating sea lettuce or injecting disinfectant won’t prevent you from getting COVID-19. Holding your breath for ten seconds is not a test for SARS-CoV-2. And Russian President Vladimir Putin did not release 500 lions in Moscow to persuade the city’s residents to stay indoors as part of efforts to fight the pandemic. The rapid global spread of COVID-19 has been accompanied by what the World Health Organization has described as a “massive infodemic”. Huge demand for information on the disease, its toll on health-care systems and lives, and the many unanswered questions about a virus that was discovered only in December have created the perfect breeding ground for myths, fake news and conspiracy theories (see ‘Eight ways to spot misinformation’). Some can be dismissed as ludicrous and largely harmless, but others are life-threatening.

Scientists are well placed to help to hold back the tide of COVID-19 misinformation — but should they get involved in time-consuming, and sometimes bruising, efforts to do so, or just stick to doing good research? For those who sign up for the fight, how can coronavirus untruths best be confronted? Should scientists restrict interventions to their areas of expertise? Is countering falsehoods about the pandemic purely a public service, or might there be career benefits to doing so?

“I think scientists need to get out there on the front line, if they are comfortable doing so,” says Jevin West, a data scientist at the University of Washington in Seattle. “By countering misinformation about COVID-19, they can help policymakers avoid introducing harmful policies, improve public understanding of the pandemic and, most importantly, save lives.”

One of the many changes brought about by COVID-19 is a widespread increase in news consumption. A survey of 13 countries, carried out in March by the market-research company GlobalWebIndex, found that 67% of those surveyed are watching more news coverage as a result of the pandemic, and that half of that subset are spending significantly more time doing so. “We’re refreshing our web browsers constantly, looking for good news or inside information about COVID-19 because it affects our health, and that of our friends and families,” says West. “That makes us more vulnerable to being fooled.”

West co-created Calling Bullshit, a course on how to spot and counter false appeals to scientific and statistical evidence, and, in December, co-founded and became director of his university’s new Center for an Informed Public, whose core aims include researching rumours and misinformation during crises. It’s been a busy few months for West and his colleagues. “It’s been all-consuming, a bit like trying to build a boat while you’re floating along in the sea,” he says.

The misinformation world

False medical claims are a key focus for those seeking to minimize potential harms. Researchers at the Taiwan FactCheck Center, for example, have spent a large proportion of their time since late January debunking reports about fake remedies and tests. Examples include claims that smelling sesame and other plant oils, breathing in steam or cleaning the nostrils with salty water can kill SARS-CoV-2 before it reaches the lungs. “The popularity of false information about remedies and ways to kill the virus is, in my opinion, a response to the knowledge gaps around COVID-19,” says Summer Chen, chief editor at the centre, which was set up in 2018 by two non-profit organizations in Taiwan: Taiwan Media Watch and the Association for Quality Journalism.

A man wears a 'COVID-19 is a hoax' mask

What role should scientists play in combatting these baseless claims? Credit: Jeff Gritchen/MediaNews Group/Orange County Register/Getty

Some who share myths are simply misguided, but others are driven by profit. In March, the US Food and Drug Administration warned companies and individuals, including Alex Jones, owner of the fake-news website Infowars, and televangelist Jim Bakker, to stop touting the benefits of unproven COVID-19 treatments such as colloidal silver. Both have promoted and sold products containing tiny particles of silver suspended in liquid for use against COVID-19, despite there being no sound evidence that they work. Another way to profit from fake news is through advertising revenue. “About half of the disinformation we see is about people trying to produce viral content to get clicks to direct others to a website full of Google ads,” says Giovanni Zagni, director of Facta, a new Italian fact-checking website. Zagni says that the site has focused about 90% of its content on COVID-19 since its launch on 2 April.

Many COVID-19 myths seem to be politically motivated, such as the reports that SARS-CoV-2 either escaped from the Wuhan Institute of Virology in China or was a bioweapon created deliberately in the country. A survey of US residents conducted in mid-March found that 6% thought the virus was accidentally created in a laboratory, and 23% that it was developed intentionally. Chen and her colleagues have attempted to debunk such reports, as well as those suggesting the virus was brought to China by people from the United States.

Scientists might have more impact when confronting myths that are less political. “If it’s something crazy, like the virus is a bioweapon created by Barack Obama, I think scientists are better off leaving that to others and spending their time in the world of science,” says West. One way for scientists to help is to make their expertise available to journalists and fact-checkers who are debunking misinformation.

But should scientists attempt to counter misinformation across fields, or stick to their own? The debate over whether researchers should ‘stay in their lane’ has, at times, become heated during the COVID-19 pandemic.

In March, the UK-based Science Media Centre, which provides journalists with comments and briefings from scientists, asked its network of experts to stick to their disciplines when responding to media queries about COVID-19. “Especially during a pandemic, when there’s a sea of misinformation, uncertainty and rumours circulating, the public needs to hear from scientists with deep expertise who really know what they’re talking about,” says chief executive Fiona Fox. Others, such as West, disagree. “We should encourage, not discourage, scientists to ‘step outside their lane’, especially during a worldwide crisis,” he says. “As long as they are transparent about their expertise, there is much to gain from more scientists thinking about the problem with their different methodological perspectives and experience.”

Friendly fire

The tone of interventions can determine how they are received. In March, British singer and television personality Kerry Katona shared an Instagram post claiming that children with COVID-19 would be separated from their parents and taken to hospital alone. British doctor and television presenter Ranjit Singh responded: “Not true! Facts are facts! I’ve seen lots of confusion & misinformation about kids & #coronavirus recently,” and posted a summary of the correct information from the UK Royal College of Paediatrics & Child Health. Katona thanked him and said she felt reassured. Zagni says that avoiding a confrontational or patronizing tone is key when seeking to change minds. “Try to have an honest discussion with people about why they chose to share material, but don’t talk down to people or treat them as stupid. Most people are engaged in the same search for truth.”

Subtle reminders about accuracy that avoid direct confrontation might prove effective. In a study currently awaiting peer review, psychologist Gordon Pennycook at the University of Regina in Canada showed two groups of people from the United States a series of news headlines about COVID-19. Half of the headlines were true and half were false; the participants were not told which was which. On average, the first group considered 47% of the accurate headlines and 43% of the inaccurate ones worth sharing. The second group was asked to rate the accuracy of a single headline unrelated to COVID-19 before performing the same task. This simple prompt seemed to make them more discerning: they went on to say they would consider sharing 50% of the true reports but only 40% of the untrue ones.

A protester holds a banner claiming that the COVID-19 vaccine will kill millions

Some misinformation is relatively harmless, but other ‘facts’ can be deadly. Credit: Scott Barbour/EPA-EFE/Shutterstock

Many of those who have been inspired to use their training and experience as scientists to protect people from false information about COVID-19 simply want to contribute to reducing the loss of life and health. There could, however, be other benefits to getting involved in the defence of scientific truth. “Sharing your work and expertise, and engaging with the public, is an important part of being a scientist now,” says Samantha Vanderslott, a health sociologist at the University of Oxford, UK. “Calling out fake stories can raise your profile.”

Zagni cites the example of Roberto Burioni, a doctor and microbiologist at San Raffaele University in Milan, Italy, who made a name for himself by standing up to anti-vaccination campaigners. “If, as a scientist interested in communication, you establish yourself as a well-respected voice that fights against bad information, you can build a career based on that,” says Zagni.

Although becoming known as a defender of science might have career benefits, there could be downsides. “It can help professionally, especially for early-career researchers, but it can also take up days and weeks, and it can be hard to return to your research once you get sucked in,” says West. Overall, he argues that researchers shouldn’t allow professional considerations to get in the way when deciding whether to help in the battle against COVID-19 misinformation. “Ultimately, it really shouldn’t matter because lives, and trust in science, are at stake and we need to do something about it.”

Eight ways to spot misinformation

Source suspicion. Vague, untraceable sources, such as ‘a doctor friend of a friend’ or ‘scientists say’ without further details, should ring alarm bells.

Bad language. Most trustworthy sources are regular communicators, so poor spelling, grammar or punctuation are grounds for suspicion.

Emotional contagion. If something makes you angry or overjoyed, be on your guard. Miscreants know that messages that trigger strong emotions get shared the most.

News gold or fool’s gold? Genuine scoops are rare. If information is reported by only one source, beware — especially if it suggests that something is being hidden from you.

False accounting. Use of fake social-media accounts, such as @BBCNewsTonight, is a classic trick. Look out for misleading images and bogus web addresses, too.

Oversharing. If someone urges you to share their sensational news, they might just want a share of the resulting advertising revenue.

Follow the money. Think about who stands to gain from you believing extraordinary claims.

Fact-check check. Go past the headlines and read a story to the end. If it sounds dubious, search fact-checking websites to see whether it has already been debunked.

Nature 583, 155-156 (2020)

doi: 10.1038/d41586-020-01834-3

Updates & Corrections

  • Clarification 24 June 2020: In ‘Eight ways to spot misinformation’, this article gave an arbitrary example of ‘Dutch scientists’ to imply an unnamed source, but as this example could be misconstrued as casting aspersions the article has been updated to simply say ‘Unnamed scientists’.
