caption: Bill Gates, co-chair of the Bill & Melinda Gates Foundation, shows a vaccine during a press conference in 2011.
As the Covid-19 pandemic spread in 2020, conspiracy theories about vaccines backed by the Gates Foundation began to spread online.
Credit: Flickr Photo/UN Geneva (CC BY NC ND 2.0)/https://flic.kr/p/9Jn7Rj

True or false: Can you spot the infodemic in our midst?

Take a look at these three stories. True or false?

  • Scientists have not only tracked the pandemic strain of coronavirus to Wuhan, China, but specifically a lab in that city. Further evidence suggests that SARS-CoV-2, the novel coronavirus that causes Covid-19, is a product of human intervention. Authorities are looking into whether the virus broke out of the lab in Wuhan.
  • Bill Gates is using his coronavirus vaccine to place microchips into people. And that effort is really part of a larger scheme -- how the Devil aims to mark the population with the number of the beast, as explained in the Book of Revelation.
  • After residents in Russia did not take social distancing orders seriously enough, the Russian government released lions onto the streets of more heavily populated cities, which prompted people to stay inside.

Answers, in order: false, false, and false.

That is, according to the fact checkers at Snopes, who have found themselves very busy since the coronavirus pandemic spread across the globe.

"When there is something that is new, there is a lack of understanding about it, and that creates a massive vacuum for all sorts of misinformation online," said Alex Kasprak, a senior writer with the fact-checking website Snopes.

Perhaps the above examples of false information seem extreme or easy to spot. But they are actual stories that have successfully advanced across the internet. Countless others are more difficult to spot -- disguised and packaged in ways to appear genuine.

Snopes is not alone. The Associated Press' weekly "Not Real News" roundup has largely been filled with pandemic news that never really happened, but was spread nonetheless.

It has all added up to "an infodemic" alongside the pandemic -- a disease of misinformation. It infects friends, family, and that one guy your old college roommate knows, who seems well-informed (at least online) -- they all spread it through your social media feed or in your email inbox. Perhaps you have spread it, too.

The goals of such misinformation vary from peddling products, to pranks, to widening division among communities.

To discern what is true and what is not, a person would need to put in the time to research and to read. But really, who has time for all that on Twitter? Send in the memes!

Busy time for fact checkers


Snopes was among the first fact-checking websites to emerge in the 1990s as the internet became woven into daily life -- and with it, a lot of misinformation and rumors. As the coronavirus pandemic evolved over the past few months, its writers (such as Kasprak, who specializes in science news) have noticed a dramatic uptick in bad info.

“It’s been a really mind-bending increase in both claims that people want to debunk and viewership on our website as well," Kasprak said.

To put that in perspective:

  • Snopes received about 8,500 requests for fact checks in March 2019.
  • Amid the pandemic in March 2020, Snopes received 23,595 requests, nearly three times as many.

Kasprak says his fact-checking work amid the pandemic has spiked. He has spent hours and hours chronicling how people spread misinformation online -- even when it comes from a Word document that someone uploaded to social media and called a “study” (something he has actually observed).

"Falsehood flies and truth comes limping after it"

Kasprak is not the only person noticing the pandemic has offered a fertile environment for misinformation, conspiracy theories, and even more bizarre claims to thrive. Carl Bergstrom is a University of Washington professor who has become known for his course "Calling BS." The class addresses topics of misinformation, skepticism, and critical thinking.

“You get this misleading information that goes out there, it goes viral, people try to clean it up, people try to rebut it, but the rebuttals don't ever get quite the attention that the original one did," Professor Bergstrom told KUOW's Bill Radke on The Record.

It's not a new phenomenon. The infodemic did not start with the coronavirus pandemic. But despite misinformation going viral in the past, many have not built up an immunity to it.


Bergstrom likes to point to a more-than-250-year-old Jonathan Swift quote that is still relevant: "Falsehood flies, and truth comes limping after it, so that when men come to be undeceived, it is too late; the jest is over, and the tale hath had its effect."

Bergstrom, too, has seen "news" stories stating that the virus was released from a lab in Wuhan.

"There is a parallel rumor going around China that it is an American bioweapon," he notes.

How does such information arise?

Bergstrom said a lesson can be learned from Seattle. Rumors in March claimed that Seattle hospitals were overrun with Covid-19 cases, that ventilators were in short supply, and that doctors were deciding which patients would get them.

”None of that was happening here in Seattle," Bergstrom said.


The exaggerated stories drew on various pieces of accurate and inaccurate information. Seattle and Washington state officials were planning for an eventual surge in Covid-19 patients. There were concerns about a shortage of medical supplies and ventilators should conditions worsen. At the same time, Italian hospitals were being overrun with cases. All that info somehow snowballed into "Seattle is overrun."

"You get this aggregation of different levels of truth and untruth," Bergstrom said. "Ultimately, it provides a misleading message about what is happening.”

The Seattle example that Bergstrom cites is the kind of rumor that hit the internet early on. But he says things have escalated since then.

"What we see is the same kinds of misinformation that we’ve seen for years, at least since the 2016 election where you have people trying to inflame existing divisions within the United States," he said. "Some of this activity has been associated in the past with Russian propaganda."

In such a case, "rather than trying to make you believe any particular thing about the virus, it’s just trying to exacerbate the divides that are already existing in opinions."

Waves of misinformation

Kasprak has also observed trends while chasing leads at Snopes -- just when you think one bad piece of info is dead, it sails away on a wave of misinformation.

It began as the virus came to the United States around February.

“You know, people copying and pasting and sharing something they saw on somebody else’s feed, attributed to some random person -- ‘my friend’s mom is a nurse at Hopkins, and my uncle has a master's degree and worked in Wuhan’ or whatever," he said. "And they’ll just list all these scientifically flawed facts, or tips on how to diagnose yourself, or what supposed cures could potentially help. That was sort of the earliest wave of it.”

That fed into the next wave of "quack" science.

“People pushing information suggesting this drug or that drug will work even though there is no, or limited, evidence,” Kasprak said. “But also your regular run-of-the mill ‘buy my immune-boosting supplement’ stuff.”

For fact-checker Kasprak, the most frustrating wave was next.

“More of a completely conspiratorial, anti-government, big brother surveillance, Covid denialism -- that sort of thing,” he said. “That’s popping up with a vengeance now.”

Take, for example, all the talk about antimalarial drugs and the many claims that such drugs can fend off the virus. Kasprak was tasked with investigating one particular claim that involved drinking tonic water to prevent infection.

“Let me tell you how many steps you have to go through to make that work,” Kasprak said. “He was basing claims of efficacy on hydroxychloroquine (an antimalarial drug) … (this is an assumption) related to quinine which has historically been in tonic water, and still is in tonic water but in concentrations of no more than 83 parts per million."

"I did the math -- you would have to drink 12 liters of tonic water every eight hours for seven days for it to be at therapeutic levels," he said. "But there is also no science to support quinine in general. So it’s like three levels of nonsense.”

Immunity to misinformation

Just to be clear: There is currently little, if any, evidence to support the notion that an antimalarial drug will help fend off a coronavirus. Malaria comes from a parasite, and coronavirus is ... a virus.

In fact, there may be evidence to suggest hydroxychloroquine doesn't help at all, and could have harmful side effects. Either way, researchers with the University of Washington are currently studying potential uses of a malaria drug to treat Covid-19. Their findings will likely take months to emerge.


But internet surfers and social media users may not take the time to look up such information, such as the finer details of antimalarial drugs and the UW studies. Quick bits of info more often garner attention, and they can be spread at the speed of a click or a like.

Bergstrom and Kasprak have advice for those who don't want to be duped.

"Recognize that there are a ton of rumors swirling around out there," Bergstrom said.

"It’s much much better to have a reliable story from a paper you trust, like the New York Times or the Wall Street Journal or whatever ... and it could be a day old, which is going to give you much better information than something that just started spreading across Twitter 15 minutes ago. Because it’s been properly vetted; they’ve talked with experts."

Kasprak echoed that sentiment.

“Don’t blindly share something that you haven't read," Kasprak said. "Don’t share something that promises something that is too good to be true. Don't copy and paste health information that came from a friend of a friend of a friend."

"I think we need to hone in a little bit more on some critical reasoning skills. I think a lot of people are bored, cooped in, don’t have much else to do but read increasingly outlandish articles on the internet. Look at the facts and get back to reality -- that’s the best way moving forward.”