
How the fight to stop election misinformation morphed into a free speech battle

caption: Prof. Kate Starbird is the director and co-founder of the University of Washington's Center for an Informed Public. (Photo courtesy of UW Center for an Informed Public)

Since the 2020 election, conservative lawmakers and staffers have led a campaign against misinformation and disinformation researchers like Kate Starbird, the director and co-founder of the University of Washington's Center for an Informed Public. They argue that the researchers are attempting to censor conservatives, in violation of the First Amendment.

But that argument itself would be misinformation, according to Starbird.

“I was like, ‘Oh my gosh, I’m being gaslit. By a congressional committee.’”

A congressional subcommittee, to be exact: the U.S. House Judiciary Select Subcommittee on the Weaponization of the Federal Government, which was established on Jan. 10, 2023, with Ohio Republican Rep. Jim Jordan at the helm.

In June, this subcommittee filed into a room on Capitol Hill to hear testimony from Starbird, who's spent the last decade working to track and understand how misinformation and disinformation spread online. In the course of that work, Starbird was asked to chair a subcommittee of her own, one meant to ensure the integrity of federal and state elections. It operated under the Cybersecurity and Infrastructure Security Agency, or CISA, part of the U.S. Department of Homeland Security, which was created by a law signed during former President Donald Trump's administration.

In 2020, CISA began to address election misinformation and disinformation. Starbird and other academics helped flag false election and Covid-19 content for social media platforms and election officials. It’s that work that has put Starbird at the center of a political mess.

Jordan's subcommittee has alleged that CISA, Starbird, and her colleagues colluded with tech companies to subvert Americans' First Amendment rights and participated in "censorship," namely of conservative or right-wing views.

“The vast majority of misinformation about election processes and procedures in 2020 was spread by people who were supporters of Donald Trump and/or conservative,” Starbird told KUOW. “That is not because our team has bias. That is what happened in the world. But it opens us up for attack of bias because the thing that we're studying is an asymmetrical phenomenon right now.”

It wasn’t always that way.

When Starbird began her research in 2013, she and others weren’t specifically studying far-right spaces, she said. Rather, their research showed people across the political spectrum could be susceptible to misinformation and disinformation.

After the 2020 presidential election, though, election misinformation spread on the right – namely “The Big Lie” that President Joe Biden “stole” the presidency from Trump, a baseless allegation – had an unprecedented consequence: the attack on the U.S. Capitol on Jan. 6, 2021. Some lawmakers, including Rep. Jim Jordan, helped spread the lie that Trump won the 2020 election and have defended the people who attacked the Capitol that day.

"Right now, on the right in the United States, there are people who have political power, who are using these things strategically to gain and maintain that power," Starbird said. “And we're calling it out.”

Attacks on Starbird’s work haven’t stopped her from continuing her research. The landscape she’s operating in has changed dramatically since 2020, though.

The basic problem is people are vulnerable heading into the 2024 election, Starbird said.

“They've come to see democracy through this frame that elections can be rigged. And once you begin to see things that way, you twist every piece of evidence into fitting that frame,” she said. “So, people are looking for evidence to see that they're being cheated.”


caption: The drive-through ballot drop box at King County Elections in Renton. (David Hyde / KUOW)

The irony is that this can make elections less secure, Starbird said. Election misinformation and disinformation can be used to undermine checks and balances meant to keep the system secure.

For example, as of October, nine Republican-led states have pulled out of the cross-state partnership known as the Electronic Registration Information Center, or ERIC. The system allowed election officials to share voter registries across states to see whether voters were registered in multiple states at once. Starbird said conspiracy theories about ERIC had led those states to pull out, thereby removing a tool designed to make their systems more secure.

The tools available to Starbird and other researchers to fight the proliferation of such conspiracy theories are now limited. For example, researchers previously had access to publicly available Twitter data, allowing them to see how misinformation and disinformation spread in real time. Now, under Elon Musk’s X, formerly known as Twitter, that data is no longer available.

“As researchers, we don't have really any visibility into what's happening on that platform, or very limited visibility, which means it's hard to keep track of in real time,” Starbird explained. “Like, what are the rumors that are taking off? And it's hard to know what rumors to correct if you don't know what rumors are taking off.”

That may sound like an academic problem, but it’s one that is affecting elections workers, too.

Misinformation and disinformation campaigns undermine trust in the process, Starbird said. That's keeping her up at night, and as she works to keep up with those campaigns, she's also battling her own government.

“Influencers on the right, including political actors, have very effectively twisted this idea of platform moderation and redefined it as censorship,” she said, referring to Jordan’s subcommittee targeting her work. "They've twisted the narrative actually in a couple of different places to even have you ask the question of whether ... I or my team was part of censorship.”


caption: An elections worker removes ballots from a ballot sorting machine on Wednesday, October 28, 2020, at King County Elections in Renton. (KUOW Photo/Megan Farmer)

In the meantime, Starbird argued Jordan’s claims about her and other researchers have been part of “a very effective effort to chill the speech of academic researchers and others who have tried to call out the spread of misinformation and disinformation.”

Starbird won’t stop her research, though. And while she prepares for 2024, the U.S. Supreme Court may soon have something to say on the matter.

During its 2023-24 term, the U.S. Supreme Court will hear a case arising from a Fifth Circuit Court of Appeals ruling on government communication with social media companies. The case stems from Republican complaints that social media companies censor conservative views. The Fifth Circuit panel, made up entirely of Republican nominees, found that the Biden administration's efforts to flag false and harmful content about Covid, the 2020 election, and other topics likely amounted to a violation of the First Amendment.

In October, the Supreme Court agreed to hear the case and granted the Biden administration's request to block the lower court's order limiting government officials' ability to communicate with social media companies about their content moderation policies.

The short version: The Supreme Court may soon weigh in on the kind of research Starbird has done for years and whether, according to the justices, this work has infringed on anyone’s First Amendment right to free speech.

“The spread of false claims about the 2020 election had detrimental effects on democracy,” Starbird said. “We have to come together to govern ourselves. And to do that, we don't have to agree on everything, but we have to have some sense of a shared reality. We have to know what the rules are. And we have to trust that the outcomes of elections are the outcomes of elections. And we have to buy in or democracy doesn't work.”
