Conspiracy theories about mail ballots. Anonymous text messages warning voters to stay home. Fringe social media platforms where election misinformation spreads with impunity.
Misinformation about the upcoming midterm elections has been building for months, challenging election officials and tech companies while offering another reminder of how conspiracy theories and distrust are shaping America’s politics.
The claims are fueling the candidacies of election deniers and threatening to further corrode faith in voting and democracy. Many of them can be traced back to 2020, when then-President Trump refused to accept the outcome of the election he lost to Joe Biden and began lying about its results.
“Misinformation is going to be central to this midterm election and central to the 2024 election,” said Bhaskar Chakravorti, who studies technological change and society and is the dean of global business at the Fletcher School at Tufts University. “The single galvanizing narrative is that the 2020 election was stolen.”
A look at key misinformation challenges heading into the 2022 election:
Misleading claims about voting
Political misinformation often focuses on immigration, crime, public health, geopolitics, disasters, education or mass shootings. This year, it’s mostly about voting.
Claims about the security of mail ballots have grown in recent weeks, as have baseless rumors about noncitizens voting. That’s in addition to claims about dead people casting ballots, ballot drop boxes being moved or wild stories about voting machines.
Trump, a Republican, attacked the legitimacy of the election even before he lost. He then refused to concede, spreading lies about the election that inspired the deadly Jan. 6, 2021, attack on the U.S. Capitol. His contention was rejected in more than 60 court cases and by his own attorney general, William Barr.
Together, these misleading claims about the nation’s electoral system have led some Republicans to say they’re going to hold on to their mail ballots until Election Day — a move that could slow down the count.
Others vow to monitor the polls to prevent cheating, leading to concerns about intimidation and even the possibility of violence at election sites.
Tech companies say they’ve implemented new policies and programs designed to ferret out misinformation.
“We’ve seen hundreds of elections play out on our platforms in recent years and we’ve been applying lessons from each one to strengthen our preparations,” Facebook and Instagram owner Meta said in a statement.
Yet critics say the volume of false claims spreading now shows there’s more to be done, such as better enforcement of existing rules or government regulations requiring more aggressive policies.
“This is no longer a new problem,” said Jon Lloyd, senior adviser at the nonprofit Global Witness, which last week released a report showing that TikTok failed to remove many advertisements containing election misinformation. Big social media platforms, he said, “are still simply not doing enough to stop threats to democracy.”
Mistakes will happen — while the clock is still ticking
Elections involve the combined efforts of tens of thousands of people working under pressure. Mistakes are expected, which is why there’s a robust system of checks and balances to ensure errors are found and corrected.
Taken out of context, stories about glitchy voting machines, mixed-up ballots or even “suspicious” vehicles arriving at election centers can become fodder for the next election fraud myth.
And with so much work to do at such a fast pace, election workers, local officials and even the media can have little time to push back on such claims before they go viral.
In Georgia in 2020, a water leak at a site where ballots were being counted was used to spin a far-fetched tale of ballot rigging. In Arizona, the choice of pens given to voters filling out ballots led to similarly preposterous claims.
To avoid falling for a misleading claim, consult multiple sources, including local election offices. Any significant voting irregularity will be covered by multiple news outlets and addressed by election officials. Be skeptical of claims from second-hand sources, said Shaye-Ann McDonald, a behavioral researcher at Duke University who studies ways to improve resistance to misinformation.
The most viral misinformation often elicits anger or fear that motivates readers to repost it before they’ve had time to coolly consider the underlying claim.
“When you read about something that provokes a strong emotion, that should be a warning sign,” McDonald said.
A multilingual challenge
Just before the 2020 election, Spanish-language Facebook ads falsely claimed Biden, a Democrat, was a communist. On other platforms, posts warned Latinos in the U.S. not to vote at all.
Misinformation in non-English languages is a particular concern cited by researchers who say the major platforms — most of them U.S.-based — are focused on content moderation in English. Automated systems written to detect misinformation in English don’t work as well when applied to other languages.
“As bad as [tech companies] are moderating content in English, they’re even worse when it comes to non-English languages,” said Jessica Gonzalez, co-chief executive of Free Press, a nonprofit that works on issues of racial justice and technology.
Misinformation by text?
While misinformation about elections spreads easily on big social media platforms like Facebook, it also has taken root on a long list of less familiar platforms: Gab, Gettr, Parler and Truth Social, Trump’s platform.
Meanwhile, TikTok has emerged as a key network for younger voters — and the politicians who want to reach them. The platform, owned by the Chinese company ByteDance, has created an election center to connect users with trustworthy information about elections and voting. Nonetheless, misinformation persists.
The problem isn’t limited to social media. The number of false claims transmitted by text and email has steadily increased in recent years. Last summer, Democratic voters in Kansas received misleading texts telling them a yes vote on an upcoming referendum would protect abortion rights; the opposite was true.
Musk and Twitter
Elon Musk’s purchase of Twitter just weeks before the 2022 election upended that platform’s plans for combating misinformation ahead of the midterms.
Musk quickly fired the executive who had overseen content moderation. Over the weekend he posted a tweet advancing a baseless conspiracy theory about the attack on Paul Pelosi, the husband of House Speaker Nancy Pelosi (D-San Francisco), before deleting it.
Musk has called himself a free speech absolutist and has said he disagreed with the decision to boot Trump from the platform for incitement of violence on Jan. 6, 2021.
He has said that a content moderation committee would examine possible revisions to Twitter’s rules, and that no changes would be made until after the election.
“We’re staying vigilant against attempts to manipulate conversations about the 2022 US midterms,” Yoel Roth, Twitter’s head of safety and integrity, tweeted Tuesday.
Threats foreign and domestic
Russian efforts to interfere in U.S. elections go back years, and there are indications that China and Iran are stepping up their game.
Tech companies, government officials and misinformation researchers say they’re monitoring for such activity ahead of the midterms. But the misinformation threat posed by domestic groups may be far greater.