A Facebook search for the words “voter fraud” first turns up an article claiming that workers at a Pennsylvania children’s museum are brainwashing children into accepting stolen elections.
Facebook’s second suggestion? A link to an article on a site called MAGA Underground that says Democrats are plotting to rig next month’s midterm elections. “You should still be angry about the fraud that happened in 2020,” the article insists.
With less than three weeks to go until the polls close, misinformation about voting and elections abounds on social media, despite promises from tech companies to address a problem blamed for rising polarization and mistrust.
While platforms like Twitter, TikTok, Facebook and YouTube say they have expanded their work to detect and stop harmful claims that could suppress voting or even spark violent confrontations, a review of some of the sites shows they are still playing catch-up with 2020, when then-President Donald Trump’s lies about the election he lost to Joe Biden helped fuel an insurrection at the US Capitol.
“You would think they would have learned by now,” said Heidi Beirich, founder of the Global Project Against Hate and Extremism and a member of a group called the Real Facebook Oversight Board that has criticized the platform’s efforts. “This isn’t their first election. This should have been addressed before Trump lost in 2020. The damage is pretty deep at this point.”
If these US-based tech giants can’t adequately prepare for a US election, how can anyone expect them to handle elections abroad, Beirich said.
Mentions of a “stolen election” and “voter fraud” have skyrocketed in recent months and are now two of the three most popular terms included in discussions about this year’s election, according to an analysis of social media, online and broadcast content conducted by the media intelligence firm Zignal Labs on behalf of The Associated Press.
On Twitter, Zignal’s analysis found that tweets amplifying conspiracy theories about the upcoming election have been reposted thousands of times, along with posts reaffirming discredited claims about the 2020 election.
Most major platforms have announced measures aimed at curbing voting and election misinformation, including labels, warnings, and changes to systems that automatically recommend certain content. Users who consistently violate the rules may be suspended. The platforms have also built partnerships with fact-checking organizations and media outlets like AP, which is part of Meta’s fact-checking program.
“Our teams continue to closely monitor midterms, working to quickly remove content that violates our policies,” YouTube said in a statement. “We will remain vigilant before, during and after Election Day.”
Meta, the owner of Facebook and Instagram, announced this week that it had reopened its election command center, which monitors real-time efforts to combat election misinformation. The company has rejected criticism that it is not doing enough and denied reports that it has reduced the number of staff focused on elections.
“We are investing a significant amount of resources, with work spanning more than 40 teams and hundreds of people,” Meta said in a statement emailed to the AP.
The platform also said that starting this week, anyone searching Facebook using election-related keywords, including “voter fraud,” will automatically see a pop-up with links to trusted voting resources.
TikTok launched an in-app elections center earlier this year to help voters in the US learn how to register to vote and who is on their ballot. Information is offered in English, Spanish, and more than 45 other languages. The platform, now a leading source of information for young voters, also adds labels to misleading content.
“Providing access to authoritative information is an important part of our overall strategy to counter election misinformation,” the company said of its efforts to prepare for the midterm elections.
But policies intended to stop damaging misinformation about elections are not always applied consistently. False claims can often be buried deep in the comments section, for example, where they can nevertheless leave an impression on other users.
A report published last month by New York University criticized Meta, Twitter, TikTok and YouTube for amplifying Trump’s false statements about the 2020 election. The study cited inconsistent rules regarding misinformation, as well as lax enforcement.
Concerned about the amount of misinformation about voting and elections, several groups have urged tech companies to do more.
“Americans deserve more than lip service and half measures from the platforms,” said Yosef Getachew, director of Common Cause’s media and democracy program. “These platforms have been weaponized by enemies of democracy, both foreign and domestic.”
Election misinformation is even more prevalent on smaller platforms, popular with some conservatives and far-right groups like Gab, Gettr and TruthSocial, Trump’s own platform. But those sites have small audiences compared to Facebook, YouTube or TikTok.
Beirich’s group, the Real Facebook Oversight Board, put together a list of seven recommendations for Meta aimed at reducing the spread of misinformation ahead of the election. They included platform changes that would elevate content from legitimate news outlets over partisan sites that often spread misinformation, as well as increased attention to misinformation targeting voters in Spanish and other languages.
Meta told the AP that it has expanded its fact-checking network since 2020 and now has twice as many fact-checkers working in Spanish. The company has also launched a Spanish-language fact-checking tip line on WhatsApp, another platform it owns.
Much of the misinformation directed at non-English speakers appears to be aimed at suppressing their vote, said Brenda Victoria Castillo, executive director of the National Hispanic Media Coalition, who said the efforts by Facebook and other platforms are not equal to the scale of the disinformation problem.
“They are lying to us and discouraging us from exercising our right to vote,” Castillo said. “And the people in power, people like (Meta CEO) Mark Zuckerberg are doing very little while benefiting from misinformation.”