On the morning of July 8, former President Donald Trump took to Truth Social, a social media platform he founded with people close to him, to claim that he had, in fact, won the 2020 presidential election in Wisconsin, despite all evidence to the contrary.
Just 8,000 people shared that missive on Truth Social, a far cry from the hundreds of thousands of responses his Facebook and Twitter posts regularly generated before those services cut off his megaphone after the deadly riot at the Capitol on January 6, 2021.
And yet, Trump’s unsubstantiated claim pulsed through the public consciousness anyway. It jumped from his app to other social media platforms, not to mention podcasts, radio and television.
Within 48 hours of Trump’s post, more than 1 million people saw his claim on at least a dozen other sites. It appeared on Facebook and Twitter, from which he has been banished, but also on YouTube, Gab, Parler and Telegram, according to an analysis by The New York Times.
The spread of Trump’s claim illustrates how, ahead of this year’s midterm elections, misinformation has metastasized since experts began sounding the alarm about the threat. Despite years of efforts by the media, academics, and even the social media companies themselves to address the problem, it is arguably more pervasive than ever.
“I think the problem is worse than ever, frankly,” said Nina Jankowicz, a disinformation expert who briefly headed an advisory board within the Department of Homeland Security dedicated to combating disinformation. The creation of the panel sparked a political furor that led to her resignation and the board’s dismantling.
Not long ago, the fight against disinformation focused on the major social media platforms, like Facebook and Twitter. When pressed, they often removed offending content, including misinformation and intentional disinformation about the COVID-19 pandemic.
Today, however, there are dozens of new platforms, including some that pride themselves on not moderating (censoring, as they say) false statements in the name of free speech.
Other figures followed Trump in migrating to these new platforms after being “censored” by Facebook, YouTube or Twitter. They included Michael Flynn, the retired general who briefly served as Trump’s first national security adviser; L. Lin Wood, a pro-Trump attorney; Naomi Wolf, feminist author and vaccine skeptic; and assorted followers of QAnon and the Oath Keepers, the far-right militia.
At least 69 million people have joined platforms, including Parler, Gab, Truth Social, Gettr and Rumble, that advertise themselves as conservative alternatives to big tech, according to company statements. Although many of those users are banned from larger platforms, they continue to spread their views, often appearing in screenshots posted on the sites that banned them.
“Nothing on the Internet exists in a silo,” said Jared Holt, senior research manager for hate and extremism at the Institute for Strategic Dialogue. “Whatever happens on alternative platforms like Gab, Telegram or Truth comes back to Facebook, Twitter and others.”
The changes in the disinformation landscape are becoming clear with the new American election cycle. In 2016, Russia’s covert campaign to spread false and divisive posts seemed like an aberration in the American political system. Today disinformation, from foreign and domestic sources alike, has become a feature of it.
The unsubstantiated belief that President Joe Biden was not legitimately elected has become pervasive among members of the Republican Party, prompting state and county officials to impose new restrictions on the casting of votes, often based on conspiracy theories that have percolated through right-wing media.
Voters must now sift through not only a growing torrent of lies and falsehoods about candidates and their policies, but also false information about when and where to vote. Officials appointed or elected in the name of fighting voter fraud have positioned themselves to refuse to certify results they don’t like.
Disinformation purveyors have also become increasingly sophisticated at circumventing the major platforms’ rules, while the use of video to spread false claims on YouTube, TikTok and Instagram has made those claims harder for automated systems to track than text-based posts.
TikTok, owned by Chinese tech giant ByteDance, has become a major battlefield in the current fight against misinformation. A report last month from NewsGuard, an organization that tracks the problem online, showed that nearly 20% of videos presented as search results on TikTok contained false or misleading information on topics such as school shootings and Russia’s war in Ukraine.
“The people who do this know how to exploit loopholes,” said Katie Harbath, a former director of public policy at Facebook who now runs Anchor Change, a strategy consultancy.
With the midterms just weeks away, major platforms have pledged to block, label, or marginalize anything that violates company policies, including misinformation, hate speech, or calls for violence.
Still, the cottage industry of experts dedicated to countering disinformation — think tanks, universities, and nongovernmental organizations — say the companies aren’t doing enough. New York University’s Stern Center for Business and Human Rights warned last month, for example, that the major platforms continued to amplify “election denial” in ways that undermined trust in the democratic system.
Another challenge is the proliferation of alternative platforms for such falsehoods and even more extreme views.
Many of those new platforms have flourished in the wake of Trump’s defeat in 2020, though they have yet to reach the size or reach of Facebook and Twitter. They portray Big Tech as beholden to the government, the deep state, or the liberal elite.
Parler, a social network founded in 2018, was one of the fastest-growing sites until the Apple and Google app stores dropped it after the deadly riot on January 6, which was fueled by misinformation and calls to violence posted online. It has since returned to both stores and begun to rebuild its audience by appealing to those who feel their voices have been silenced.
“We believe at Parler that it’s up to the individual to decide what he or she thinks is the truth,” Amy Peikoff, the platform’s director of policy, said in an interview.
She argued that the problem of misinformation and conspiracy theories stemmed from the algorithms platforms use to keep people hooked online, not from the unfettered debate that sites like Parler foster.
Parler’s competitors now include BitChute, Gab, Gettr, Rumble, Telegram, and Truth Social, each offering sanctuary from the major platforms’ moderation policies on everything from politics to health policy.
A new survey by the Pew Research Center found that 15% of featured accounts on those seven platforms had previously been banned from others like Twitter and Facebook.
Nearly two-thirds of users of those platforms said they had found a community of people who share their views, according to the survey. Most are Republicans or lean Republican.
One result of this atomization of social media sources is to reinforce the partisan information bubbles within which millions of Americans live.
At least 6% of Americans now regularly receive news from at least one of these relatively new sites, which often “highlight unconventional worldviews and sometimes offensive language,” according to Pew. One in 10 posts on these platforms that mentioned LGBTQ issues involved derisive accusations, the survey found.
These new sites are still marginal compared with the larger platforms; Trump, for example, has 4 million followers on Truth Social, compared with the 88 million he had on Twitter when the service barred him in 2021.
Still, Trump has increasingly resumed posting with the vigor he once displayed on Twitter. The FBI search of Mar-a-Lago put his pronouncements at the center of the political storm once again.
For the major platforms, the financial incentive to attract users — and their clicks — remains powerful and could undo the steps they took in 2021. There is also an ideological component. An emotional appeal to individual freedom partly fueled Elon Musk’s bid to buy Twitter, which appears to have been revived after months of legal maneuvering.
Nick Clegg, president of global affairs for Meta, Facebook’s parent company, even suggested recently that the platform could reinstate Trump’s account in 2023, ahead of what could be another presidential bid. Facebook had previously said it would do so only “if the risk to public safety has receded.”
Jankowicz, the disinformation expert, said the nation’s social and political divisions had churned the waves of disinformation.
Controversies over how best to respond to the COVID-19 pandemic deepened mistrust of government and medical experts, especially among conservatives. Trump’s refusal to accept the outcome of the 2020 election led to the violence at the Capitol, and that mistrust has not subsided since.
“They should have brought us together,” Jankowicz said, referring to the pandemic and unrest. “I thought maybe they could be some kind of convening power, but they weren’t.”
This article originally appeared in The New York Times.