A handful of right-wing “super-spreaders” on social media were responsible for most of the electoral misinformation in the run-up to the Capitol attack, according to a new study that also sheds light on the staggering scope of the falsehoods pushed by Donald Trump.
A report by the Election Integrity Partnership (EIP), a group that includes Stanford and the University of Washington, looked at social media platforms such as Facebook, Twitter, Instagram, YouTube, and TikTok for several months before and after the 2020 elections.
It found that the “super-spreaders” responsible for the most frequent and far-reaching disinformation campaigns included Trump and his two eldest sons, as well as other members of the Trump administration and figures in right-wing media.
The study authors and other researchers say the findings underscore the need to disable such accounts to stop the spread of misinformation.
“If there’s a limit to the amount of content that moderators can tackle, have them focus on reducing harm by removing the most effective spreaders of misinformation,” said Lisa Fazio, an assistant professor at Vanderbilt University who studies the psychology of fake news and was not involved in the EIP report. “Instead of trying to enforce the rules equally for all users, focus enforcement on the most powerful accounts.”
The report analyzed social media posts that included words like “election” and “vote” to track key disinformation narratives related to the 2020 elections, including claims of mail carriers discarding ballots, legitimate ballots strategically left uncounted, and other false or unproven stories.
The report studied how these narratives developed and the effect they had. It found that during this period, popular right-wing Twitter accounts “transformed individual stories, sometimes based on honest voter concerns or genuine misunderstandings, into cohesive narratives of systemic voter fraud.”
Ultimately, the “false claims and narratives merged into the meta narrative of a ‘stolen election,’ which then fueled the January 6 insurrection,” the report says.
“The 2020 elections demonstrated that actors, both foreign and domestic, remain committed to weaponizing false and deceptive viral narratives to undermine confidence in the American electoral system and erode Americans’ faith in our democracy,” the authors concluded.
Almost no fact-checking, with Trump as the super spreader-in-chief
Monitoring Twitter, the researchers analyzed more than 22 million tweets sent between August 15 and December 12. The study determined which accounts were most influential by the scale and speed with which misinformation spread from them.
“Influential accounts on the political right rarely engaged in fact-checking behavior and were responsible for the most widespread incidents of false or misleading information in our data set,” the report said.
Of the top 21 offenders, 15 were verified Twitter accounts, which are particularly dangerous when it comes to election misinformation, according to the study. The “repeat spreaders” responsible for the most widespread misinformation included Eric Trump, Donald Trump, Donald Trump Jr., and influencers such as James O’Keefe, Tim Pool, Elijah Riot, and Sidney Powell. All of the top 21 disinformation accounts leaned to the right, the study showed.
“Top-down misinformation and disinformation are dangerous because of the speed at which they can spread,” the report said. “If a social media influencer with millions of followers shares a narrative, it can garner hundreds of thousands of interactions and shares before a social media platform or fact-checker has time to review its content.”
On almost every platform analyzed in the study, including Facebook, Twitter and YouTube, Donald Trump played a huge role.
It identified 21 incidents in which a tweet from Trump’s official @realDonaldTrump account prompted the spread of a false narrative on Twitter. For example, Trump’s unsubstantiated tweets claiming that voting-equipment manufacturer Dominion Voting Systems was responsible for voter fraud played a significant role in carrying the conspiracy theory to a wider audience. The false or unfounded tweets sent by Trump’s account, which had 88.9 million followers at the time, garnered more than 460,000 retweets.
Meanwhile, Trump’s YouTube channel was linked to six separate waves of misinformation that, combined, were viewed more than any other repeat spreader’s videos. His Facebook account had the highest engagement of all those studied.
The Election Integrity Partnership study is not the first to show the enormous influence Trump’s social media accounts have had in spreading misinformation. In one year, between January 1, 2020 and January 6, 2021, Donald Trump pushed disinformation in more than 1,400 Facebook posts, according to a Media Matters for America report published in February. Trump was ultimately suspended from the platform in January, and Facebook is debating whether he will ever be allowed to return.
Specifically, 516 of his posts contained disinformation about Covid-19, 368 contained electoral disinformation, and 683 contained harmful rhetoric attacking his political enemies. Election fraud claims garnered more than 149.4 million interactions, or an average of 412,000 interactions per post, and accounted for 16% of interactions on his posts in 2020. Trump had a unique ability to amplify stories that would otherwise have remained contained within smaller media outlets and subgroups, said Matt Gertz of Media Matters for America.
“What Trump did was take the misinformation from the right-wing ecosystem and turn it into a widespread news event that affected everyone,” he said. “He was able to take these absurd lies and conspiracy theories and turn them into national news. And if you do that, and you inflame people often enough, you will end up with what we saw on January 6.”
Effects of false electoral narratives on voters
Ultimately, the “super-spreader” accounts were very successful in undermining voters’ confidence in the democratic system, according to the report. Citing a Pew Research Center poll, the study said that of the 54% of people who voted in person, about half cited concerns about voting by mail, and only 30% of respondents were “very confident” that mail-in or absentee ballots had been counted as intended.
The report outlined a number of recommendations, including the outright removal of “super-spreader” accounts.
Outside experts agree that tech companies should take a closer look at influential accounts and repeat offenders.
The researchers said that platforms’ failure to take action, or to set clear rules about when action would be taken, helped fuel the prevalence of misinformation. For example, only YouTube had a publicly stated “three-strike” system for election-related offenses. Platforms such as Facebook reportedly had three-strike rules as well, but they did not disclose them publicly.
Only four of the top 20 Twitter accounts cited as top spreaders were actually removed, the study showed, including Donald Trump’s in January.
Twitter has maintained that its ban on the former president is permanent. YouTube’s chief executive stated this week that Trump would be reinstated on the platform once the “risk of violence” from his posts passes. Facebook’s independent oversight board is now considering whether to allow Trump to return.
“We have seen that he uses his accounts as a way to weaponize disinformation. It has already caused a riot at the United States Capitol; I don’t know why you would give him the chance to do it again,” Gertz said. “It would be a huge mistake to allow Trump to return.”