Twitter CEO Jack Dorsey appears on a screen as he speaks remotely during a hearing before the Senate Commerce Committee on Capitol Hill, Wednesday, Oct. 28, 2020, in Washington. Greg Nash/Pool via AP

Twitter Bots Promote Right-Wing Conspiracies, Paper Shows

Almost 13 percent of all users who endorse conspiracy narratives are bots, researchers say.

Twitter bots are nearly twice as likely as humans to amplify right-wing content, a new paper finds, shedding light on how these largely automated social media personas can shape public opinion.

About 13 percent of all Twitter users who retweeted or engaged with conspiracy theories were bots, according to researchers from the University of Southern California who examined more than 240 million election-related tweets posted between June 20 and Sept. 9. The USC team analyzed the tweets using Botometer, an online tool developed at Indiana University that analyzes a Twitter account’s behavior to score the probability that the account is a bot. The paper, published this week in the online journal First Monday, shows that right-wing bots outnumber left-wing bots roughly two to one.
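The article does not describe the researchers’ exact scoring pipeline, but Botometer has a public Python client, botometer-python, that readers can experiment with. The snippet below is a minimal sketch of scoring a single account with that client; the placeholder credentials, handle, and the response fields shown are assumptions based on the Botometer v4 API and may not match what the USC team used.

    import botometer  # pip install botometer

    # Placeholder credentials (illustrative only): Botometer Pro is served via
    # RapidAPI and also needs Twitter app keys for account lookups.
    rapidapi_key = "YOUR_RAPIDAPI_KEY"
    twitter_app_auth = {
        "consumer_key": "YOUR_TWITTER_CONSUMER_KEY",
        "consumer_secret": "YOUR_TWITTER_CONSUMER_SECRET",
    }

    bom = botometer.Botometer(wait_on_ratelimit=True,
                              rapidapi_key=rapidapi_key,
                              **twitter_app_auth)

    # Score one account by screen name. The response is a dict whose fields
    # (e.g., "cap" for complete automation probability, "display_scores")
    # follow the Botometer v4 format and may change between API versions.
    result = bom.check_account("@example_handle")
    print(result.get("cap"), result.get("display_scores"))

At the scale of the USC study, accounts would presumably be scored in bulk and thresholded to label likely bots; that aggregation step is separate from the client call shown here.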

So how many Twitter users talking about politics are bots? Only about 5 percent, but they drive an outsized share of the conversation.

“We see a backdrop of approximately 5% bots in right- and left-leaning discussions, and 10-15% bots associated with conspiracy groups that are responsible for as much as 20% of the volume of engagement with hyperpartisan news websites,” said the study’s lead author, Emilio Ferrara, an associate professor of communication and computer science at USC’s Annenberg School for Communication and Journalism.

Ferrara’s team found that both right-wing and left-wing Twitter users exist largely in filter bubbles, where they are far more likely to see and spread content from and to their own affiliated group, reinforcing their pre-existing convictions and beliefs.

“Our analysis shows that 35 percent of retweets are left-leaning humans retweeting other left-leaning humans; 53 percent of retweets are right-leaning humans retweeting other right-leaning humans,” they write.

By examining banned users, the researchers found indications that “Ghana and Nigerian information operations” were targeting people posting as part of the Black Lives Matter movement, “while Saudi Arabia and Turkey both boast high levels of engagement with right-leaning users.” They found that Russia and China targeted “fringe communities and conservative groups more prominently.” Even though the accounts the team inspected had already been banned, their activity can indicate an ongoing effort.

Lindsay Gorman, an emerging tech fellow at the bipartisan Alliance for Securing Democracy, said, “The range of actors that are jumping into the online influence game underscores just how much the information arena is an important locus of competition and the diffusion of nation state power in the 21st century.”

The researchers also found that users who identified as right-wing were roughly an order of magnitude more likely to share conspiracy theories than left-wing users, based on the use of hashtags associated with a trending conspiracy, such as #obamagate, which referred to speculation that the Obama administration illegally surveilled the Trump campaign. Users who endorsed predominantly right-leaning media outlets were also far more likely to pass along conspiracy theories.

“Almost a quarter of users who endorse predominantly right-leaning media platforms are likely to engage in sharing conspiratory narratives. Out of all users who endorse left-leaning media, approximately two percent are likely to share conspiratory narratives,” they write. 

Nearly 13 percent of users promoting conspiracy content were bots, a finding the researchers see as both good and bad. “On the one hand, this is good news: not all the popularity of political conspiracies is genuine; on the other hand, since bots can inflate narratives and bring organic attention to unsuspecting users, the high prevalence of bots in conspiracy narratives is a problem that requires urgent attention: Twitter is currently considering... preventative measures, like suspending QAnon accounts, to hinder the spread of political conspiracies, but ours and others’ work suggests this may not be enough,” they write.

“The findings of the study corroborate what disinformation researchers have come to believe: that bots are not the primary source of viral disinformation. Organic actors present more compelling targets for misinformation and disinformation in part because they are far more difficult for platforms to address under their policies. That's why it's becoming impossible to decouple foreign influence operations from the vulnerabilities of our domestic information ecosystem,” said Gorman.