People standing with a group of protesters wearing shirts with the logo of the far-right Proud Boys group flash hand signs known to signify "white power" during a small protest against Washington state's stay-at-home orders, Friday, May 1, 2020. AP Photo/Ted S. Warren

Right-Wing Extremism and Islamic Extremism Spread Online In Similar Ways, New Study Says

Those crazy memes you keep seeing? Hate groups are using them to attract online recruits into small, intense groups — and beat Facebook’s censors.

Right-wing American extremists and white supremacist groups are adopting the same online recruitment and radicalization methods favored by ISIS, according to a new study.

Under Trump, extreme right-wing and white nationalist groups have grown and felt emboldened to emerge from their historic shadows into public fora. But online, over the past year, their methods appear to borrow from, or at least mimic, how violent jihadists around the world recruit and reach their own members.

“While ISIS is a well-established and centralized group with a hierarchical organizational structure, the [tactics, techniques, and procedures] they leverage to organize, recruit, incite action, and disseminate information online appear to have been adopted by several emergent radical domestic groups in the United States,” says the new report from data analytics company Babel Street, obtained exclusively by Defense One.

Researchers looked at several groups with various beliefs, structures, and intensities, including the Atomwaffen Division, a fairly closed neo-Nazi group, and the Not Fucking Around Coalition, or NFAC, a heavily armed, anti-government group with a strong, charismatic central leader who produces podcasts and daily messages for members. They contrast with the Boogaloo Bois, a decentralized movement with no strong leadership, held together by the shared conviction that social tension in the United States along racial and economic lines will lead to violent upheaval and a second civil war — all while wearing Hawaiian shirts.

Different as these groups are, after a year of reviewing content posted across pages, forums, and channels belonging to the groups, Babel Street found significant overlap between members and a common set of tactics, techniques, and procedures for sharing content and drawing members and potential members closer together within the group or subgroup. Many of these techniques were pioneered by ISIS, which began its recruitment surge by putting violent imagery online via conventional social media and, from there, creating smaller and more active groups and networks. As ISIS was increasingly pushed off platforms like Facebook, it steered group members to encrypted channels on apps like Telegram.

While the content that right-wing groups share is very different from ISIS content and much less likely to be obviously violent (more likely to be meme-based, especially in the case of the Boogaloo Bois), the process of reaching members via popular open sites and forums and then pushing them into smaller, more intimate digital groups comes from the ISIS playbook, said Tucker Holmes, a senior solutions specialist with Babel Street.

“They create these memes, spread them across social media, and then these memes lead people who are sharing them to find groups who are sharing similar content. From there, they might find another small outlet or group where people are more openly talking. They’re slowly pulled into the community, that way.” 

That process of herding members from large groups to smaller ones achieves several things. Perhaps most importantly, it helps the group continue to survive as platforms like Facebook, Twitter, and YouTube seek to block them. 

“You might have a bunch of people that follow one account and then that account will say, ‘Hey. I created a channel on Discord. Or I created a channel on Telegram. If you’re interested, sign up.’ From there, they would push people to other platforms where they can be sure to continue reading group content,” said Holmes. That content could be anything from propaganda or more memes, at the broadest level, to descriptions of weapons and survival or tactical gear shared within groups, at the smaller level.

To the extent that a group has leaders, as NFAC does, the threat of deplatforming can accelerate the push into smaller networked groups. Members are especially encouraged to join Telegram channels or additional message boards in order to stay in touch after a ban.

All of the groups that Babel Street surveyed seemed to show some awareness that they’ve attracted scrutiny from law enforcement, which affects their behavior in different ways. The Atomwaffen Division is very careful to avoid overt “recruitment,” said Holmes. “They’ve been burned in the past,” and so they take “a decentralized, grassroots approach.” Meanwhile, the leader of NFAC wants to launch formal chapters nationwide, but — aware that the audience might include law enforcement — he emphasizes that the group is expressly looking for “law-abiding gun owners.”

Facebook, YouTube, and Twitter eventually succeeded in purging their respective platforms of violent jihadist content, in part through the use of machine learning. In June, Facebook took the step of banning groups with ties to the Boogaloo Bois, but the same machine-learning techniques that are effective against extremist Islamic content are much less effective against right-wing content. First, jihadists would share the same specific videos. Once one social media company’s moderators caught such a video, they could compute a digital hash of the imagery and share that hash with other companies, so the imagery could be detected automatically and removed almost as soon as it was posted (similar to social media companies’ methods for policing child pornography).
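The hash-and-share idea can be sketched in a few lines. This is a toy illustration, not any platform's actual system (industry tools use far more robust perceptual hashes, such as Microsoft's PhotoDNA): compute a compact fingerprint of an image, then flag re-uploads whose fingerprint is close to one on a shared blocklist.

```python
def average_hash(pixels):
    """Average hash (aHash): given an 8x8 grayscale image as a list of rows,
    produce a 64-bit fingerprint -- bit is 1 where a pixel beats the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_known_content(image, shared_hashes, threshold=5):
    """Near-duplicate check against a blocklist of hashes shared
    between companies; tolerates small re-encoding changes."""
    h = average_hash(image)
    return any(hamming_distance(h, known) <= threshold for known in shared_hashes)

# Toy demo: a banned image and a lightly re-encoded (brightened) copy of it.
banned = [[i * 8 + j for j in range(8)] for i in range(8)]
copy = [[min(255, v + 2) for v in row] for row in banned]
blocklist = {average_hash(banned)}
print(is_known_content(copy, blocklist))  # True: the near-duplicate is flagged
```

Because the fingerprint depends only on each pixel's brightness relative to the image's own mean, small global shifts from re-compression leave the hash nearly unchanged, which is what lets a hash computed at one company catch re-uploads at another.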

But right-wing extremist content takes a variety of forms, and only a portion of it is expressly violent. Context is also key. Someone sharing photos of historical Nazis or images associated with the Confederacy could be an extremist, or could just be having a neutral discussion about history or sharing current news articles. And, of course, individuals who share such content could simply be nonviolent people who are sympathetic to right-wing causes.

Therein lies the real challenge of stopping the spread of hate language on social media.

Facebook CEO Mark Zuckerberg has erred on the side of what he described in an October speech at Georgetown University as “free expression,” a stance that has invited criticism from a coalition of civil rights groups such as the NAACP and the Anti-Defamation League. Legislative efforts to deal with right-wing networks have been absorbed by a larger fight over whether platforms like Twitter and Facebook should lose protections for attempting to ban certain groups at all, a debate pushed increasingly by right-wing groups and some lawmakers who complain that their views are being censored. 

Facebook didn’t mind losing business from a few jihadists. Alienating large pockets of American conservatives — very few of whom are violent, but many of whom are already nurturing feelings of collective persecution — is another matter.