“We uncorked a tiny bottle of champagne…took one gulp each and looked into each other’s eyes…. We uttered almost in unison: ‘We made America great.’” Those were the words of a Russian troll working for the Kremlin-backed Internet Research Agency, or IRA, on the morning of November 9, 2016, according to a report released Tuesday by the Senate intelligence committee.
Quotes aside, close followers of Russia’s disinformation practices won’t find much new information in it, but the report from the Republican-led Senate Select Committee on Intelligence is nonetheless significant. In recent months, President Donald Trump has intensified his efforts to undermine the intelligence community consensus that Russia meddled in the 2016 presidential election. On July 25, Trump asked his Ukrainian counterpart to help push a conspiracy theory that the U.S. cybersecurity company CrowdStrike worked with Ukrainians and Democrats to frame Russia for election meddling. In issuing the report, the committee chaired by Sen. Richard Burr, R-North Carolina, is refuting Trump and his theory.
It also undercuts some popular misperceptions about how Russian influence campaigns operate. For instance, it notes that Russian influence operators generally target Americans across the political spectrum, going after right- and left-wingers with different lines of misinformation and disinformation. Race and racial discord are the themes they exploit most heavily. In fact, the report says, black Americans were more likely to be targeted with racial disinformation than white Americans.
“Evidence of the IRA’s overwhelming operational emphasis on race is evident in the IRA’s Facebook advertisement content (over 66 percent contained a term related to race) and targeting (locational targeting was principally aimed at African Americans in key metropolitan areas),” it says.
The IRA’s “Blacktivist” Facebook page generated 11.2 million engagements, the report says. Five of the IRA’s top 10 Instagram accounts focused on black issues and audiences. Its Twitter and YouTube channels likewise focused heavily on racial issues.
The Russians also created nonpolitical pages built around religion and other themes. Once these pages attracted followers and earned a certain level of their trust, the operators would introduce propaganda.
“The tactic of using select payload messages among a large volume of innocuous content to attract and cultivate an online following is reflected in the posts made to the IRA’s ‘Army of Jesus’ Facebook page,” the report says. “The page, which had attracted over 216,000 followers by the time it was taken down by Facebook for violating the platform’s terms of service, purported to be devoted to Christian themes and Bible passages. The page’s content was largely consistent with this facade.”
Russian operators also spend a lot of time attacking the credibility of traditional media outlets in order to push false narratives. The goal isn’t necessarily to sway public opinion, but to numb the public to warnings about Russian activity.
The report’s most important takeaway is that the Russian efforts are continuing. “An October 2018 report provided to the Committee by social media analytics firm Graphika indicates that Russian disinformation efforts may be focused on gathering information and data points in support of an active measures campaign targeted at the 2020 U.S. presidential election,” it says. Russian actors appear to be “engaged in a number of campaigns seemingly focused on gathering personal information (emails, phone numbers, and bank details) of US-based audiences sympathetic to Russian disinformation topics.”
But there’s still disagreement about how Congress should limit the ability of Russians to use social media to influence the American populace. In an annex, Sen. Ron Wyden, D-Oregon, goes further than other senators in proposing legislation to restrict social media platforms’ right to sell access to users who match specific characteristics.
“If American democracy is going to withstand the onslaught of foreign government influence campaigns targeting U.S. elections, our government must address the problem of targeted ads and other content tailored to consumers’ demographic and political profiles,” Wyden writes. “Targeted influence campaigns can weaponize personal information about Americans, not just to manipulate how, or whether, they vote, but to identify and use real individuals to amplify content and influence like-minded followers. Targeted influence campaigns are far more effective and cost-efficient than blanket dissemination of propaganda. They are also more deceptive and substantially harder to identify and expose.”