15 Things We Learned from the Internet Giants

Representatives from Google, Twitter and Facebook are sworn in for a Senate Intelligence Committee hearing on Russian election activity and technology, on Capitol Hill in Washington, Nov. 1, 2017.

Jacquelyn Martin/AP

The key takeaways from two days of testimony about Russia’s electoral mischief during the 2016 election.

During three Congressional hearings spread over two days, we heard a lot of bluster from senators and pat answers from tech-company lawyers about the role their firms played in the 2016 election.

Scattered among all the questions, some new facts entered the public record. Here we attempt to catalog the important new information we learned. Some of the biggest disclosures came in the prepared testimony from Facebook, Twitter, and Google, as well as in the opening remarks of the Senate Intelligence Committee’s chairman and vice chairman, Senator Richard Burr of North Carolina and Senator Mark Warner of Virginia.

  1. Russian electoral disinformation reached 126 million people on Facebook and 20 million on Instagram. That’s 146 million total.

These topline numbers keep going up, and we hadn’t known that the influence campaign extended to Instagram. This information seems to have reached the Senate committee only in the last couple of days.

  2. Most Russian advertising on Facebook was used to build up pages, which then distributed their content “organically.”

The $100,000 of advertising that has been a big focus of Congressional interest was used primarily to build audiences for a variety of Russian-linked pages. In other words, they paid to buy likes and build the distribution channels through which they would pump disinformation.

  3. Some of the Russian-linked Facebook ads were remarkably effective, receiving response rates as high as 24 percent, in a sample of 14 ads released by the House Intelligence Committee.

Fourteen ads and the metadata that Facebook provided about them have now passed into the public record. An analysis of that metadata shows that the ads racked up very few impressions, but that their click-through rates were extraordinarily high. According to a couple dozen digital-marketing people I’ve been in touch with, as well as my own previous experience, this is about the maximum performance an ad can achieve.

  4. 3.3 million Americans directly followed one of the Russian Facebook pages.

As a result of the ad campaign and the evident audience-building skill of the Russian trolls, approximately 3.3 million Americans ended up following a Russian page, based on Facebook’s data, according to Senator Warner. That’s a lot of people. So far, Facebook has not committed to notifying any of them, or the public.

  5. Despite that, with the evidence on hand, it would be impossible to say that the campaign swung the election.

Even given the skill and reach of the Russian disinformation campaign, based on what we know, it is highly unlikely that the ads—or the campaign as a whole—swung the election. The amount of Russian content on Facebook was a tiny sliver of the overall content (or political discussion) on the platform.

To swing the election, the campaign would have had to be highly targeted in the states that decided the election: Wisconsin, Michigan, and Pennsylvania. Senator Burr, the Senate Intelligence Committee chairman, opened the hearing with some important numerical context. The ad spending in those states was tiny: the total spent targeting Wisconsin was a mere $1,979; all but $54 of that was spent before the primary was over, and none of the ads even mentioned Trump. The spending in Michigan and Pennsylvania was even smaller. The organic reach in those states was undoubtedly larger, but based on everything we’ve seen or been told by Congress, and given the tremendous resources at play in a U.S. presidential election, the known aspects of the Russian disinformation campaign could not have played a major role, even in the states that were decided by very few votes.

All of this is premised on the idea that this is all there is to the disinformation campaign. That could turn out to be wrong, but this is all that Congress has managed to extract from the companies, and (presumably) all that the companies have managed to extract from themselves.

  6. Neither Facebook nor Twitter has seen evidence that Russian pages used voter data to target ads or posts.

There had been some speculation that the most effective way to swing the election would be to target small numbers of voters in the three key northern states. That may very well be true, but Facebook and Twitter both said they had seen no evidence of the voter file being used to build specific audiences.

  7. None of the platforms were dealing with the specific Russian electoral-disinformation campaign before the election and the ensuing intelligence-community report.

In response to direct questioning, all three companies said that while they had been dealing with cybersecurity threats and espionage of various kinds for years, the specific techniques that the Russian-linked pages used were not on their radar until 2017.

  8. None of the companies have provided full-fledged support for the only legislation currently on the table, the Honest Ads Act.

While all three company representatives indicated that they were taking approaches consistent with the Honest Ads Act, none would commit to supporting it.

  9. In at least one instance, Russian groups created dueling events that led to a real-life confrontation, in this case at an Islamic center in Houston.

Senator Warner opened his committee remarks by describing a bizarre moment when two Russian troll groups created competing events at a Houston Islamic center on May 21, 2016. The Heart of Texas page created a Facebook event “to stop the Islamization of Texas,” while the United Muslims of America page created a counter-event at the same center. People who had seen the listings showed up on both sides, and it was not a friendly encounter.

  10. Facebook may not know precisely who was targeted by Russian ads, or even who was directly following all the pages that they’ve linked to the Internet Research Agency.

Fascinatingly, when pressed on why Facebook had not notified people who had been reached by Russian propaganda or who were directly following a Russian-run page, Facebook’s general counsel, Colin Stretch, said the technical challenge was greater than the committee understood. “The technical challenges associated with that undertaking are substantial,” Stretch said, “particularly because much of the data work underneath our estimate of the number of people who may have been exposed to this relies on data analysis and modeling.”

This is a surprise? One would think these things were stored in a database somewhere within Facebook’s systems, but this answer, at least, indicates that that’s not actually the case. It seems possible that Facebook genuinely doesn’t know, although not everyone in the tech industry finds this plausible.

  11. Facebook does not appear to have checked whether ads created by the known Russian pages were also run by other pages or accounts.

One impressively specific line of questioning came from Representative Eric Swalwell of California, who represents a chunk of the East Bay across the water from Silicon Valley. He asked whether Facebook had checked to see if the ads that the Russian pages had run were also run by any other pages, known to be Russian or not. This “duplicate search,” as he called it, might reveal the wider network that the Russians used, knowingly or not. Facebook did not have an immediate answer.

  12. Russian trolls have continued to post content, including items related to postelection demonstrations, the Electoral College, the NFL kneeling dispute, more-general racial issues, and immigration.

Senator Angus King of Maine noted that the Russian campaign had not stopped cold turkey, and he pushed the companies to discuss what other issues the trolls were pushing now. They noted the topics above.

  13. Google did not revoke RT’s YouTube “Preferred” status because of its state links, but rather because of falling viewership.

In a rather strange exchange from yesterday, Senator Dianne Feinstein of California challenged Google to explain RT’s inclusion in YouTube’s “Preferred” program—and why it took so long for Google to pull the Kremlin-backed site. Richard Salgado, the counsel representing Google at the Senate Judiciary Committee hearing, explained that that was not what happened.

“Russia Today qualified, really because of algorithms, to participate in an advertising program. There are objective standards around popularity to be able to participate in that program. Platforms or publishers like RT drop in and out of the program as things change,” Salgado said. “The removal of RT from the program was actually the result of dropping viewership, not as a result of any action otherwise. There was nothing about RT or its content that meant that it stayed in or stayed out.”

Feinstein was not happy. And in yesterday’s Senate Intelligence Committee hearing, she said so. Kent Walker, Google’s counsel, reiterated the position, and also drew attention to the many others who take money from RT. “RT’s channel is on major cable-television networks, on satellite networks. Its advertisements appear in newspapers, magazines, airports,” Walker said. “It is run in hotels pretty much everywhere in the United States.” In part, this exchange led Feinstein into a finely calibrated rant.

“I must say, I don’t think you get it,” she said. “What we’re talking about is a cataclysmic change. What we’re talking about is the beginning of cyberwarfare.”

  14. The Russian campaign ads were all paid for in rubles.

It appears that all of the ads Facebook handed over to Congress were paid for in rubles. Senator Al Franken of Minnesota made hay with this on Tuesday, asking how the company had failed to connect the nature of the pages with the currency of the ad purchases. But the real reason it’s surprising is that the trolls seem otherwise quite skilled, and yet they purchased ads in rubles. Now the question is: Did they want to leave a calling card? Did they simply not think Facebook would check? (They were right.) Or was it a half-sophisticated, half-janky operation? It could be all three.

  15. Twitter says it automatically takes down 95 percent of terrorist accounts, 75 percent of them before they ever tweet.

All three tech companies advanced a similar argument about their ability to deal with the disinformation problem on their own. They pointed to their success in reducing terrorist messaging and child pornography. All three have developed sophisticated techniques for battling these evils, and they suggested that if they are allowed time to develop analogous technologies for electoral interference, they will succeed. Twitter offered the best single data point in support of this argument when its acting general counsel, Sean Edgett, cited the company’s success in shutting down terror networks.
