What Europe Can Teach America About Russian Disinformation

Russian President Vladimir Putin gestures as he speaks to supporters during a rally near the Kremlin in Moscow, Sunday, March 18, 2018.

AP Photo/Pavel Golovkin


“If we are serious about defending Western values, now is the time.”

In 2014, United States officials encountered a new form of Kremlin disinformation in Ukraine. As “little green men” streamed into the country’s south, blatant falsehoods about everything from the history of World War II to weapon-system deployments spread across the internet and the airwaves. Propagandists disguised as professors, activists, and journalists sowed confusion about what was actually happening on the ground: soldiers bearing no flag had occupied strategic territory in eastern Ukraine. Intelligence collectors supplied propagandists with tapped calls and hacked emails containing compromising language, and the Kremlin leaked all of this to the media at key moments.

U.S. officials engaged in an aggressive campaign to build a global understanding of what was actually happening in Ukraine, and united Western allies in a chorus of condemnation. As a result, the West backed a sanctions regime that, remarkably, remains intact. But over time, with a peace process theoretically underway and the situation cooling, the State Department’s focus on counter-influence campaigns waned, and the unit leading the charge dissolved. Two years later, disinformation campaigns using very similar tactics targeted the U.S. electorate in the run-up to the 2016 vote, spreading so-called “fake news” and sowing divisiveness in an effort to influence the election and undermine American democracy itself.

As I watched the campaign unfold, it reminded me a bit of the propaganda around Israel’s Operation Cast Lead in 2009, but even more so of the foreign interference around the Brexit vote in 2016, and, of course, of Ukraine in 2014. The United States had failed to maintain its guard.


Meanwhile, governments in northeast Europe were ramping up their own efforts to resist disinformation. Earlier this year, I met with officials from eight countries bordering the Baltic to talk about disinformation. Northeast Europeans, by virtue of their location, experience, and foresight, have come to understand how to build resistance to disinformation: organizing campaigns to raise awareness of the problem; educating to resist propaganda; and coordinating across government agencies, civil society, and the media. Theoretically, these tactics are replicable in America in the current midterm election season.

Historically, the countries of northeast Europe have contended with more than their fair share of disinformation. Every government official I met with emphasized the importance of disinformation awareness—an understanding among officials, journalists, and the general public that they needed to be on guard against foreign propaganda. “We all thought in Poland that 300 years of partitioning and 50 years of communism made us immune to propaganda—not only Russian propaganda, but any kind of disinformation aiming at influencing the behavior of larger parts of society,” Jan Hofmokl, an official with Poland’s Ministry of Foreign Affairs, explained. “However, we have to admit it: We were caught off guard, and it took some time before we were able to admit there is a problem. Because there clearly is one.”

Academic research around disinformation isn’t new, but our empirical understanding of how to cope with the tactic remains limited. Northeastern University’s Briony Swire-Thompson researches the cognitive psychology behind disinformation effectiveness. “It is important to let the public know as soon as possible where the information comes from,” she explained. “This is because when deciding on whether information is true or false, people place a great deal of weight on the source of information.”

This was a common refrain in all of my conversations. Andris Mellakauls, the head of Latvia’s Information Space Integration Division, cited the government’s “permanent” campaign to promote media literacy as its proudest achievement. The campaign includes training for teachers, librarians, and municipal youth specialists; providing educational tools; and forging international partnerships to share best practices among journalists, researchers, civil servants, and NGOs. “Democracy can only function properly if citizens are able to make informed decisions,” Mellakauls said. “They must be aware of the sources of information they base their decisions on.”

Elina Lange-Ionatamishvili, an official at NATO’s Strategic Communication Center of Excellence in Latvia, agreed that education is essential, but argued this long-term approach should be matched with efforts to educate current voters, such as “social-advertising campaigns helping citizens to recognize fake news, disinformation, and also propaganda.” The EU’s East StratCom Task Force’s Disinformation Review is one example of such a campaign. “Governments have a great responsibility in setting the right policy priorities and allocating resources to enable the citizens to defend [themselves] from foreign disinformation campaigns,” Lange-Ionatamishvili said. “But at the end of the day each citizen is on their own when faced with the 21st-century information ‘deluge.’”

Given the emphasis on cyber defenses and social-media algorithms in many American conversations about disinformation, I was surprised by how rarely northeast-European officials emphasized technical solutions to the problem. While officials in many countries noted the relevance of technology, most were far more focused on their populations’ “psychological resilience” and viewed technological developments with a dose of fear. NATO’s Lange-Ionatamishvili worried about extremely realistic audio-video editing, as did Geir Hågen Karlsen, director of Strategic Communication and Psychological Operations at the Norwegian Defense University College. “In the future we will have to deal with Troll Factory 2.0: human trolls replaced by advanced bots and a few operatives,” and also “artificial intelligence, algorithms like natural language generation, manipulation of speech, imagery, and soon also video, higher speed, and most importantly, more sophisticated manipulation,” he said.    

In Estonia, even some technology experts advise against focusing too closely on technology. In the spring of 2017, Liisa Past, the chief research officer at the Cyber Security Branch of the Estonian Information System Authority, attended a meeting on i-Voting, Estonia’s remote voting system. The gathering focused on technology to protect the electoral process from cyber attacks. But Past felt her colleagues ignored non-technological, human vulnerabilities. “First I got scared,” she said, “then I got reasonably loud.” An attack on the vote-counting system itself was possible, she explained to her colleagues, but expensive. “Our adversary’s m.o. is to go for low-hanging fruit” such as political campaigns, she said. More realistic attacks would target the candidates and the parties, who lacked the technical expertise of the government, and the “news or information space layer,” sowing doubt and confusion among the electorate without needing to hack secure systems.

Finland made a concerted effort to educate its military officers, civilian officials, and journalists on the social science behind disinformation, in order to more thoroughly process influence campaigns and determine how to respond effectively. The government didn’t pursue purely technical solutions or quick fixes, but rather delved into the empirical evidence behind human vulnerability to disinformation. This allowed officials to confront the complexity of the problem and devise more thoughtful responses. This, too, is possible in America.

The Baltic experience can also inform what not to do. “First, [the West] should not submit to the temptation to shoot from the hip and engage in counter-propaganda,” Latvia’s Mellakauls said. Poland’s Hofmokl agreed: “We should be all afraid that we will end up in a dystopian world where propaganda is being fought by [more] propaganda.”

Unfortunately, many of the advantages that northeast European countries enjoy in the fight against disinformation may not exist in the United States. Their ability to implement whole-of-government solutions relatively easily is enviable, but difficult to replicate in a country as large and diverse as America. Similarly hard to replicate are European levels of institutional trust. “The greatest strength the Danes possess,” said Jesper Møller Sørensen, political director at the Danish Ministry of Foreign Affairs, “is the high amount of trust toward our government and institutions in general. … This makes it harder to sow distrust and polarize debates.”

Northeast European tactics such as intensive disinformation awareness, better dissemination of research, and media-literacy campaigns could work in America, especially at the state level ahead of the midterm elections, as could better coordination between government agencies and civil society. But I worry that Europeans’ institutional trust is unlikely to cross the Atlantic any time soon. Just as dangerous is America’s polarized media landscape, where the European model of heavily subsidized, objective news may never catch on.

Perhaps the most important potential import from northeast Europe is an awareness of what is at stake. In the long run, the contest of narratives is about core postwar values such as multilateralism, individual rights, and the rule of law. “If we are serious about defending Western values, now is the time,” instructed Lithuania’s Foreign Minister Linas Linkevičius. Disinformation is a “very efficient weapon to demotivate people, to create doubt in leadership … the vacuum is never empty, but is filled by populism or nationalism or radicalism … we are just starting to realize this is real.”
