U.S. veterans have for years been targeted by overseas actors aiming to sway political opinion, extract information, and, sometimes, run petty scams, experts from the veterans and tech communities told lawmakers on Wednesday.
Those actors — from Russia, Ukraine, Bulgaria, Macedonia, and elsewhere — often gain access to Facebook groups and other online forums by impersonating veterans they find online, Kristofer Goldsmith, chief investigator and associate director of policy and government affairs at Vietnam Veterans of America, told the House Committee on Veterans' Affairs.
“Since beginning our investigation, we’ve found and exposed election interference related to the 2020 presidential race by these foreign entities,” Goldsmith said. “These criminals frequently steal veterans’ deployment photos and use them to create online social media profiles. They then use those imposter profiles to enter online groups which are made for grieving Gold Star families.”
Goldsmith cited a Facebook group called "Vets for Trump," run by a pair of Macedonian conmen currently under FBI investigation for ties to the Russian government. When the Macedonians seized control of the page in August, it had about 110,000 followers; "while publishing vile racist, xenophobic, and islamophobic content, [they] increased their following to around 131,000 followers. In this time, they posted disinformation regarding voter eligibility, attacked Democratic presidential candidates, and promoted the candidacy of President Donald Trump," he said.
Vlad Barash, science director at Graphika, a network analysis startup that looks at disinformation, testified, “These operations are surgically precise, targeting influential people and organizations in the veteran community. Veterans-focused publications have unwittingly published articles authored by false personas created by foreign intelligence services.”
Barash said several foreign governments had been targeting the U.S. veterans community: most prominently Russia and Iran, but also China and Saudi Arabia, the latter a longtime U.S. security partner.
“These operations show no signs of stopping,” he said, citing still-rising activity by Russia’s Internet Research Agency since 2016.
Representatives from Facebook and Twitter testified about recent efforts to crack down on scammers who target veterans. Kevin Kane, public policy manager at Twitter, said his company intends to solicit public feedback as it builds a policy to address synthetic and manipulated media. “We believe that we need to consider how synthetic media is shared on Twitter in potentially damaging contexts. We also want to listen and consider a variety of perspectives in our policy development process, and we want to be transparent about our approach and values,” Kane said.
Nathaniel Gleicher, head of security policy at Facebook, said his company is trying to train software to better detect the tricks imposters use to create fake accounts based on real people. But those efforts remain limited for now. "If, during this process, we detect that an account may be impersonating such an individual, we flag it for human review," Gleicher said. "We are still testing these processes, but they have helped us more quickly detect the creation of impostor accounts and remove them shortly after their creation, often before people even see them."