How recent disinformation campaigns tied to Russia, Pakistan blended fake engagement with real life
Influence operations aren’t just about spreading fake news.
Foreign governments and corporate public relations firms are also using inauthentic social media behavior to boost attention around real-world events that align with their policy goals, a panel of experts said Tuesday at CyberTalks, a summit presented by CyberScoop. The propaganda campaigns are increasingly layered, and several recent examples have relied on contract workers who may not have realized they were part of an astroturfing effort.
In May, Facebook removed 30 pages, six groups, 83 accounts and 49 Instagram profiles that were linked to Yevgeny Prigozhin, a Russian oligarch who had distributed food baskets to impoverished communities in Sudan. The amplification of pro-Russia content appeared designed to improve the population's impression of Prigozhin, and thus the Kremlin, at a time when Russia is trying to keep its warships stationed at Port Sudan on the Red Sea, according to the Atlantic Council's Digital Forensic Research Lab.
U.S. officials previously indicted Prigozhin in connection with the Internet Research Agency, Russia’s so-called internet troll factory that sought to influence public opinion before the 2016 presidential election.
“It shows exactly how influence operations can be used in addition to other priorities or in addition to other tactics,” said Graham Brookie, director of the DFRLab.
In another case, the network analysis firm Graphika examined a Facebook network apparently propped up by a Pakistani public relations firm. Accounts that took part in that effort masqueraded as independent media outlets, relying upon paid actors and freelance reporters to enhance their credibility.
“We see these disinformation actors turning to freelance recruiting platforms … to hire people to participate in these campaigns,” said Camille Francois, chief innovation officer at Graphika. “Sometimes you have unwitting freelancers who have no idea what they’re participating in. And sometimes you have freelancers who either know the context of what they’re doing, or are being asked to do things that are bizarre enough that they should ask.”
Measuring the impact of such ephemeral social media operations remains challenging.
Some efforts focus on influencing the behavior of a single person or a small, specific group. Others, such as the Russian effort to suppress U.S. voter turnout in 2016, are far more pervasive. It's possible to measure how many social media users might have been exposed to a campaign, for instance, but whether any individual's mindset actually changed is much harder to quantify.
“Whoever has an answer for that is either overconfident or they’re lying. There isn’t an answer for that,” Brookie said. “Having a degree of saying, ‘Okay here’s a range of the things we can figure out and here are the things we can’t figure out’ is something that any credible research organization or person in this space … has to be upfront about. That’s to say, ‘Here are the limitations of what we know, especially about online activity.’”