After a sleepy primary season, Russia enters 2024 U.S. election fray

Russian influence operations have picked up steam in the past two months, according to a Microsoft report.

The street artist Andrea Villa installs an anti-war poster depicting Israeli Prime Minister Benjamin Netanyahu, Iranian President Ebrahim Raisi, Russian President Vladimir Putin and U.S. President Joe Biden on April 16, 2024 in Turin, Italy. (Photo by Stefano Guidi/Getty Images)

Russian influence operations targeting the 2024 U.S. elections have ramped up in the past 45 days, using Telegram as a primary distribution channel to spread propaganda to influence debate over Ukraine policy, according to new research from Microsoft’s Threat Analysis Center.

The rise in observed activity represents a late start for Moscow compared to its efforts in 2016 and 2020, something Microsoft attributed to an uncompetitive presidential primary season that saw Donald Trump and Joe Biden cruise to their respective nominations with minimal resistance. Unlike in 2016 and 2020, when one or both parties were embroiled in contentious intraparty primaries, this cycle presented little motivation and fewer opportunities for foreign nations to move the needle with meddling.

That dynamic has changed as the race shifts to the general election. Microsoft has tracked multiple groups targeting the U.S. elections, as well as 70 different Russian-associated “activity sets” worldwide pushing content and messaging in English, Spanish, French, Arabic, Finnish and other languages. That messaging is designed to degrade international support for Ukraine, portray President Volodymyr Zelenskyy as the head of a corrupt state and diminish the appetite of Western governments to further fund Ukraine’s war effort.

“Oftentimes when we’re watching the activity that comes out of these actor sets, some people characterize it as they want to make chaos in the U.S. or they want to create problems for democracies. … With their messaging in regards to election 2024, it is absolutely about Ukraine policy,” said Clint Watts, general manager of Microsoft’s Threat Analysis Center.

Much of the content, including fake videos, news articles and explosive claims from sources identifying themselves as whistleblowers or citizen journalists, is initially seeded on Telegram, which Watts said has become a primary distribution channel for Russian propaganda efforts since the start of the Ukraine invasion.

Two groups in particular, tracked by Microsoft as Storm-1516 and Storm-1099, have relied on this approach, posting anti-Ukraine content to purpose-built Telegram channels; that content is then picked up by seemingly unaffiliated news outlets and websites with names like “DC Weekly” and “Miami Chronicle,” which pose as local sources but are actually Russian cut-outs.

Storm-1099, otherwise known as Doppelganger, uses outlets that specifically target the United States, with names like “Election Watch” and “50 States of Lie.” These sites play up domestic divisions in U.S. society and politics, warning that “American elections have long since lost their democratic character” and that the nation faces an “unprecedented number of rebellions that could split the country in two.”

Watts said these Telegram channels function as “a bridge by which [content] gets pushed into and reposted and amplified in social media, such that it moves from one social media platform to another.”

That’s a marked change from eight years ago, when platforms like Facebook and Twitter had built up mass general audiences that allowed for direct targeting in influence campaigns. Today, audiences are far more fragmented across different social media services, leading Russian groups to use Telegram as a staging ground for content that can then be micro-targeted to different audiences on different platforms.

Microsoft’s assessment that Russian operatives are laser-focused on influencing Ukraine policy is backed up by several other sources, including Mandiant, which found a Russian hacking group targeting political parties in Germany in an effort to gain insights into policymaking on Ukraine. Rob Joyce, the former director of cybersecurity at NSA, told reporters in March that “Russia is very motivated to make sure that the focus on support to Ukraine is disrupted.”

Russia has been by far the most prolific actor in the election interference space this cycle, with Microsoft also tracking activity from China, Iran and other countries, but not at nearly the same cadence or intensity. 

Moscow’s disinformation operations continue to leverage both online and offline methods to spread damaging narratives. One campaign, collectively known as the NABU leaks, was carried out by Andrii Derkach, a former Ukrainian member of parliament, in the lead-up to the 2020 elections. Those efforts were meant to discredit Ukraine’s National Anti-Corruption Bureau (NABU) and spread rumors that current and former U.S. officials had engaged in corruption, money laundering and political influence-peddling in Ukraine.

Derkach, who was sanctioned by the U.S. Treasury Department for the NABU leaks campaign, indicted in 2022 for efforts to covertly influence the 2020 election and stripped of his Ukrainian citizenship in 2023, had gone quiet since the start of the Ukraine invasion. 

However, he reemerged in January in an interview with a Belarusian media personality, reviving claims from the NABU leaks and seeking to implicate Biden in Ukrainian corruption at the same time that House Republicans were pursuing an impeachment inquiry against the president based on similar corruption claims.

A key witness in that inquiry, an American-Israeli citizen named Alexander Smirnov who had previously served as a confidential human source for the FBI, was indicted in February on charges of lying to the FBI about contacts between the Biden family and Ukrainian energy company Burisma. According to the indictment, Smirnov said in interviews with the FBI that the new information was gleaned from conversations with high-level Russian government officials.

There doesn’t yet appear to be solid evidence of Russian activity setting up the kind of hack-and-leak campaign that upended the 2016 U.S. presidential race. But Watts said in order to properly prepare such a campaign, Russian hackers “need to be hitting targets in the next 60 days” to leave enough time to leak content ahead of November.

Microsoft’s report was published on the same day the Cybersecurity and Infrastructure Security Agency, the Office of the Director of National Intelligence and the FBI released joint guidance on foreign malign influence operations targeting election infrastructure.

The guidance reinforced some aspects of the Microsoft report, such as identifying Russia, China and Iran as the most active players targeting U.S. elections and highlighting potential tactics, like voice cloning and the use of proxy media to deceive target audiences. Many of these same tactics, the agencies note, can also be leveraged by domestic disinformation actors.

The agencies’ guidance for mitigating such threats largely echoes what some state and local officials have told CyberScoop they’re already doing, including training election officials on the threats and capabilities of AI, embedding digital watermarking tools in official documents, using safe words to authenticate voice conversations and holding transparency sessions with the public and media on voting technology and administration.

AI becomes another tool in the influence toolbox

Both Russia and China have been observed leveraging AI-generated media in their influence campaigns over the past year. Most notably, a Chinese group known as Spamouflage used the tools to pump out a steady stream of AI-generated memes and deepfake audio and video targeting different candidates and parties in the lead-up to elections in Taiwan.

However, while there is evidence that foreign influence groups continue to experiment with incorporating the technology into their campaigns, so far the fear that fully generated deepfake videos will cause mass deception among voters “has not borne out,” according to Microsoft’s findings.

In many cases, “audiences gravitate toward and share disinformation” that “involve[s] simple digital forgeries consistent with what influence actors over the last decade have regularly employed,” the report stated, and such cheapfake content still regularly outpaces fully synthetic, AI-generated video in terms of views and shares.

One area that does show promise is voice cloning, either to generate fake audio phone calls and messages or to overlay onto authentic video footage. The tactic has already surfaced in the U.S.: in January, a Democratic operative tied to the Dean Phillips presidential campaign used deepfake technology to impersonate Biden and target his supporters with messages urging them to stay away from the polls in the New Hampshire primary. Similar incidents have been observed in Slovakia and Taiwan.

Not surprisingly, Microsoft’s research has found that the more familiar a person is to the general public, the less effective deepfakes of that person tend to be. One scenario where such audio could be particularly effective is in personal or private settings, such as a phone call or direct message, where the target is isolated and more vulnerable to deception.

That echoes what several state election officials and election security experts have told CyberScoop in previous interviews.

Updated April 17, 2024: This article has been updated to include a description of newly released U.S. government guidance on how to secure election infrastructure.

Corrected April 17, 2024: An earlier version of this article reported that Alexander Smirnov was indicted on charges of being a Russian agent, when he was in fact indicted for lying to the FBI.

Written by Derek B. Johnson

Derek B. Johnson is a reporter at CyberScoop, where his beat includes cybersecurity, elections and the federal government. He has provided award-winning coverage of cybersecurity news across the public and private sectors for various publications since 2017. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.
