Disinfo group Spamouflage more aggressively targeting U.S. elections, candidates

Graphika report finds the Chinese-linked group has been creating American personas online and spreading content designed to denigrate both parties and candidates. 
US Vice President and 2024 Democratic presidential candidate Kamala Harris speaks on the fourth and last day of the Democratic National Convention (DNC) at the United Center in Chicago, Illinois, on August 22, 2024. Harris was among multiple politicians denigrated by the Chinese-linked group Spamouflage. (Photo by Andrew Caballero-Reynolds/AFP via Getty Images)

A prolific disinformation group linked to the Chinese government has stepped up its efforts to impersonate Americans this year in an attempt to degrade and diminish U.S. politicians and institutions in the eyes of voters.

Spamouflage — also known as Dragonbridge, Taizi Flood and Empire Dragon — produces high volumes of spammy, inauthentic content online in an effort to influence political and public opinion in targeted countries. While the group has typically not focused on elections or candidates in the U.S., Graphika researchers say that since mid-2023, associated accounts have increasingly mimicked American voters and criticized politicians and candidates.

In a report released Tuesday, Graphika identified a cluster of accounts — 15 on X, one on TikTok, one on Instagram and one YouTube channel — using AI-generated profile pictures, patriotic imagery and American identities to pose as disaffected U.S. voters. Much of the content aimed to undermine U.S. politics by depicting it as corrupt. The accounts highlighted divisive topics like the war in Gaza, homelessness, gun control and racial inequality as examples of how the U.S. political system had failed, intending to discourage voter turnout.

“It shows that Chinese influence operations targeting the U.S. are evolving and they’re engaging in these more advanced, deceptive behaviors,” Jack Stubbs, Graphika’s chief intelligence officer, told CyberScoop. “And it shows that they are directly targeting these organic but really sensitive social rifts as part of their effort to disrupt and influence the way Americans talk about politics and domestic social issues ahead of an important election.”

In some cases, the accounts posted messages, imagery and videos that targeted specific candidates, such as Kamala Harris or Joe Biden. They also amplified anti-Democratic Party content shared by authentic conservative accounts.

While the content sometimes leaned into pro-Donald Trump or conservative-aligned messaging, the accounts also at times criticized the former president. According to the report, these accounts mostly “did not explicitly assume or reference a MAGA identity.” Graphika believes the goal of the campaign was “to exacerbate U.S. social divisions and portray the U.S. as a declining global power with weak leaders.”

That conclusion is in line with how U.S. intelligence officials have described Chinese-led information operations targeting the 2024 elections. In a July briefing with reporters, officials at the Foreign Malign Influence Center, a unit within the Office of the Director of National Intelligence responsible for notifying the public about foreign attempts to interfere in U.S. elections, said China was moving “cautiously” with regard to this U.S. election cycle, with leaders in Beijing believing that no matter which party or presidential candidate wins, the U.S. is likely to continue its policies confronting China on the world stage.

“Right now, our assessment is that China doesn’t perceive a benefit in supporting either candidate and either party,” the ODNI official said.

Intelligence officials also said they had observed signs that foreign governments were “getting better at hiding their hand” and masking their involvement in online influence operations, outsourcing the work to third-party commercial firms and finding new ways to launder their propaganda through American citizens.

That includes at least one instance where the People’s Republic of China has “collaborated with a China-based technology company to enhance the PRC’s covert influence operations, including to more efficiently create content that also connects with local audiences,” an ODNI official said.

Deep involvement, shallow impact

Like most Spamouflage campaigns, this one was largely ineffective at generating organic engagement from U.S. audiences on social media platforms.

It’s part of a larger pattern observed by experts and researchers: despite dedicating significant time and resources to its online operations over the past five years, Spamouflage has had little success breaking through with target audiences.

In addition to Graphika, Microsoft, Meta, Google, the Institute for Strategic Dialogue and other entities have reported extensively on Spamouflage operations targeting audiences in the U.S., Europe, Taiwan and other countries over the past year. Virtually all of them found that the group’s posts appear to get little organic engagement on social media.

Graphika noted this account cluster is an improvement over Spamouflage’s usual low-quality, high-volume output, yet its operators continue to struggle with language and cultural nuances. Posts often include awkward grammar and sentence structures, or overt references like “#American,” which likely attract more skepticism than organic engagement.

One notable exception appears to be a series of videos posted by “The Harlan Report,” a supposed pro-Trump, pro-MAGA account. It repurposed videos from legitimate news outlets and disparaged Democratic politicians including Biden, Harris and former House Speaker Nancy Pelosi, while overlaying the videos with custom graphics, branding and messaging. The approach appears to mimic the kind of short-form video content that is often used by online marketers and influencer accounts to create viral content for younger online audiences.

Operators behind The Harlan Report managed to find real engagement for these videos on X and TikTok, with some receiving over a million views.

Stubbs said that in addition to language barriers, Chinese operators behind Spamouflage accounts “have learned to do this in a completely closed information environment.” In many cases, their foundational experience navigating the internet and social media was in the highly restricted, tightly regulated confines of Chinese cyberspace. That experience may leave them ill-prepared when it comes to generating viral or influential content on U.S. and Western social media platforms.

“The way the Chinese internet works and is structured is completely different to the way that you and I and other folks in the West are having conversations online,” Stubbs said. “And so perhaps it’s not as much of a surprise that they struggle to navigate [an open] information environment where people can say, do and — rightfully — think what they want to think.”

The lack of measurable effect is a reminder that the public and media should not rush to overhype the impact of groups like Spamouflage. Even prominent disinformation researchers have expressed concerns about disinformation campaigns potentially being more impactful after they’re exposed and receive widespread coverage than they were as a clandestine operation.

But in a political environment where some lawmakers, pundits and activists have questioned the legitimacy of disinformation research, while others have wildly exaggerated its impact, research on groups like Spamouflage “informs the public conversation about what is and isn’t happening in terms of foreign influence efforts tied to the U.S. election,” Stubbs said.

“The fear of foreign influence efforts is significant and the threat is real — it’s a legitimate risk,” he added. “But we need to make sure that those conversations are rooted in what is happening and what we can prove and analyze objectively, and not risk straying into speculation about what could be happening or might be happening.” 

Written by Derek B. Johnson

Derek B. Johnson is a reporter at CyberScoop, where his beat includes cybersecurity, elections and the federal government. Prior to that, he has provided award-winning coverage of cybersecurity news across the public and private sectors for various publications since 2017. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.