Senate Intel chair warns confluence of factors make election threats worse

Sen. Mark Warner said influence operations are easy and cheap, and their social media audience is more willing to believe them.
U.S. Sen. Mark Warner (D-VA) leaves the U.S. Capitol on July 11. (Photo by Tierney L. Cross/Getty Images)

Misinformation and disinformation threats are being exacerbated this election season by artificial intelligence, legal battles, the continued low cost of influence operations and Americans’ increased willingness to believe outlandish things, Senate Intelligence Chairman Mark Warner, D-Va., said Thursday.

Speaking at the Ronald Reagan Presidential Foundation and Institute, Warner said the battle against such election threats had improved in some ways, pointing to ever-improving coordination among key federal agency leaders and to less interference than expected in other countries’ elections, such as France’s or the European parliamentary elections.

In one respect, however, the threat has worsened since the 2016 election cycle, when Russian influence operations proliferated, Warner said.

“Oftentimes, the Russians had to plant the false implication and then elevate it,” he said of the 2016 race. “Now they can simply elevate or promote” pre-existing false narratives. The reason, Warner said, is that “Americans believe a lot more crazy stuff” simply because they saw it on the internet.

There’s also a low barrier for anyone who wants to meddle in elections, Warner said. “Foreign adversaries know disinformation and misinformation is cheap, and it works,” he said.

Furthermore, artificial intelligence has enhanced the scale and speed with which adversaries can spread false narratives, Warner said.

Then there have been legal hitches. The U.S. Supreme Court ultimately rejected a challenge brought by GOP attorneys general and social media users accusing government agencies — including the FBI and Cybersecurity and Infrastructure Security Agency — of censoring conservatives on social media platforms in the name of combating disinformation and misinformation.

But Warner said there was a “seven-to-eight-month chilling effect” while the case worked its way through the courts, during which agencies were either barred from communicating with social media companies about mis- and disinformation or had halted those communications on their own.

The Justice Department’s Inspector General nonetheless recommended this week that the agency should develop a way to inform the public about its procedures for notifying social media companies about foreign influence campaigns in a manner that doesn’t compromise First Amendment rights.

Speaking at the same event, Rep. Brad Wenstrup, R-Ohio, said he didn’t expect election threats to change significantly in the roughly 100 days before Nov. 5. “I think some of them we’re already seeing, we just might see it accelerate,” predicted Wenstrup, who chairs the House Intelligence Oversight and Investigations Subcommittee.

But one thing that could shift the kind of mis- and disinformation abuses between now and the election is the emergence of Vice President Kamala Harris as the presumptive Democratic nominee, said Kat Duffy, a senior fellow for digital and cyberspace policy at the Council on Foreign Relations.

“I fully expect that we’re going to see just an absolutely extraordinary escalation of attacks on her,” said Duffy, citing the fact that Harris is a woman with Black and South Asian heritage. “She is like a trifecta for threats.”

U.S. intelligence officials recently said that Russia is the most active adversary in this election and that the Kremlin once again prefers former President Donald Trump in this race.

And at least one official at the world’s biggest social media platform maintained that users have grown savvier, not more gullible, over time.

“I think the key difference between now and 2016 is that people are a bit more skeptical, at least when it comes to the types of activities [influence campaign operators] were doing in 2016, to say, ‘Hey, this isn’t real,’” said Lindsay Hundley, who works on global threat disruption at Meta, owner of Facebook and Instagram.