Telecom behind AI-powered Biden robocall agrees to $1 million FCC fine 

As part of the settlement, Lingo Telecom must also implement stronger procedures to combat illegal spoofing and robocalling over its network. 
A view of the commission's hearing room before a hearing at the Federal Communications Commission in 2017. (BRENDAN SMIALOWSKI/AFP via Getty Images)

The Texas-based voice service provider that sent AI-generated robocalls of President Joe Biden to New Hampshire voters ahead of its Democratic presidential primary has agreed to pay a $1 million fine and implement enhanced verification protocols designed to prevent robocalls and phone number spoofing in a settlement with the Federal Communications Commission.

The fine represents half the amount the FCC was originally seeking in an enforcement action proposed against Lingo Telecom in May. Despite that, agency leaders characterized the settlement as a successful effort to defend U.S. telecommunications networks and election infrastructure from nascent AI and deepfake technologies.

New Hampshire Attorney General John Formella called the settlement “a major victory for the integrity of elections.”

“For New Hampshire voters who faced these misleading calls, this action is crucial in restoring trust and confidence in the electoral process,” Formella said in a statement.

Democratic political consultant Steve Kramer, who was being paid by Dean Phillips’ presidential campaign at the time, masterminded the scheme. Kramer hired Life Corporation, a shadowy Texas-based political marketing firm, which used Lingo Telecom to transmit the calls to voters. According to the consent decree, Lingo told the FCC in May, in response to a subpoena, that it transmitted 3,978 calls to New Hampshire voters on Jan. 21, 2024. In a separate section of the decree, authorities said approximately 9,581 calls were sent to New Hampshire voters in the incident.

All those calls were signed with “A-level attestations” — a designation that indicates the signing provider has a direct authenticated relationship with the customer, and the customer has the right to use the telephone number in the caller ID field. The company also claimed that Life Corporation “provided Lingo Telecom with a certification that Life Corporation would identify its customers and had verified that the telephone numbers used for all calls were associated with the customers.”

“Whether at the hands of domestic operatives seeking political advantage or sophisticated foreign adversaries conducting malign influence or election interference activities, the potential combination of the misuse of generative AI voice-cloning technology and caller ID spoofing over the U.S. communications network presents a significant threat,” Loyaan A. Egal, the FCC’s enforcement bureau chief, said in a statement. “This settlement sends a strong message that communications service providers are the first line of defense against these threats and will be held accountable to ensure they do their part to protect the American public.” 

In addition to the fine, the settlement requires Lingo Telecom to follow regulatory protocols that were put in place in 2020 to ensure telecommunications carriers authenticate the identities of callers using their networks.

The protocols, known as STIR/SHAKEN, require carriers like Lingo to digitally verify and formally attest to the FCC that callers are legitimate and own the phone number they display on caller ID. In the New Hampshire robocall case, Kramer and Life Corporation spoofed the phone number of Kathy Sullivan, a former state Democratic party official who was running a write-in campaign for Biden.
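Under the hood, a STIR/SHAKEN attestation is a signed JSON token, called a PASSporT, that the originating carrier attaches to the call. As a rough illustration only (the phone numbers, identifiers and URL below are invented, and the cryptographic signing step is omitted), here is a minimal sketch of assembling the claims a carrier would sign for an A-level attestation:

```python
import base64
import json
import time

def b64url(data: bytes) -> str:
    # PASSporTs use unpadded base64url encoding (RFC 8225)
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def build_shaken_passport(orig_tn: str, dest_tn: str, attest: str,
                          origid: str, cert_url: str) -> str:
    # JOSE header: ES256 signature algorithm, the "shaken" PASSporT
    # extension, and a URL (x5u) to the carrier's STI certificate
    header = {"alg": "ES256", "ppt": "shaken", "typ": "passport",
              "x5u": cert_url}
    # SHAKEN claims (RFC 8588): attestation level "A", "B" or "C",
    # origination/destination numbers, issue time, and an origid that
    # lets investigators trace a call back to the signing carrier
    payload = {
        "attest": attest,
        "dest": {"tn": [dest_tn]},
        "iat": int(time.time()),
        "orig": {"tn": orig_tn},
        "origid": origid,
    }
    signing_input = (b64url(json.dumps(header, separators=(",", ":")).encode())
                     + "."
                     + b64url(json.dumps(payload, separators=(",", ":")).encode()))
    # A real carrier would append an ES256 signature computed with the
    # private key for its STI certificate; signing is omitted here.
    return signing_input
```

The "attest" claim is the crux of the Lingo case: by signing the Life Corporation calls with "A", Lingo asserted it had verified both the customer's identity and its right to use the spoofed number.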

The FCC cited Lingo’s inability to properly implement and enforce STIR/SHAKEN as a key failure in a February cease-and-desist letter, and again in May when the agency proposed a $2 million enforcement action. The company was also named in a civil lawsuit filed by the League of Women Voters and New Hampshire residents, seeking damages over the incident.

Per the terms of the settlement, Lingo Telecom must hire a senior manager knowledgeable in STIR/SHAKEN protocols and develop a compliance plan, new operating procedures and training programs. It must also report any incidents of non-compliance with STIR/SHAKEN within 15 days of discovery.

Lingo Telecom did not return requests for comment.

“Every one of us deserves to know that the voice on the line is exactly who they claim to be,” FCC Chairwoman Jessica Rosenworcel said in a statement. “If AI is being used, that should be made clear to any consumer, citizen, and voter who encounters it. The FCC will act when trust in our communications networks is on the line.”

The New Hampshire incident exposed gaps in reporting and enforcing STIR/SHAKEN requirements, with some FCC officials blaming the agency for muddled compliance guidelines.

In May, FCC Commissioner Nathan Simington wrote in a letter that “this matter touches on the hot-button issue of deepfakes in elections, but fundamentally it’s about FCC oversight of STIR/SHAKEN.”

While Lingo Telecom failed to take adequate steps to verify the identity of the New Hampshire robocallers, Simington said the FCC has never specified formal standards for how carriers must comply with STIR/SHAKEN. He urged the FCC to implement formal rulemaking to provide telecoms with specific mandates on how to implement the standards and other know-your-customer rules.

“The problem for our action today is that Lingo probably complied with industry standards. We might deplore the laxity of these standards, but Lingo might well respond that they were in line with actions that had been repeatedly blessed by the FCC,” Simington wrote. “And today, by using an enforcement mechanism to declare new standards (however vague), we are engaged in a back-door rulemaking through enforcement.”

Kramer, meanwhile, has claimed that he spearheaded the creation of the New Hampshire audio and robocalling scheme to shine a light on the dangers of nascent AI and deepfake technologies. However, he did not publicly acknowledge his role until it was revealed in an NBC News report in February.

In April, Kramer told CyberScoop that he was cooperating with the FCC and New Hampshire authorities “to not only satisfy a subpoena but in the future help them to prevent the kind of artificial intelligence that I’ve tried to prevent” — while claiming his actions had led to national awareness, legislative action and regulatory reform around deepfakes.

A month later, New Hampshire authorities charged him with 13 felony and 13 misdemeanor counts of impersonating a political candidate.

Written by Derek B. Johnson

Derek B. Johnson is a reporter at CyberScoop, where his beat includes cybersecurity, elections and the federal government. Prior to that, he has provided award-winning coverage of cybersecurity news across the public and private sectors for various publications since 2017. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.