
Here’s how the FTC plans to enforce the Take It Down Act

The commission promises hefty fines and investigations for Take It Down Act violators. Experts say questions remain about the agency's resources and priorities.
This week, FTC Chair Andrew Ferguson sent letters to private-sector companies detailing how the commission intends to police compliance once enforcement begins. (Getty Images)

The Federal Trade Commission is set to begin enforcing a key provision of the Take It Down Act on May 19, requiring websites and online services to remove nonconsensual deepfake media within 48 hours of a victim's notice or risk fines and an FTC investigation.

The law, passed by Congress last year, allowed law enforcement to immediately prosecute individuals who create and post such content online. But platforms and websites that host the material were given a yearlong runway to build out their reporting and takedown systems. Under the enforcement regime now taking effect, businesses that fail to remove flagged media within the 48-hour notification window could face fines and an investigation from the FTC.

This week, FTC Chair Andrew Ferguson sent letters to private-sector companies detailing how the commission intends to police compliance once enforcement begins. The FTC set a maximum civil penalty of $53,088 per violation for companies that don't take down content as required, and Ferguson's letter outlines other requirements, including that companies make it easy and convenient for users to submit takedown requests.

“We stand ready to monitor compliance, investigate violations, and enforce the Take It Down Act,” Ferguson said in a statement. “Protecting the vulnerable—especially children—from this harmful abuse is a top priority for this agency and this administration.”


Ferguson’s letter sheds new light on how the FTC will enforce content takedowns under the law. Both nonconsensual intimate imagery posted online using real photos of other individuals and AI-generated or modified “digital forgeries” would be considered violations.

Companies must also make it easy for victims without accounts to report potential violations, detail their reporting and removal program on their website “in plain language,” and provide “clear and conspicuous” notice to users about how to request removals.

According to the FTC, the law covers websites, apps, social media, image or video sharing services and gaming platforms. Ferguson’s letters were addressed to a who’s who of tech and social media companies, including Amazon, Alphabet, Apple, Automattic, Bumble, Discord, Match Group, Meta, Microsoft, Pinterest, Reddit, SmugMug, Snapchat, TikTok and X.

Earlier this year, Grok, the AI service that X users have access to, was used to flood the social media site with nonconsensual, sexualized deepfakes of real people. Elon Musk, X’s owner, initially brushed off critics but has since been hit with multiple criminal and civil investigations stemming from the incident, as well as lawsuits and calls from some world leaders to ban the app entirely.

The FTC is also recommending that companies implement hashing technologies “to prevent the reappearance of intimate content you already removed from your platform” and share their findings with nonprofits like the National Center for Missing and Exploited Children and StopNCII.org to help track the content across other parts of the internet.
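The hash-matching approach the FTC describes can be illustrated with a minimal sketch. Note that production systems such as StopNCII rely on perceptual hashes that survive re-encoding and resizing; the cryptographic-hash blocklist below (a hypothetical `HashBlocklist` helper, not any platform's actual implementation) shows only the basic idea of blocking exact re-uploads of previously removed files.

```python
import hashlib


class HashBlocklist:
    """Illustrative blocklist of hashes for previously removed media.

    A real platform would use a perceptual hash (e.g., PDQ) so that
    re-encoded or resized copies still match; SHA-256 only catches
    byte-identical re-uploads.
    """

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    @staticmethod
    def _digest(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def record_removed(self, media: bytes) -> None:
        # Called when content is taken down after a valid notice.
        self._hashes.add(self._digest(media))

    def is_blocked(self, upload: bytes) -> bool:
        # Checked at upload time to prevent reappearance.
        return self._digest(upload) in self._hashes


blocklist = HashBlocklist()
blocklist.record_removed(b"<removed image bytes>")
print(blocklist.is_blocked(b"<removed image bytes>"))   # exact copy -> True
print(blocklist.is_blocked(b"<different image bytes>"))  # new file -> False
```

Sharing such hash sets with clearinghouses like StopNCII.org is what lets one platform's takedown propagate to others without ever transmitting the underlying imagery.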


Becca Branum, director of the Free Expression Project at the Center for Democracy and Technology, told CyberScoop that some elements of the FTC’s approach – like requiring clear and simple reporting options for victims – align with best practices established by civil society groups.

But she also said the FTC’s role under the Take It Down Act is materially different from anything the commission has done before. The sheer scale of enforcement and monitoring will require human and technical resources on par with those of major social media companies.

“I’m very concerned about the FTC and its ability to fairly enforce this law,” said Branum. “They are now in the business of regulating content moderation. That is hard work and not something they’re used to doing.”

Some legal and privacy experts pointed to the large financial penalties set by the FTC as a sign that policymakers are looking to put real teeth behind enforcement. Those penalties could pile up quickly if a business is hosting or publishing multiple copies of the same flagged media and declines to remove it within two days.
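A rough back-of-the-envelope calculation shows how quickly those penalties compound. It assumes, for illustration only, that each hosted copy left up past the 48-hour window counts as a separate violation at the statutory maximum; how the FTC would actually count violations is not specified here.

```python
# FTC's stated maximum civil penalty per violation, in dollars.
MAX_PENALTY_PER_VIOLATION = 53_088

# Assumption for illustration: each uncured copy of the flagged
# media is treated as its own violation.
for copies in (1, 10, 100):
    exposure = copies * MAX_PENALTY_PER_VIOLATION
    print(f"{copies:>3} copies -> up to ${exposure:,}")
```

Even at 100 copies, potential exposure exceeds $5 million, which helps explain why attorneys expect platforms to err heavily toward removal.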

“For covered platforms, compliance with the Act is critical given the FTC’s emphasis on enforcement – reflecting White House priorities – and potential civil penalties up to $53,088 per violation,” wrote privacy attorneys Duane Pozza and Ian Barlow.


But Branum said the hefty fines also emphasize “just how much incentive will be in place for platforms to take anything that comes down the complaint line.”

While the Take It Down Act is designed to force companies to investigate claims and remove violating content, the regulatory and financial incentives push them to simply remove almost all reported content by default. That approach, which many of the same tech companies have taken under laws like the Digital Millennium Copyright Act, can be exploited by bad-faith actors seeking to shut down legal speech or content online.

“If you think there’s any given post [where] if you ask an attorney is it worth $53,000 for me to keep this post up, the answer is always going to be take it down,” Branum said. “I can’t imagine any service wanting to risk that type of fine on edge cases or anything they can’t verify or account for within 48 hours.”


Written by Derek B. Johnson

Derek B. Johnson is a reporter at CyberScoop, where his beat includes cybersecurity, elections and the federal government. Prior to that, he has provided award-winning coverage of cybersecurity news across the public and private sectors for various publications since 2017. Derek has a bachelor’s degree in print journalism from Hofstra University in New York and a master’s degree in public policy from George Mason University in Virginia.
