The Federal Trade Commission proposed on Wednesday that Facebook be prohibited from profiting off of data it collects from minors, a move that comes in response to alleged violations of the company’s previous agreements with the agency to protect user privacy.
The proposed restrictions would come as part of an update to Facebook’s 2020 agreement with the agency and, once finalized, would be the third time that Facebook, now known as Meta, has been subject to privacy-related enforcement action by the agency.
“Facebook has repeatedly violated its privacy promises,” Samuel Levine, director of the FTC’s Bureau of Consumer Protection, said in a statement. “The company’s recklessness has put young users at risk, and Facebook needs to answer for its failures.”
As part of the proposed changes, Meta would be prohibited from profiting from data it collects from users under the age of 18. It would also face other expanded restrictions, including limits on its use of facial recognition technology, and would be required to provide additional protections for users. These requirements would apply to all Meta products, including WhatsApp and its virtual reality platform Horizon Worlds.
The updated order would also stipulate that Meta would need written confirmation from the FTC’s independent assessor “that its privacy program is in full compliance with the order’s requirements and presents no material gaps or weaknesses,” before launching new products.
Meta spokesman Andy Stone called the proposal a “political stunt,” in a statement on Twitter.
“Let’s be clear about what the FTC is trying to do: usurp the authority of Congress to set industry-wide standards and instead single out one American company while allowing Chinese companies, like TikTok, to operate without constraint on American soil,” he wrote.
(TikTok is also under an FTC consent decree reached in 2019.)
The FTC alleges that Facebook violated two previous orders by continuing to give developers access to users’ private information until mid-2020, despite promising in 2018 to cease access if users had not used the app in the past 90 days.
The FTC also alleges that from 2017 through mid-2019 Facebook misled parents about controls for its Messenger Kids app and under certain circumstances allowed children to communicate with individuals outside of their contacts despite telling parents otherwise. The FTC says that in addition to violating the 2012 order, the misrepresentation violates the COPPA rule, which requires online services used by children under 13 to get parental consent.
A Meta spokesperson noted that the independent assessment cited by the FTC in its order acknowledged the “high level of access and cooperation that we provided throughout the assessment process” and “called out our extensive investments in privacy compliance.”
The FTC previously filed complaints against Facebook in 2011 for misrepresenting its privacy practices and again in 2019 for violating that order after user data was improperly shared with Cambridge Analytica. The FTC under current chair Lina Khan also brought an antitrust case against Facebook, which was later thrown out in court. Khan’s particularly aggressive approach to Big Tech has attracted scorn from Republican lawmakers who say the agency has overstepped its authority.
Experts note that the agency’s decision to go directly after how Meta monetizes its data is an uncommon approach for a privacy-related order.
“It’s unique outside of the context of fraud,” said Cobun Zweifel-Keegan, managing director of the International Association of Privacy Professionals. Zweifel-Keegan also called the FTC’s proposal to prevent Meta from launching any new products until its privacy program is deemed fully compliant unprecedented.
Groups that have raised concerns about how Meta uses children’s data praised the action.
“The action taken by the Federal Trade Commission against Meta is long overdue. For years, Meta has flouted the law and exploited millions of children and teens in their efforts to maximize profits, with little care as to the harms faced by young users on their platforms,” said Josh Golin, executive director at Fairplay. “The FTC has rightly recognized Meta simply cannot be trusted with young people’s sensitive data and proposed a remedy in line with Meta’s long history of abuse of children.”
Top technology watchdogs in Congress applauded the move, noting the need for legislative action.
“I’m grateful to the FTC for their diligent work to safeguard the privacy of Americans’ data,” Rep. Yvette Clarke, D-N.Y., wrote in a statement to CyberScoop. “All social media companies have a duty to protect their users’ data, and the continued privacy violations we see from platforms today only intensify the need for comprehensive data privacy legislation.”
Sens. Edward Markey, D-Mass., and Bill Cassidy, R-La., who on Wednesday reintroduced their legislation to update COPPA, also weighed in.
“Today, the FTC has affirmed what we’ve been saying for years: Meta has already violated the law, and now it’s failing to comply with the terms of its privacy probation,” they wrote. “Clearly, though, Congress must also act if we are to put an end to these egregious privacy violations and protect young people from the invasive and damaging Big Tech business model.”
Meta will have an opportunity to respond to the claims before the update to the 2020 order is finalized.
The FTC voted 3-0 to issue the proposal. However, Democratic Commissioner Alvaro Bedoya expressed concerns that there “are limits to the Commission’s order modification authority.”
“Here, the relevant question is not what I would support as a matter of policy,” Bedoya wrote in a statement. “Rather, when the Commission determines how to modify an order, it must identify a nexus between the original order, the intervening violations, and the modified order.” Bedoya wrote he had concerns about “whether such a nexus exists” for prohibiting Facebook from profiting off of children’s data.