State legislators aren’t waiting for Congress to regulate children’s online privacy
After a year of stalled efforts in Congress to pass expanded children’s privacy legislation, states are plowing ahead with their own efforts to address growing concerns about how tech companies collect and use children’s data.
At least five states, among them New Jersey, Oregon, Texas, Virginia and West Virginia, are considering children’s privacy legislation, while several others are set to reintroduce bills that expired last session.
“I think kids’ and teens’ privacy is not an issue that is going away. Legislators know that there are specific risks that the demographic faces,” said Cobun Zweifel-Keegan, managing director of the International Association of Privacy Professionals.
While experts say the legislative trend should put tech companies on notice, they also warn that the momentum could be weakened if states fail to address concerns about how tech companies should comply with these laws and how they will be enforced.
Oregon’s and New Jersey’s bills both follow California’s lead in passing regulations that require companies whose services may reasonably be accessed by anyone under 18 to consider the “best interests of children” in designing their products.
The California law, known as the “Age-Appropriate Design Code Act,” garnered significant attention as the first legislation of its kind to require companies to consider the best interests of users under 18. Children’s safety groups, including Common Sense Media, applauded the California effort.
Not all of that attention has been positive, however. Civil liberties groups such as the Electronic Frontier Foundation raised concerns that the law is overly vague in defining “the best interest of the children.” Last month, NetChoice, an industry group whose members include Meta, TikTok and Google, among others, sued to block the law, arguing it violates the First Amendment.
At the federal level, the Children’s Online Privacy Protection Act (COPPA) requires operators of websites directed at children under 13 to obtain parental consent before collecting those children’s data.
The California Age-Appropriate Design Code Act and its emulators take a more expansive approach to which businesses are covered and extend protections to all users under 18. Moreover, where COPPA requires that a company have actual knowledge that it is collecting data from users under 13 before a violation can be brought, California’s law follows what is known as a “constructive knowledge” standard: a company can be found in violation if it knew or should have known that an offense occurred.

That means a broader range of companies will be subject to California’s law.
To meet these requirements, California’s law gives businesses a choice: estimate users’ ages, or apply the same privacy protections it requires for children to all users. Estimating a user’s age can be difficult, however. Here, major platforms such as YouTube, Meta and TikTok, which have already had to change their services to comply with the U.K.’s Age Appropriate Design Code, the model that influenced California’s law, may actually be at an advantage.
For other businesses, it might not be that easy.
“In my experience talking to companies, it actually might be less of a burden for some companies to take that other path … and simply apply some of the privacy-protective default settings and other requirements in the [design code] to all users because age gating or verification has been such a challenge,” said Zweifel-Keegan.
However, experts say that the question of who gets to decide the best interests of children is a difficult one for any company to answer. While COPPA has historically placed that power with parents, California’s design code puts the onus on companies to determine whether their products serve those interests. States will be responsible for enforcing those standards, but without rulemaking or additional guidance, companies will struggle to interpret what “best interest” means, said Bailey Sanchez, policy counsel at the Future of Privacy Forum.
For instance, the California law states that businesses may not use children’s personal information in a way that “is materially detrimental to the physical health, mental health, or well-being of a child.”
The requirement takes aim at the impact of social media companies on children’s mental health, something that has spurred the investment of millions of dollars in federal research as well as a recent lawsuit by Seattle schools against major social media companies.
But California hasn’t offered further guidance on what exactly would count as detrimental, and what California views as detrimental may be very different from what a state like Texas does.
“Something that we think might happen is that a red state might have a different interpretation of what is in the best interest of the child and what is age-appropriate than a blue state like California, for example,” said Sanchez.
California’s law was amended in November to clarify that companies should “take into account the unique needs of different age ranges,” dividing minors into five developmental stages. However, it’s unclear how the state would enforce these distinctions, said Sanchez.
Another area of concern is the requirement that companies estimate users’ ages in order to comply. While both the California and Oregon design codes prohibit companies from using personal information collected for age estimation for any other purpose, some critics argue the requirement could actually harm children’s privacy by forcing companies to collect more data on children and to turn to invasive age-estimation technologies that rely on biometrics or other forms of identification.
“This law comes with incredible privacy harm,” said Carl Szabo, vice president and general counsel at NetChoice, the group suing to stop the law. He compared it to concerns over Louisiana’s recently enacted law requiring adult content providers to verify that users are over 18.
But whereas the Louisiana law applies only to pornography sites, California’s law will cover a wide swath of the internet, including social media platforms, games and any other businesses likely to appeal to underage users. “In order to verify that the person sitting at the terminal is either 18 and over or under 18, that requires incredible amounts of data collection,” said Szabo.
Both the California law, which goes into effect in 2024, and the Oregon bill, which would go into effect in 2025, set up task forces to provide guidance on how to comply with the laws.
Children’s privacy legislation is also expected to move ahead at the federal level. In a recent op-ed in the Wall Street Journal, President Biden urged Congress to unite on privacy legislation and stop Big Tech from “pushing content to children that threatens their mental health and safety” among other harms. Two bills that failed to pass Congress last year, the Kids Online Safety Act and an updated version of COPPA, are expected to be reintroduced this year.
Many of the principles in California’s law, such as minimizing data collection and banning manipulative design patterns, have also gained traction at the Federal Trade Commission. The consumer watchdog recently reached a record $520 million settlement with Epic Games over allegations that the company violated COPPA and used “dark patterns” in its billing practices. The settlement requires Epic Games to adopt default privacy settings that turn off voice and text communications for children and teens, a first-of-its-kind provision from the FTC.
“Regardless of what happens with this legislative session, it’s really clear that children’s and teens’ privacy and really teen privacy is something that companies need to be thinking about,” said Sanchez.