Most apps used in US classrooms share students’ personal data with advertisers, researchers find

Apps custom-designed for schools are sending staggering amounts of data to Facebook and Google, researchers found.

A whopping 96% of the apps used in U.S. K-12 schools share children’s personal information with third parties — including advertisers — often without the knowledge or consent of users or schools, according to a study published Tuesday.

The research, conducted by nonprofit Internet Safety Labs, highlights how the race by schools to increase their tech arsenals has placed students — and parents — in a precarious position of not knowing where personal information is ending up.

Researchers examined 13 schools in each state and the District of Columbia, a total of 663 schools representing nearly half a million students. They found that most schools had more than 150 approved technologies for classrooms, a dizzying number for parents and school administrators to monitor. One school had as many as 1,411.

The report follows previous research from the group, formerly known as the Me2B Alliance, finding hundreds of advertisers collected valuable student data from a website specializing in school sports data.

The latest report highlights that the exposure of student data to advertisers through school-approved technology is a widespread problem. Nearly a quarter of the apps recommended or required by schools included ads, and 13% included retargeting ads, which let digital advertisers target visitors based on their previous website visits. Researchers note that this risks student data being pulled into advertising networks without any way for schools or parents to find out. Several states, including California, ban using student data for this kind of targeting.

The staggering amount of advertising “should alarm everyone,” Joel Schwarz, a cybersecurity expert and cofounder of the Student Data Privacy Project, wrote to CyberScoop in an email. Under the Family Educational Rights and Privacy Act, schools are allowed to share student data for educational purposes, but “it would take a huge stretch of the imagination to interpret targeted advertising to be an educational purpose,” Schwarz wrote.

He said that many of Internet Safety Labs’ findings, including the vast number of technologies approved by schools, are consistent with what parents who work with his group have found in their own school districts.

The use of education technology providers exploded when the COVID-19 pandemic forced classes online. But the quick adoption also meant that many school districts entered into agreements without much scrutiny or public discussion. Center for Democracy and Technology researcher Elizabeth Laird concluded in a report last month of local education agencies that “staffing and transparency efforts have not kept pace with their large investments in education technology and expanded data collection.”

One factor driving the data collection is that many of the technologies approved by schools for student use were not designed specifically for educational purposes — and in many cases not designed with children in mind. Such apps included The New York Times app, Duolingo and Amazon shopping.

“This whole idea of like, some tech is for kids, some tech isn’t, I think needs to be called into question in an honest and open way to acknowledge that schools are going to be using this technology,” said Lisa LeVasseur, Internet Safety Labs’ executive director. “What are the answers, to keep them safe, to keep everybody safe?”

Across the board, Google was the most common third party receiving data from the apps used in schools. Nearly 70% of all apps were observed sending data to Google, and 70% included Google software development kits (SDKs), prebuilt software components embedded within apps. This is driven in part by Google’s dominance as a hardware and software supplier for K-12 schools, thanks to the ubiquity of Google Classroom.

Researchers did not analyze what data the apps sent through third-party SDKs, but in general such pieces of code are used to send crash reports and analytics data used by advertisers.

It might reasonably be expected that an app like The New York Times, which wasn’t designed for children or education, would send data to advertising firms. But the same can’t be said for apps custom-designed for schools, which researchers said were the worst offenders when it came to sharing kids’ data with Facebook, Amazon and Twitter.

Customized apps for schools tended to be less safe than the general pool of apps studied, researchers found. For instance, 81% of custom apps requested access to location information, slightly more than the 79% of all apps that did so. And 69% of custom apps accessed “social information,” such as calendars and contacts.

Recently, regulators and privacy experts have begun to push back against the ed tech industry’s surveillance tactics.

The Federal Trade Commission in May issued guidance that education technology companies bound by federal privacy protections for children under 13 are prohibited from using a child’s personal information for any commercial purpose, even if the school authorized them to collect it.

Written by Tonya Riley

Tonya Riley covers privacy, surveillance and cryptocurrency for CyberScoop News. She previously wrote the Cybersecurity 202 newsletter for The Washington Post and before that worked as a fellow at Mother Jones magazine. Her work has appeared in Wired, CNBC, Esquire and other outlets. She received a BA in history from Brown University. You can reach Tonya with sensitive tips on Signal at 202-643-0931. PR pitches to Signal will be ignored and should be sent via email.