Under fire from privacy advocates, Apple delays controversial photo scanning plan

The company said it plans to make improvements to the controversial features.
People visit the Apple store in the Oculus Mall in Manhattan, New York City, on July 29, 2021. (Photo by Spencer Platt/Getty Images)

Apple said Friday that it is delaying a contentious update for U.S. users that would detect child sexual abuse images as they're uploaded to iCloud Photos from iPhones.

The reversal comes less than a month after Apple announced the change, a period in which the company came under heavy criticism from privacy advocates who contended the technology could enable government surveillance. Apple also delayed the rollout of a feature that would scan iMessage images sent or received by children for sexually explicit material and notify parents if the child is 12 or younger.

The plans stood to jeopardize Apple's positioning as the tech giant that most valued privacy, but came as it faced pressure from governments and child advocacy groups to do more to combat child sexual abuse material. The update came more than five years after Apple refused to create software that would have enabled U.S. authorities to break the encryption on an iPhone used by a terrorist who killed 14 people and injured 22 others in a mass shooting in San Bernardino, California.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in a brief statement.

It did not specify a timetable.

Critics said the client-side scanning plan amounted to a backdoor into Apple's systems that could enable further abuses and threatened end-to-end encryption.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” wrote more than 90 groups in an August letter to the company.

Those earlier critics welcomed Apple’s temporary reversal.

Apple reported far fewer child sexual abuse images to the National Center for Missing & Exploited Children than its tech peers last year: in 2020, Facebook reported more than 20 million and Google more than half a million, compared with just 265 for Apple. When Apple originally announced its plans for the child protection features, the center and other groups welcomed them.

“Apple’s expanded protection for children is a game changer,” said John Clark, president and CEO of the center. “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”

Written by Tim Starks

Tim Starks is a senior reporter at CyberScoop. His previous stops include The Washington Post, POLITICO and Congressional Quarterly. An Evansville, Ind., native, he has covered cybersecurity since 2003.