(ETH) – Apple just unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

According to the Associated Press, the tool, called “neuralMatch,” is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, a step that also alarmed privacy advocates. The photo-detection system will flag only images that are already in the center’s database of known child pornography.
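
For readers unfamiliar with hash matching, the decision flow can be sketched roughly as follows. This is an illustrative Swift sketch only, not Apple’s implementation: names such as `knownHashes` and `checkBeforeUpload` are invented for the example, and the real system uses Apple’s perceptual “NeuralHash” plus additional cryptography rather than plain string comparison.

```swift
import Foundation

// Hypothetical stand-in for the database of hashes of known abuse imagery.
// In Apple's system the hashes come from NCMEC and are matched through a
// blinded cryptographic protocol; plain strings are used here only to show
// the decision flow.
let knownHashes: Set<String> = ["hash-of-known-image-1", "hash-of-known-image-2"]

enum UploadDecision {
    case upload           // no match: the photo syncs to iCloud as usual
    case flagForReview    // match: queued for human review before any action
}

// Decide what happens to a photo before it leaves the device.
func checkBeforeUpload(imageHash: String) -> UploadDecision {
    knownHashes.contains(imageHash) ? .flagForReview : .upload
}

// An ordinary photo uploads normally; a known image is routed to review,
// after which a confirmed match disables the account and triggers a report
// to the National Center for Missing and Exploited Children.
print(checkBeforeUpload(imageHash: "hash-of-a-vacation-photo"))   // upload
print(checkBeforeUpload(imageHash: "hash-of-known-image-1"))      // flagForReview
```

The key point of the design, as described, is that the comparison is a lookup against a fixed list of known material, not an open-ended analysis of what a photo depicts.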


According to CNBC, Apple began testing the system on Thursday, but most U.S. iPhone users won’t be part of it until an iOS 15 update later this year. The move brings Apple in line with other cloud services that already scan user files, often using hashing systems, for content that violates their terms of service, including child exploitation images.

It also represents a test for Apple, which says that its system is more private for users than previous approaches to eliminating illegal images of child sexual abuse, because it uses sophisticated cryptography on both Apple’s servers and user devices and doesn’t scan actual images, only hashes.
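
To illustrate what scanning “only hashes” means in practice, the sketch below computes a short, fixed-length digest of a photo’s bytes; a fingerprint like this, rather than the image itself, is what gets compared against the known database. This is an assumption-laden illustration using SHA-256 from CryptoKit; Apple’s actual NeuralHash is a perceptual hash, and the comparison happens under additional cryptographic protections on both the device and Apple’s servers.

```swift
import Foundation
import CryptoKit

// Illustrative only: Apple's "NeuralHash" is a perceptual hash, not SHA-256,
// and the match itself runs under additional cryptography. The point here is
// simply that a short, fixed-length fingerprint of the photo, never the photo
// itself, is what gets compared against the known database.
func fingerprint(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

let photoBytes = Data("pretend these bytes are a photo".utf8)
let hash = fingerprint(of: photoBytes)
print(hash)   // 64 hex characters; the original photo cannot be reconstructed from it
```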

But many privacy-sensitive users still recoil from software that notifies governments about the contents of a device or files in the cloud, and may react negatively to this announcement, especially since Apple has vociferously defended device encryption and operates in countries with fewer speech protections than the U.S.