Apple plans to scan photos stored on iPhones and iCloud for child abuse imagery, according to the Financial Times. The new system could help law enforcement in criminal investigations but may open the door to increased legal and government demands for user data.
The system, called neuralMatch, will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times said. neuralMatch, which was trained using 200,000 images from the National Center for Missing & Exploited Children, will roll out first in the US. Photos will be hashed and compared with a database of known images of child sexual abuse.
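The hash-and-compare flow described above can be illustrated with a minimal sketch. This is a simplification, not Apple's implementation: the `KNOWN_HASHES` set and `flag_matches` function are hypothetical names, and the reported system is understood to use a perceptual (neural) hash so that resized or re-encoded copies still match, whereas the cryptographic SHA-256 digest used here only matches byte-identical files.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abuse images, e.g. entries
# derived from the National Center for Missing & Exploited Children's
# image set. Distributed as opaque hash strings, not the images themselves.
KNOWN_HASHES: set[str] = {
    # "ab3f19...",
}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return photos whose hashes appear in the known-image database.

    In a production system, matches would be queued for human review
    rather than acted on automatically.
    """
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if sha256_of_file(photo) in KNOWN_HASHES
    ]
```

The key design point is that matching happens against hashes of previously identified images, not by classifying new photos, which is why a perceptual hash (tolerant of crops and compression) matters in practice where an exact digest would be trivially evaded.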