Apple executive Craig Federighi says the company's plan to scan iCloud Photos for child sexual abuse material (CSAM) will include “multiple levels of auditability.” In an interview with The Wall Street Journal, Federighi, Apple's senior vice president of software engineering, offered new details about the company's controversial child safety measures, including a claim that on-device scanning on the iPhone and iPad will help security experts verify that Apple is using the system responsibly.

Like many companies with cloud storage services, Apple will check iCloud Photos images against a list from the National Center for Missing and Exploited Children (NCMEC), looking for exact matches with known CSAM pictures. But unlike many services, it will run…
