Illustration by Alex Castro / The Verge
When Apple introduced its slate of initiatives to prevent the spread of child sexual abuse material, or CSAM, last year, the plans were controversial, to say the least. While some praised the company for taking action, there was also no shortage of detractors, some of whom said that Apple’s plans to do on-device scanning for illegal content would require an unacceptably large hit to user privacy.
The backlash caused Apple to delay some of the features in September 2021, and earlier this week, the company confirmed it has abandoned its efforts to create the hashing system that would have searched people’s iCloud photo libraries for illegal materials. We contacted some of the organizations that had spoken out either in support of or against…