Illustration by Alex Castro / The Verge

WhatsApp won’t be adopting Apple’s new Child Safety measures, meant to stop the spread of child abuse imagery, according to WhatsApp’s head Will Cathcart. In a Twitter thread, he explained his belief that Apple “has built software that can scan all the private photos on your phone” and said that Apple has taken the wrong path in trying to improve its response to child sexual abuse material, or CSAM.

Apple’s plan, which it announced on Thursday, involves taking hashes of images uploaded to iCloud and comparing them to a database that contains hashes of known CSAM images. According to Apple, this allows it to keep user data encrypted and run the analysis on-device, while still allowing it to report users to the authorities if they’re found to be uploading known child abuse imagery.
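To make the basic mechanism concrete, here is a minimal sketch of hash-based matching in Python. It is an illustration under simplified assumptions, not Apple’s implementation: it uses an exact cryptographic hash (SHA-256) and a placeholder hash set, whereas Apple’s system uses a perceptual “NeuralHash” and a private set intersection protocol so that matching survives resizing and recompression. The database contents and function names here are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-image hashes (placeholder value only).
# In a real system this would be a vetted set of hashes of known CSAM,
# supplied by a child-safety organization, not assembled locally.
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def matches_known_database(path: Path) -> bool:
    """True if the file's hash appears in the known-hash database."""
    return file_hash(path) in KNOWN_HASHES
```

Note that an exact hash like SHA-256 only flags byte-identical files; this is precisely why Apple’s design relies on a perceptual hash instead, which is the part of the system critics like Cathcart focus on.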
