Apple drops idea of scanning photos for child pornography

In a statement to Wired, Apple is quietly burying a measure announced in mid-summer 2021 to fight child sexual abuse, one that did not go unnoticed. The manufacturer is abandoning its plan to detect child pornography images in the photo libraries on its devices.

The system designed by Apple for detecting CSAM photos.

This CSAM detection (for Child Sexual Abuse Material) was to be carried out first locally on the device and then on Apple's servers, but the manufacturer handled the communication around the project so poorly that it quickly put it on hold, before now dropping it entirely.
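To picture the general idea of on-device matching against a database of known illegal images, here is a minimal sketch in Swift. It is only an illustration of the principle: Apple's actual proposal relied on a perceptual hash ("NeuralHash") and a threshold-based private set intersection protocol, none of which appears here; SHA-256 and the `knownFingerprints` set are stand-ins so the example stays self-contained.

```swift
import Foundation
import CryptoKit

// Hypothetical database of fingerprints for known illegal images.
// In Apple's proposal this would come from child-safety organizations.
let knownFingerprints: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Computes a fingerprint for raw image bytes.
/// (Stand-in: a cryptographic hash, not a perceptual hash like NeuralHash.)
func fingerprint(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Returns true when the photo's fingerprint appears in the database.
/// In Apple's proposal this check happened on-device, and only accounts
/// crossing a match threshold were flagged for human review server-side.
func matchesKnownMaterial(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}

// Usage: check a single dummy photo buffer.
let photo = Data([0x00, 0x01, 0x02])
print(matchesKnownMaterial(photo))   // false for this dummy data
```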

One of Apple's sins in this story was to charge ahead naively, without consulting the specialists and researchers who work on technologies for detecting these images. The plan raised serious concerns about the security and privacy of user data.

"We decided not to go ahead with our CSAM detection tool that we had proposed for iCloud Photos. Children can be protected without companies combing through personal data," explains the manufacturer. "We will continue to work with governments, child protection organizations and other businesses to help protect children, uphold their right to privacy and make the internet a safer place for all."

One measure from the "package" announced in August 2021 has been implemented without controversy: the blurring of nude images in Messages for young users. This feature, which runs entirely on-device, has been available in France since iOS 16.

The abandonment of this feature certainly goes hand in hand with the announcement of end-to-end encrypted backups of photos in iCloud.
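The principle behind end-to-end encryption is that photos are encrypted on the device with a key that never leaves it, so Apple's servers only ever store ciphertext. The sketch below illustrates that idea with CryptoKit's AES-GCM; it is not Apple's actual Advanced Data Protection implementation, and the `deviceOnlyKey` and the two helper functions are hypothetical names for illustration only.

```swift
import Foundation
import CryptoKit

// The key exists only on the user's trusted devices (in practice it would
// be protected by the keychain / Secure Enclave, not a plain variable).
let deviceOnlyKey = SymmetricKey(size: .bits256)

/// Encrypts photo data before upload; the server only sees ciphertext.
func encryptForUpload(_ photoData: Data, with key: SymmetricKey) throws -> Data {
    let sealedBox = try AES.GCM.seal(photoData, using: key)
    return sealedBox.combined!   // nonce + ciphertext + tag, safe to store server-side
}

/// Decrypts data fetched back from the cloud on a trusted device.
func decryptAfterDownload(_ blob: Data, with key: SymmetricKey) throws -> Data {
    let sealedBox = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(sealedBox, using: key)
}

// Usage: round-trip a dummy photo through encryption and decryption.
let original = Data("holiday photo".utf8)
let uploaded = try! encryptForUpload(original, with: deviceOnlyKey)
let restored = try! decryptAfterDownload(uploaded, with: deviceOnlyKey)
assert(restored == original)
```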

iOS 16.2 will be able to end-to-end encrypt photos and backups in iCloud

