Apple Promises To Refuse Any Government’s Demands To Probe Users’ Phones

Apple — which is rolling out a new system to scan iCloud photo libraries for child sexual abuse material (CSAM) — promised that it would refuse to use such a system for political purposes.

On Sunday, a statement from Apple sought to dismiss worries that governments could add non-CSAM images to its hash list:

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. 

Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
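The mechanism Apple describes — comparing uploaded photos against a fixed list of hashes supplied by NCMEC, flagging only matches, and routing flags through human review — can be illustrated with a deliberately simplified sketch. Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic threshold techniques, not the plain exact-match digests shown here; the function and hash-list names below are hypothetical stand-ins.

```python
import hashlib

# Hypothetical stand-in for the NCMEC-supplied list of known-image digests.
# (Apple's real system stores perceptual NeuralHash values, not SHA-256.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def matches_known_list(image_bytes: bytes) -> bool:
    """Flag an image only if its digest appears in the known-hash list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# Only exact matches against the list are flagged; any other photo,
# however similar in content, produces no match and no report.
uploads = [b"known-image-1", b"vacation-photo"]
flagged = [img for img in uploads if matches_known_list(img)]
```

Because matching is done against a fixed list rather than by analyzing image content, the system can only recognize images already on that list — which is also why the debate centers on who controls what goes onto the list.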

Leading skeptics of Apple’s program include National Security Agency whistleblower Edward Snowden.

“No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this,” he commented on Twitter. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”