Apple Announces Limits to Child Sex Abuse Image-Scanning System After Privacy Backlash

Apple on Aug. 13 provided new details of how its planned child sexual abuse material (CSAM) detection system would work, outlining a range of privacy-preserving limits after a backlash over concerns that the software would introduce a backdoor threatening user privacy protections.

The company addressed concerns triggered by the planned CSAM feature, slated for release in an update for U.S. users later this year, in a 14-page document (pdf) outlining safeguards it says it will implement to prevent the system on Apple devices from erroneously flagging files as child pornography or from being exploited for malicious surveillance of users.

“The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly-colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised,” the company said in the document.

Apple’s reference to “possibly-colluding entities” appears to address concerns raised by some that the system could be abused—for instance, by authoritarian regimes—to falsely incriminate political opponents.

“This is achieved through several interlocking mechanisms, including the intrinsic auditability of a single software image distributed worldwide for execution on-device, a requirement that any perceptual image hashes included in the on-device encrypted CSAM database are provided independently by two or more child safety organizations from separate sovereign jurisdictions, and lastly, a human review process to prevent any errant reports,” Apple stated.
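For illustration, the dual-jurisdiction requirement Apple describes can be modeled as a set intersection: a perceptual hash becomes eligible for the on-device database only if it is provided independently by at least two child safety organizations in separate jurisdictions. The sketch below is a simplified illustration of that rule, not Apple's code; the organization names and hash values are hypothetical placeholders.

```python
# Simplified illustration (not Apple's implementation) of the rule that a
# perceptual hash is included only when it is provided independently by two
# or more child safety organizations in separate sovereign jurisdictions.
# Organization names and hash values are hypothetical placeholders.

hashes_by_org = {
    "child_safety_org_jurisdiction_a": {"a1b2", "c3d4", "e5f6"},
    "child_safety_org_jurisdiction_b": {"c3d4", "e5f6", "9f8e"},
}

# Only hashes appearing in every independent list are eligible for inclusion.
eligible_for_database = set.intersection(*hashes_by_org.values())
print(sorted(eligible_for_database))  # ['c3d4', 'e5f6']
```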

Besides ensuring that the database of child sexual abuse imagery it will check against is not controlled by a single entity or government, Apple said its other technical protections against mis-inclusion include a high match threshold, chosen so that “the possibility of any given account being flagged incorrectly is lower than one in one trillion.” At the same time, Apple said the system would prevent privacy violations by never learning any information about iCloud-stored images that don’t have a positive match to the CSAM database.
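The threshold mechanism Apple describes can likewise be sketched in a few lines: an account is considered for review only after the count of uploaded images matching the CSAM hash database crosses a preset threshold, and images with no match contribute nothing to that count. The example below is a hedged simplification, not Apple's implementation; the hash values and the threshold number are illustrative only.

```python
# Hedged, simplified sketch (not Apple's implementation) of threshold-based
# flagging: an account is flagged for human review only when the number of
# uploaded images whose perceptual hashes match the CSAM database reaches a
# preset threshold. Hash values and the threshold figure are illustrative only.

CSAM_HASHES = {"c3d4", "e5f6"}   # stand-in for the on-device encrypted database
MATCH_THRESHOLD = 30             # illustrative value only

def should_flag_account(uploaded_image_hashes: list[str]) -> bool:
    # Images that do not match contribute no information to the decision.
    matches = sum(1 for h in uploaded_image_hashes if h in CSAM_HASHES)
    return matches >= MATCH_THRESHOLD

# Example: 2 matches out of 4 uploads stays well below the threshold.
print(should_flag_account(["c3d4", "zzzz", "e5f6", "1234"]))  # False
```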
