According to recent reports, Apple has been scanning iCloud e-mails for child abuse imagery since 2019. Many privacy experts are worried about Apple’s decision to begin doing so on users’ local devices.
CNET reports that tech giant Apple has been scanning emails stored in the company’s iCloud service for child abuse imagery since 2019. The news comes as Apple faces increased scrutiny over its recent decision to scan user devices for child sexual abuse material (CSAM), which has worried many privacy experts, who warn that Apple could be pressured into scanning user devices for content other than images of child sexual abuse.
Apple claims that the way it detects CSAM is “designed with user privacy in mind.” Rather than directly accessing iCloud users’ photos, the company uses a device-local, hash-based lookup and matching system to cross-reference the hashes of user photos with the hashes of known CSAM. If a user’s photos match entries in the CSAM database, Apple manually reviews the matches and will then disable the user’s account before sending a report to the National Center for Missing & Exploited Children (NCMEC).
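To illustrate the general idea of hash-based lookup, here is a minimal Python sketch. Note that this is a simplification for illustration only: Apple’s actual system reportedly uses perceptual “NeuralHash” values and a blinded on-device matching protocol, whereas this sketch substitutes plain SHA-256 exact matching, and the `KNOWN_HASHES` set is a hypothetical placeholder.

```python
import hashlib

# Hypothetical database of known-image hashes (placeholder value for
# illustration -- this is just the SHA-256 digest of b"foo").
# The real system uses perceptual hashes and a privacy-preserving
# matching protocol, not plain SHA-256 set membership.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_hash(data: bytes) -> str:
    """Return the SHA-256 hex digest of a photo's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_database(photo_bytes: bytes) -> bool:
    """Check whether a photo's hash appears in the known-hash set."""
    return file_hash(photo_bytes) in KNOWN_HASHES

print(matches_known_database(b"foo"))  # True: hash is in the set
print(matches_known_database(b"bar"))  # False: hash is not in the set
```

The key property of such a scheme is that the lookup compares fingerprints rather than image content, so matching can happen without a human (or server) viewing the photos unless a match is flagged.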
However, many privacy experts are extremely worried about the new system. NSA whistleblower Edward Snowden tweeted about the issue, stating: “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow.”
Apple recently revealed that it has been scanning iCloud Mail emails for CSAM for two years, a detail that was not previously disclosed to customers. A previous version of Apple’s website said the company “uses image matching technology to help find and report child exploitation” by looking at “electronic signatures.”