
Apple to Start Scanning User Photos for Child Pornography [Report]

Posted August 5, 2021 at 2:05pm by iClarified
Apple is purportedly planning to release a tool tomorrow that will scan users' photos for Child Sexual Abuse Material (CSAM), according to Matthew Green, a cryptography professor at Johns Hopkins University.

I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea. These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear. Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
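As a rough illustration of the kind of client-side matching flow Green describes, here is a minimal sketch. The function names, hash size, and reporting threshold are assumptions for illustration only, not Apple's implementation, and an ordinary cryptographic hash stands in for a real perceptual hash just to keep the example runnable.

```python
# Illustrative sketch only; names, hash size, and threshold are assumptions.
import hashlib
from typing import Iterable, Set

HASH_BYTES = 12        # assumed perceptual-hash size
REPORT_THRESHOLD = 10  # assumed number of matches before anything is reported

def perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for a perceptual hash. A real perceptual hash is derived from
    image features so that near-duplicate images produce matching hashes;
    SHA-256 is used here only to keep the example self-contained."""
    digest = hashlib.sha256(image_bytes).digest()
    return int.from_bytes(digest[:HASH_BYTES], "big")

def scan_photos(photos: Iterable[bytes], known_hashes: Set[int]) -> bool:
    """Count photos whose hash appears in the known-hash list and
    report only if the count crosses the threshold."""
    matches = sum(1 for photo in photos if perceptual_hash(photo) in known_hashes)
    return matches >= REPORT_THRESHOLD
```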

If Apple does release such a tool, the privacy ramifications will be significant. Green notes that while the tool could be used to find child pornography on people's phones, it could also be misused by authoritarian governments.

Apple is purportedly using its own proprietary neural hashing algorithm. Even if the system weren't "misused," it's possible that a harmless file could share a hash with a known child pornography file, incorrectly flagging a user as storing such material.
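The collision risk follows from hashing in general: any fixed-length hash maps a vast input space onto a small output space, so two unrelated files can share a hash. The toy demonstration below is not Apple's algorithm; it uses a deliberately short 16-bit hash so that a collision is easy to find, and all names are made up for illustration.

```python
# Toy demonstration only: a deliberately short hash makes collisions common.
import hashlib
from itertools import count

def toy_hash(data: bytes, bits: int = 16) -> int:
    """Truncate SHA-256 to `bits` bits to mimic a small hash space."""
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") >> (256 - bits)

target = toy_hash(b"flagged-file")  # stands in for a hash on the match list
for i in count():
    candidate = f"harmless-file-{i}".encode()
    if candidate != b"flagged-file" and toy_hash(candidate) == target:
        print(f"{candidate!r} collides with the flagged hash")
        break
```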

Green says Apple will start with non-end-to-end encrypted (non-E2E) photos that have already been shared with iCloud, "so it doesn't 'hurt' anyone's privacy".

But, he adds, you have to ask why anyone would develop a system like this if scanning E2E photos wasn't the goal.

He also suggests that Apple offering such a system will "break the dam," leading to all governments demanding it from everyone.

More details in the thread linked below...


Read More [via 9to5Mac]