Apple Announces It Will Scan User Photos for Child Pornography

Posted August 5, 2021 at 8:28pm by iClarified
Apple has announced steps to limit the spread of Child Sexual Abuse Material (CSAM), including the scanning of photos its users store in iCloud Photos.

-----
Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.


Finally, updates to Siri and Search provide parents and children expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.
-----

These features are coming to users in the U.S. later this year via updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

New technology in iOS and iPadOS will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC).

Here's how Apple describes the process by which it will scan user photos (rough code sketches of the main steps follow the excerpt)...


-----
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.

This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.
-----
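
To make the excerpt above more concrete, here is a rough, hypothetical sketch of the on-device matching idea in Swift. Apple's actual system uses NeuralHash, a perceptual hash, together with a blinded copy of the hash database, so the device never learns the outcome of any comparison; the sketch substitutes a plain SHA-256 digest and an ordinary set purely for illustration, and the loadKnownHashDatabase helper is invented for the example.

-----
import Foundation
import CryptoKit

// Hypothetical sketch only: the real system uses NeuralHash (a
// perceptual hash) and a blinded hash database, so neither the device
// nor Apple alone learns whether an individual photo matched. SHA-256
// and a plain Set are used here just to keep the example self-contained.

/// Placeholder for the known-CSAM hash database that ships inside the OS.
/// Empty here; the real data comes from NCMEC and other child safety
/// organizations in an unreadable, blinded form.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

/// Stand-in "image hash" computed over a photo's raw bytes.
func imageHash(of photoData: Data) -> String {
    let digest = SHA256.hash(data: photoData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

let knownHashes = loadKnownHashDatabase()

/// Naive match check. In the described system this comparison happens
/// inside a private set intersection protocol, so this boolean is never
/// revealed directly to the device or to Apple.
func matchesKnownDatabase(_ photoData: Data) -> Bool {
    knownHashes.contains(imageHash(of: photoData))
}
-----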
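
The safety voucher described in the excerpt bundles encrypted information about the image with material that only becomes usable once the threshold is crossed. The following is a heavily simplified sketch of what such a voucher might carry; the accountKey and share parameters are placeholders, and the real design hides the match result behind the private set intersection protocol rather than encrypting with a single symmetric key.

-----
import Foundation
import CryptoKit

// Hypothetical sketch of what a safety voucher might carry for one photo.
// In the described system the match result is hidden by private set
// intersection and the payload can only be decrypted once the account
// crosses the match threshold; this struct just names the moving parts.

struct SafetyVoucher {
    /// Encrypted "visual derivative" and match metadata for the image.
    let encryptedPayload: Data
    /// One share of the account-level decryption secret (threshold scheme).
    let secretShare: Data
}

/// Build a voucher for a photo about to be uploaded to iCloud Photos.
/// `accountKey` and `share` are placeholders for the per-account secret
/// material managed by the real protocol.
func makeVoucher(photoDerivative: Data,
                 accountKey: SymmetricKey,
                 share: Data) throws -> SafetyVoucher {
    let sealed = try AES.GCM.seal(photoDerivative, using: accountKey)
    // `combined` is non-nil when the default 12-byte nonce is used.
    return SafetyVoucher(encryptedPayload: sealed.combined!,
                         secretShare: share)
}
-----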
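
Threshold secret sharing itself is a standard cryptographic building block; Shamir's scheme is the classic construction, in which a secret is split into shares such that any t of them reconstruct it while fewer reveal nothing. The toy sketch below illustrates that general idea with small parameters; it is not Apple's implementation, which layers the sharing on top of the private set intersection protocol described above.

-----
import Foundation

// Toy sketch of threshold secret sharing (Shamir's scheme over a prime
// field). Parameters are deliberately small; this is an illustration of
// the technique, not Apple's implementation.

let prime: Int64 = 2_147_483_647  // 2^31 - 1, a Mersenne prime

/// Modular exponentiation (square-and-multiply).
func modPow(_ base: Int64, _ exp: Int64, _ m: Int64) -> Int64 {
    var result: Int64 = 1
    var b = base % m
    var e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % m }
        b = b * b % m
        e >>= 1
    }
    return result
}

/// Split `secret` into `count` shares; any `threshold` of them suffice.
func makeShares(secret: Int64, threshold: Int, count: Int) -> [(x: Int64, y: Int64)] {
    // Random polynomial of degree threshold-1 whose constant term is the secret.
    var coeffs = [secret % prime]
    for _ in 1..<threshold { coeffs.append(Int64.random(in: 1..<prime)) }
    return (1...count).map { i in
        let x = Int64(i)
        var y: Int64 = 0
        var xPow: Int64 = 1
        for c in coeffs {
            y = (y + c * xPow % prime) % prime
            xPow = xPow * x % prime
        }
        return (x, y)
    }
}

/// Reconstruct the secret from at least `threshold` shares
/// (Lagrange interpolation evaluated at x = 0).
func reconstruct(shares: [(x: Int64, y: Int64)]) -> Int64 {
    var secret: Int64 = 0
    for (i, si) in shares.enumerated() {
        var num: Int64 = 1
        var den: Int64 = 1
        for (j, sj) in shares.enumerated() where j != i {
            num = num * (prime - sj.x) % prime              // (0 - x_j) mod p
            den = den * ((si.x - sj.x + prime) % prime) % prime
        }
        let term = si.y * num % prime * modPow(den, prime - 2, prime) % prime
        secret = (secret + term) % prime
    }
    return secret
}

// Demo: a threshold of 3 matching vouchers out of 5 total.
let shares = makeShares(secret: 123_456_789, threshold: 3, count: 5)
print(reconstruct(shares: Array(shares.prefix(3))))  // prints 123456789
-----

In the design Apple describes, each uploaded voucher would carry one such share, so the voucher contents only become readable to Apple after enough matching images accumulate in an account.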

While well-intentioned, this system is likely to upset privacy advocates. Experts have already warned that this kind of scanning could be abused by authoritarian governments.

Security researcher Alec Muffett tweeted, "absolutely the most important story that broke overnight is that Apple apparently intend to launch a feature enabling image detection & government surveillance on everyone's iPhones in the name of child protection."

"Apple are walking back privacy to enable 1984."

You can learn more about how these new features will work at the link below. Let us know what you think in the comments...

Read More