
Apple Abandons Planned 'Backdoor' for iPhone

Posted December 7, 2022 at 9:34pm by iClarified
Apple has abandoned its controversial plan to build a 'backdoor' into the iPhone for CSAM detection, according to a statement provided to Wired. Instead, the company plans to improve its Communication Safety feature.

"After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021," the company told WIRED in a statement. "We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all."

On August 5, 2021, Apple announced plans to scan users' iCloud Photos for known CSAM and to detect sexually explicit images in Messages sent to or received by children. The move was viewed as a shocking about-face for users who had relied on the company's leadership in privacy and security. Outrage from users, researchers, and organizations such as the Electronic Frontier Foundation was immediate, and Apple paused its plans on September 3, 2021. By December 15, 2021, references to CSAM had been scrubbed from the Child Safety webpage at Apple.com; however, the company did not officially abandon the plan until today.

Parents can opt into Communication Safety through their family iCloud account. Once enabled, Messages detects when a child receives or attempts to send a photo containing nudity. The app blurs the photo before it is viewed and offers guidance and age-appropriate resources to help the child make a safe choice, including the option to contact someone they trust. Messages uses on-device machine learning to analyze image attachments and determine whether a photo appears to contain nudity. The feature is designed so that Apple never gets access to the photos.
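
As a rough illustration of the flow described above — on-device classification, a blur applied before the photo is shown, and nothing sent off the device — a minimal SwiftUI sketch might look like the following. The NudityClassifier protocol and its containsNudity(_:) method are hypothetical stand-ins for Apple's private on-device model, not a public API, and the UI is only an approximation of what Messages presents.

```swift
import SwiftUI
import CoreGraphics

// Hypothetical stand-in for Apple's private on-device model.
// Nothing here touches the network; analysis stays on the device.
protocol NudityClassifier {
    func containsNudity(_ image: CGImage) async -> Bool
}

struct IncomingAttachmentView: View {
    let image: CGImage
    let classifier: NudityClassifier

    @State private var isSensitive = false
    @State private var revealed = false

    var body: some View {
        Image(decorative: image, scale: 1.0)
            .resizable()
            .scaledToFit()
            // Keep the photo blurred until the child explicitly chooses to view it.
            .blur(radius: isSensitive && !revealed ? 30 : 0)
            .overlay {
                if isSensitive && !revealed {
                    // Age-appropriate guidance shown in place of the photo,
                    // including an option to contact someone the child trusts.
                    VStack(spacing: 8) {
                        Text("This photo may contain nudity.")
                        Button("View Photo") { revealed = true }
                        Button("Message Someone You Trust") { /* hand off to a trusted contact */ }
                    }
                    .padding()
                }
            }
            .task {
                // Classification runs entirely on device; the image is never uploaded.
                isSensitive = await classifier.containsNudity(image)
            }
    }
}
```

The essential property mirrored in this sketch is that both the analysis and the decision to blur happen locally, so neither the attachment nor the verdict leaves the phone.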

"Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications," the company said in its statement. "Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children, while addressing the unique privacy needs of personal communications and data storage."

More details in the full report from Wired.
