Apple Abandons Planned 'Backdoor' for iPhone

Posted December 7, 2022 at 9:34pm by iClarified
Apple has abandoned its controversial plan to build a 'backdoor' into the iPhone for CSAM detection, according to a statement provided to WIRED. Instead, the company plans to improve its Communication Safety feature.

"After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021," the company told WIRED in a statement. "We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all."

On August 5, 2021, Apple announced it would start scanning users' photos and messages for child pornography. The move was viewed as a shocking about-face for users who had relied on the company's leadership in privacy and security. Outrage was immediate from users, researchers, and organizations like the Electronic Frontier Foundation. This resulted in Apple pausing its plans on September 3, 2021. By December 15, 2021, references to CSAM had been scrubbed from the Child Safety webpage at Apple.com; however, the company did not officially abandon its plan until today.

Parents can opt into Communication Safety features through family iCloud accounts. Once enabled, Messages will detect if a child has received or is attempting to send a photo containing nudity. The app blurs the photo before it's viewed and provides guidance and age-appropriate resources to help the child make a safe choice, including the option to contact someone they trust. Messages uses on-device machine learning to analyze image attachments and determine whether a photo appears to contain nudity. The feature is designed so that Apple never gets access to the photos.

"Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications," the company said in its statement. "Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children, while addressing the unique privacy needs of personal communications and data storage."

More details in the full report linked below...

Read More

onereader - December 7, 2022 at 11:33pm
In other words, people interested in dealing with those types of pictures can safely return to their regular fishing. I'm not saying what Apple was doing before was good, but how many parents are going to use that feature?
AntiNeo - December 8, 2022 at 1:43am
The government in China is checking and deleting videos of recent protests straight from people's camera rolls. You see, soon enough any backdoor into your phone's privacy will be used for something other than what it was originally created for.
onereader - December 9, 2022 at 4:44am
That's why we should never become China, especially since we, for years, have had our fingerprints online trusting that everyone respects that. If we become another China, everyone has something in their web navigation that can …