Apple is Now Scanning Photos Uploaded to iCloud for Child Abuse

tobybartlett - January 9, 2020 at 9:04pm
What I don’t understand is this: if someone is engaging in something as disgusting and universally illegal as child pornography, are they really dumb enough to turn iCloud backups on and toggle the Photos option on? I’d be curious to know if this has ever caught one of these sickos. If my phone were full of incriminating files of any sort, I wouldn’t back up to the cloud, but Apple seems to think they do. Knowing that this tactic has succeeded would make me much more comfortable with even an algorithm scanning my files. This is a quote from iClarified: “As a reminder, if you are using Apple's iCloud services your data is not secure. Apple has the encryption keys to your photos, contacts, calendars, bookmarks, mail, notes, voice memos, health, call history, and files. Even your messages are not secure if you have iCloud backup enabled.”
1reader - January 9, 2020 at 9:20pm
Exactly! 👍🏻
the voice of reason - January 10, 2020 at 12:47am
@toby, I think the answer to your question is that most of these images are now stored and shared online, which is still probably "safer" for the sickos than keeping them in physical form. For someone to avoid digital files entirely, they'd have to take their own photos, which means committing the actual molestation themselves and then keeping the physical evidence. There are documentaries that show the depravity of the people who create and distribute this filth in dark-web marketplaces. This Apple filter is a good thing; let's just make sure it doesn't get used or exploited for other illicit reasons. Balance, as always.
tobybartlett - January 10, 2020 at 2:01am
@voiceofreason (I love your handle, BTW), I agree these images and videos are distributed online, and have been for decades now. But perhaps I’m giving these perverts too much credit when it comes to storing them. I’m Canadian, and my hometown has the offices of the Canadian Centre for Child Protection, a national charity that helps exploited children and survivors. It’s a fantastic charity and they do very important work, funded by corporations and the Canadian government. Through my support of this charity I’ve had the opportunity to speak to people who have worked, or still work, in this field. What I heard was nothing less than shocking, specifically what these pedophiles do to share content across the Internet without getting caught.

I work in tech, and while I’m not a network engineer, I was able to follow (for the most part) the insanely complicated technical setups these groups use to avoid being caught: layers of encryption, blockchain tools, multiple VPNs across multiple jurisdictions to slow down law enforcement, and Tor, just to name a few. So, having heard all of this from so many people whose jobs are to help catch these guys, the idea that they would turn iCloud backups on just throws me for a loop. Their servers will be on another continent, behind layers and layers of encryption, with keys shared via pieces of paper that they hand-deliver by flying across the globe, yet Apple wants to scan the photos of its entire user base for these images? Maybe (and I do want to give Apple the benefit of the doubt here) the not-so-organized sickos and the lone perverts who aren’t part of a multinational child porn ring are who they’re after. I don’t know enough about the topic to understand whether this is such a large problem that Apple, the company that positions the privacy of Photos as a feature on its website, needs to scan billions of images to catch these criminals. It just seems odd, is all.

I don’t have any images I’d mind ending up on a Times Square billboard. I’m not that interesting, nor am I a pedophile. I am, however, a huge privacy advocate, and as such I see algorithms looking over my photos as a slippery slope that needs to be monitored. I’ll wrap this up because this is turning into an article, but if I search for “dogs” in the Photos app on my iPhone, the AI finds dogs, and it achieves this with on-device processing. Moving that analysis off the device goes against everything Tim Cook keeps saying about Apple. I am not saying these perverts deserve a modicum of privacy or anything remotely close to that. I just want to know that this scanning of photos can’t be abused. Toby
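For context on what “scanning” for known material usually means in practice: cloud services typically compare a fingerprint of each uploaded photo against a database of hashes of already-identified abuse images, rather than running a classifier that inspects the content the way the on-device “dogs” search does. The sketch below is a minimal illustration of that hash-lookup idea, not Apple’s actual implementation; it uses a plain SHA-256 digest for simplicity (real systems such as Microsoft’s PhotoDNA use perceptual hashes that tolerate resizing and re-encoding), and the hash value shown is a placeholder.

```swift
import Foundation
import CryptoKit  // Apple's hashing framework (iOS 13+ / macOS 10.15+)

// Placeholder standing in for a database of hashes of known, already-catalogued images.
let knownImageHashes: Set<String> = [
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

/// Returns true if the photo's SHA-256 digest matches a hash in the known set.
/// A real system would use a perceptual hash so near-duplicate copies still match.
func matchesKnownImage(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}
```

The point relevant to this thread is that a lookup like this only flags exact (or, with perceptual hashing, near-exact) copies of images already in a known database; it is a different mechanism from the on-device machine learning that finds “dogs” in the Photos app.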