
Apple to Scan iCloud Photos for Child Sexual Abuse Material


Aug 6, 2021, 8:12 AM   by Rich Brome

A new feature in iOS 15 and macOS will scan users' iCloud photos for known child sexual abuse material (CSAM). The feature preserves privacy by keeping photos encrypted over the network and in the cloud. The scanning takes place on the user's device, comparing photos against a database of known child abuse imagery provided by child protection organizations such as the National Center for Missing & Exploited Children (NCMEC). New technology developed by Apple, called NeuralHash, can match photos even after a certain amount of editing, and without adding any actual CSAM to the user's device.

A cryptographic technique called threshold secret sharing allows Apple to decrypt and examine offending images uploaded to its cloud only after NeuralHash has detected a threshold number of CSAM matches. When that happens, Apple can "manually verify the contents, disable a user's account and report the imagery to NCMEC, which is then passed to law enforcement." Apple promises that an appeal process will be available. The feature will be active only in the US at first.

Privacy experts have expressed concern that some governments may be tempted to force Apple to search for other types of imagery. Privacy advocates also worry that the system could be abused by third parties to implicate innocent people.

TechCrunch »
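As described, the matching mechanism is essentially a perceptual-hash lookup gated by a match threshold. The sketch below models that idea in Python. NeuralHash and Apple's threshold secret sharing scheme are not public, so the hash function, the threshold value, and the helper names here are hypothetical stand-ins for illustration, not Apple's implementation.

# Illustrative sketch only: the hash function, threshold value, and helper
# names below are assumptions modeling the mechanism the article describes.

from typing import Callable, Iterable, Set

MATCH_THRESHOLD = 30  # hypothetical; Apple has not published the real threshold

def count_matches(photos: Iterable[bytes],
                  known_csam_hashes: Set[str],
                  perceptual_hash: Callable[[bytes], str]) -> int:
    # Count photos whose perceptual hash appears in the known-CSAM database.
    return sum(1 for photo in photos
               if perceptual_hash(photo) in known_csam_hashes)

def account_flagged_for_review(photos: Iterable[bytes],
                               known_csam_hashes: Set[str],
                               perceptual_hash: Callable[[bytes], str]) -> bool:
    # In the described design, only when matches cross the threshold does Apple
    # gain the cryptographic ability to decrypt and manually review the flagged
    # images; below the threshold, nothing is revealed.
    return count_matches(photos, known_csam_hashes, perceptual_hash) >= MATCH_THRESHOLD

A real deployment would also require the encrypted "safety voucher" machinery that keeps sub-threshold matches unreadable to Apple, which this sketch omits.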

 

Comments

This forum is closed.


skuzz

Aug 6, 2021, 1:24 PM

Post revision suggestion

It says "will scan users' iCloud photos"; it should probably just read "will scan users' photos". iCloud has nothing to do with this, as it all happens on-device, and I would imagine it will scan whether or not the user is using iCloud.
I'm not sure if it's only the cloud right now or not. But you can bet they'll figure out a way to scan your phone directly. And of course it's "for public safety." They say they are looking for child porn today, but next year they'll be looking for...
(continues)
gloopey1

Aug 6, 2021, 9:56 AM

What Could Possibly Go Wrong?

I mean, we all know how reliable AI scanning tools are.
 
 


 

