
Apple Delays CSAM Photo-Scanning Feature


Sep 3, 2021, 10:55 AM   by Rich Brome

Apple will "take additional time to make improvements" before launching its technology to detect and report known child sexual abuse material (CSAM) in users' iCloud photos. The tools, first revealed a month ago, are designed to preserve user privacy by using sophisticated hashing algorithms that run on the user's device. Only once a threshold of at least 30 CSAM images is detected can photos be decrypted by Apple for manual inspection and potential reporting to authorities. Privacy experts have expressed concern over the feature, as some governments may be tempted to force Apple to search for other types of imagery. Privacy advocates have also warned that the system could be abused by third parties to implicate innocent people. Apple has responded to these concerns, stating that its database of known CSAM will never include images reported from just one country, and that researchers will be able to verify that all Apple devices use the same database of known CSAM.
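The core of the described design is a threshold rule: photos are hashed on the device, compared against a database of known CSAM hashes, and nothing becomes reviewable until at least 30 matches accumulate. The Swift sketch below illustrates only that threshold logic; Apple's actual system relies on perceptual hashing (NeuralHash) plus cryptographic techniques such as private set intersection and threshold secret sharing rather than a plain set lookup, and every name here (photoHashes, knownHashes, reportingThreshold) is hypothetical.

// Hypothetical sketch of the threshold rule described above; illustrative only.
let reportingThreshold = 30  // matches required before any manual review is possible

// Count how many on-device photo hashes appear in the known-CSAM hash database.
func matchCount(photoHashes: [String], knownHashes: Set<String>) -> Int {
    photoHashes.filter { knownHashes.contains($0) }.count
}

// Only once the threshold is crossed would matched photos become decryptable for review.
func crossesThreshold(photoHashes: [String], knownHashes: Set<String>) -> Bool {
    matchCount(photoHashes: photoHashes, knownHashes: knownHashes) >= reportingThreshold
}

The point of the threshold is that no single match reveals anything about an account; only an account that accumulates at least 30 matches against the shared database could be examined at all.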

Wall Street Journal »


