Apple says its iCloud scanning will rely on multiple child safety groups to address privacy fears

Illustration by Alex Castro / The Verge

Apple has filled in more details around its upcoming plans to scan iCloud Photos for child sexual abuse material (CSAM) via users’ iPhones and iPads. The company released a new paper delving into the safeguards it hopes will increase user trust in the initiative. That includes a rule to only flag images found in multiple child safety databases with different government affiliations — theoretically stopping one country from adding non-CSAM content to the system.
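The multi-database safeguard described above can be sketched as a simple set-membership check: an image hash is flagged only if it appears in databases maintained under different government affiliations. This is a minimal illustration of the idea, not Apple's actual protocol; the jurisdiction names and hash values below are hypothetical.

```python
# Illustrative sketch of the multi-database rule (not Apple's implementation):
# flag a hash only when it appears in databases from at least two
# different jurisdictions, so no single government can inject entries alone.

def should_flag(image_hash: str, databases: dict) -> bool:
    """databases maps a jurisdiction name to its set of known CSAM hashes."""
    matching_regions = {
        region for region, hashes in databases.items() if image_hash in hashes
    }
    return len(matching_regions) >= 2

# Hypothetical databases from two different jurisdictions.
dbs = {
    "us_ncmec": {"hash_a", "hash_b"},
    "eu_group": {"hash_b", "hash_c"},
}

print(should_flag("hash_b", dbs))  # in both databases -> True
print(should_flag("hash_a", dbs))  # in only one database -> False
```

Under this rule, content added to only one database (for example, by a single country) never produces a match, which is the safeguard the paper describes.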

Apple’s upcoming iOS and iPadOS releases will automatically match US-based iCloud Photos accounts against known CSAM from a list of image hashes compiled by child safety groups. While many companies scan cloud storage services remotely, Apple’s device-based strategy has drawn...

from The Verge - All Posts https://ift.tt/3iJ6OBK
via IFTTT
