Apple removes mentions of controversial child abuse scanning from its site (Engadget)

    By J. Fingas (Engadget) / December 15, 2021

    Don’t count on the image detection arriving any time soon.

    Apple has hinted it might not revive its controversial effort to scan for CSAM (child sexual abuse material) photos any time soon. MacRumors notes that Apple has removed all mentions of the scanning feature from its Child Safety website. Visit the page now and you'll only see iOS 15.2's optional nude photo detection in Messages and the intervention shown when people search for child exploitation terms.

    It’s not certain why Apple has pulled the references. We’ve asked Apple for comment. This doesn’t necessarily represent a full retreat from CSAM scanning, but it at least suggests a rollout isn’t imminent.

    While Apple was already scanning iCloud Photos uploads against hashes of known CSAM, the change would have moved those scans to the devices themselves, ostensibly to improve privacy. If iCloud Photos was enabled and enough matches against known hashes turned up in a local photo library, Apple would decrypt the relevant "safety vouchers" (included with every image) and manually review the pictures for a potential report to the National Center for Missing and Exploited Children. That, in turn, could get police involved.
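
    To make the threshold idea concrete, here is a minimal, purely illustrative Python sketch of counting hash matches against a known database before any review is triggered. The hash function, database, and threshold value are all placeholders invented for this example; Apple's published design relied on a perceptual "NeuralHash" and cryptographic threshold techniques, not plain set lookups like these.

        # Purely illustrative sketch of threshold-based hash matching.
        # This is NOT Apple's protocol: the hash function, database, and
        # threshold below are placeholders for the sake of the example.
        import hashlib

        # Hypothetical database of hashes of known material (empty placeholder).
        KNOWN_HASHES: set[str] = set()

        # Hypothetical number of matches required before any human review.
        MATCH_THRESHOLD = 30

        def image_hash(image_bytes: bytes) -> str:
            # Stand-in hash; a real system would use a perceptual hash that
            # survives resizing and re-encoding, not a cryptographic digest.
            return hashlib.sha256(image_bytes).hexdigest()

        def matches_in_library(library: list[bytes]) -> int:
            # Count how many images in the local library match the database.
            return sum(1 for img in library if image_hash(img) in KNOWN_HASHES)

        def review_triggered(library: list[bytes]) -> bool:
            # Only past the threshold would the "safety vouchers" be decrypted
            # and the flagged images sent for manual review.
            return matches_in_library(library) >= MATCH_THRESHOLD

    The point of the threshold is that a single stray match reveals nothing; only an account crossing the cutoff becomes visible for review at all.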

    CONTINUE > https://www.engadget.com/apple-removes-csam-mentions-on-website-151958410.html
