Apple to Scan US iPhones for Images of Child Sexual Abuse (VOA News)


    By Associated Press (VOA News) / Aug 5, 2021

    Apple has unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

    The tool designed to detect known images of child sexual abuse, called “neuralMatch,” will scan images before they are uploaded to iCloud. If it finds a match, a human will review the image. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified.

    The system will not flag images not already in the center’s child pornography database. Parents snapping innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool — which doesn’t “see” such images, just mathematical “fingerprints” that represent them — could be put to more nefarious purposes.
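    The matching described above can be sketched in a few lines. This is a simplified illustration, not Apple's actual system: the database contents and the `fingerprint` function are stand-ins (a real deployment uses a perceptual hash that survives resizing and re-encoding, whereas SHA-256 here matches only identical bytes), but it shows the key property the article describes: only images whose fingerprints already appear in the known database can ever match.

```python
import hashlib

# Hypothetical database of fingerprints of known images (in practice these
# would be perceptual hashes supplied by a child-protection organization).
KNOWN_FINGERPRINTS = {hashlib.sha256(b"known-abuse-image-bytes").hexdigest()}

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system uses a hash designed to
    # be stable under cropping, resizing, and re-encoding; SHA-256 is used
    # here only to keep the sketch self-contained.
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag(image_bytes: bytes) -> bool:
    # An image is flagged only if its fingerprint is already in the
    # database; a novel photo produces no match and is never flagged.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

print(should_flag(b"known-abuse-image-bytes"))  # in the database: flagged
print(should_flag(b"family-bath-photo"))        # not in the database: ignored
```

    Note that the scanner never needs the original images, only their fingerprints, which is both the privacy argument for the design and, as the researchers quoted below point out, the surface an attacker would target.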

    Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

    CONTINUE > https://www.voanews.com/silicon-valley-technology/apple-scan-us-iphones-images-child-sexual-abuse

