Apple to scan iPhones for images of child sexual abuse
In an effort to curb the spread of child sexual abuse material, Apple Inc. has unveiled a tool to scan US iPhones for images of child sexual abuse. The move has drawn applause from child protection groups but has also raised concerns that the system could be misused to put citizens under surveillance.
The scanning tool, called "neuralMatch", will scan images before they are uploaded to iCloud. If the system finds a match, the image will be reviewed by a human; if child sexual abuse material is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.
The detection system will only flag images that are already in a database of known child sexual abuse material. Technology companies including Microsoft, Google, and Facebook have for years been sharing digital fingerprints of known child sexual abuse images. Apple must deploy the new tool carefully, as it has to fight child exploitation while keeping its commitment to protecting the privacy of its users.
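As a rough illustration of how matching against a database of digital fingerprints can work, the sketch below hashes an image's bytes and checks the result against a set of known fingerprints. It is a minimal sketch, not Apple's method: SHA-256 stands in for the perceptual hash Apple reportedly uses, and the type and field names are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of fingerprint matching against a database of known images.
// SHA-256 stands in for a perceptual hash; a real perceptual hash also matches
// visually similar images, which a cryptographic hash cannot do.
struct KnownImageDatabase {
    // Hex-encoded fingerprints of known images (hypothetical data).
    let knownFingerprints: Set<String>

    // Compute a hex-encoded fingerprint of the raw image bytes.
    func fingerprint(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Flag an image only if its fingerprint is already in the database;
    // a flagged image would then go to human review, not automatic action.
    func isKnownImage(_ imageData: Data) -> Bool {
        knownFingerprints.contains(fingerprint(of: imageData))
    }
}
```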
The new tool will be implemented on Apple's iPhones, Macs, and Apple Watches, and the changes will roll out this year as part of updates to its operating software.
Meanwhile, the company said that its messaging app would use on-device machine learning to identify and blur sexually explicit photos on children's phones and warn the parents of younger children via text message. Parents will have to enrol their children's phones to receive these warnings. The company said that the new feature would not compromise the security of private communication.
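The following is a minimal sketch of the "identify, then blur" flow described above, not Apple's implementation: the `isExplicit` closure is a hypothetical stand-in for the company's on-device classifier, which is not a public API, and only the blurring step uses real Core Image calls.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Hypothetical sketch: blur a photo only if an on-device classifier flags it.
// `isExplicit` stands in for Apple's own machine-learning model.
func blurIfExplicit(_ image: CIImage, isExplicit: (CIImage) -> Bool) -> CIImage {
    guard isExplicit(image) else { return image }

    // Heavily blur the photo so it stays obscured until the child taps through,
    // at which point a parent of a younger child could be warned.
    let blur = CIFilter.gaussianBlur()
    blur.inputImage = image
    blur.radius = 40
    return blur.outputImage ?? image
}
```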