After facing backlash, Apple says it would seek abuse images flagged in multiple nations
Apple said it would search only for images that clearinghouses in multiple countries have flagged. The statement comes after a week of criticism over its planned new system for detecting images of child sexual abuse.
Apple said that it had handled communications about the initiative badly, drawing criticism from major technology policy groups and even its own employees, who worried that the company was jeopardising its reputation for safeguarding customer privacy.
Apple would not say whether the criticism had prompted any changes to the policies or software, but noted that the project was still in development and that adjustments were to be expected.
Apple announced last week that it would check images before they are stored on its iCloud online service, later clarifying that it would start with the United States only. Other technology companies perform similar checks once images are uploaded to their servers. Apple's decision to place key parts of the system on the phone itself prompted fears that governments could pressure Apple to expand the system for other purposes, such as scanning for prohibited political imagery.