Daily US Times: Apple says its announcement of automated tools to detect child sexual abuse material on the iPad and iPhone was “jumbled pretty badly”, after the news caused widespread confusion.
The company revealed new image detection software on 5 August that can alert Apple if known illegal images are uploaded to its iCloud storage.
Privacy groups and many iPhone users criticised the news, with some saying Apple had created a security backdoor in its software.
The company now says its announcement about the issue had been widely “misunderstood”.
In an interview with the Wall Street Journal, Apple software chief Craig Federighi said: “We wish that this had come out a little more clearly for everyone.”
Federighi said that – in hindsight – introducing two features at the same time was “a recipe for this kind of confusion”.
The iPhone maker announced two new tools designed to protect children, and said they would be deployed in the US first.
The first tool can identify known child sex abuse material (CSAM) when a user uploads photos to iCloud storage.
A database of known illegal child abuse images is maintained by the US National Center for Missing and Exploited Children (NCMEC).
NCMEC stores the images as hashes – a digital “fingerprint” of the illegal material.
Cloud service providers such as Microsoft, Google and Facebook already check images against these hashes to make sure people are not sharing CSAM.
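The hash-matching described above can be sketched roughly as follows. This is a minimal illustration only: it uses Python's cryptographic SHA-256 as a stand-in fingerprint, whereas real systems such as Microsoft's PhotoDNA or Apple's NeuralHash use proprietary *perceptual* hashes designed to still match after resizing or re-encoding. The function names and the mock database are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's 'fingerprint'.

    Stand-in only: a cryptographic hash matches byte-identical files,
    while real CSAM scanners use perceptual hashes.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Mock database of fingerprints of known illegal images.
# Note that only hashes are stored, never the images themselves.
known_hashes = {fingerprint(b"known-bad-image-bytes")}

def check_upload(image_bytes: bytes) -> bool:
    """Flag an upload if its fingerprint matches the database."""
    return fingerprint(image_bytes) in known_hashes

print(check_upload(b"known-bad-image-bytes"))   # True: flagged
print(check_upload(b"ordinary-holiday-photo"))  # False: no match
```

Storing hashes rather than images is the point of the scheme: providers can check for known material without holding or viewing it.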
Apple decided to do the same thing, but said it would do the image-matching on a user’s iPad or iPhone before the photo was uploaded to iCloud.