Apple to scan iPhones for child sexual abuse materials

Source: PA MEDIA

Daily US Times: Apple has announced details of a scanning system designed to detect child sexual abuse material (CSAM) on iPhones and iPads.

Before an image is stored in iCloud Photos, the new technology, which Apple claims is extremely accurate, will check it for matches against known CSAM.

The iPhone maker said that if a match is found, a human reviewer will assess it and report the user to law enforcement.

There are privacy concerns, however, that the technology could be expanded to scan phones for other prohibited content, including political speech.

Experts worry the system could be used by authoritarian governments to spy on their citizens.

Apple said that new versions of iOS and iPadOS will have “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”. The new versions are due to be released later this year.

The technology works by comparing pictures to a database of known child sexual abuse images compiled by the National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.

Those images are translated into “hashes”, numerical codes that can be “matched” to an image on an iPhone or iPad.

The tech giant says the system will also catch edited but similar versions of original images.

Apple said: “Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.”
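To make the matching step concrete, here is a minimal sketch in Python using the open-source imagehash library and a perceptual hash. This is illustrative only: Apple has not published its own hashing algorithm, and the file names and distance threshold below are invented for the example.

from PIL import Image
import imagehash

# Hashes of known images; in Apple's system this role is played by the
# NCMEC-supplied database of known CSAM hashes. File names are hypothetical.
known_hashes = {
    imagehash.phash(Image.open(path))
    for path in ["known_1.png", "known_2.png"]
}

def matches_known(image_path, max_distance=5):
    # Hash the candidate image and compare it against every known hash.
    # Subtracting two imagehash values gives the Hamming distance in bits;
    # tolerating a small distance is what lets edited but similar versions
    # of an image still match.
    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= max_distance for known in known_hashes)

Because a perceptual hash changes only slightly when an image is resized, cropped or recompressed, a small distance threshold still catches edited copies, which is the property described above.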

The company claimed the technology had “less than a one in one trillion chance per year of incorrectly flagging a given account”.
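Apple has not published the mathematics behind that figure, but one common way to reach such a low account-level rate is to require several independent matches before an account is flagged. The toy calculation below, with entirely made-up numbers, shows how a match threshold drives the probability down:

from math import comb

p = 1e-6      # assumed false-match probability for a single photo
n = 10_000    # assumed photos uploaded by one account in a year
t = 5         # assumed number of matches required before flagging

# P(at least t false matches among n photos) = 1 - P(fewer than t).
prob_flagged = 1 - sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(t))
print(f"{prob_flagged:.1e}")  # about 8e-13 with these illustrative numbers

Under these assumptions, the chance of wrongly flagging a given account falls below one in a trillion even though each individual photo comparison is far less reliable.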
