US civil rights organizations are opposing Apple's plan to introduce a nudity filter for iMessage, initially in the US, and to scan photos uploaded to iCloud for depictions of child sexual abuse. The Center for Democracy & Technology (CDT) fears that the initiative will ultimately endanger the secure transmission of messages via the iPhone manufacturer's messaging service "all over the world".
An “Infrastructure for Surveillance and Censorship”
The child protection pursued by Apple is undoubtedly an important goal, the CDT concedes. The civil society organization, however, is "deeply concerned" that the changes "in reality bring new risks for children and all users and represent a clear departure from long-established data protection and security protocols".
"Apple is replacing its industry-standard, end-to-end encrypted messaging system with an infrastructure for surveillance and censorship that is prone to abuse not just in the US, but around the world," complains Greg Nojeim, co-director of CDT's Security & Surveillance Project. The company should forgo this step "and restore the trust of its users in the security and integrity of their data on Apple devices and services".
Dangerous precedent feared
The company is not creating an alternative to a backdoor, as it itself claims, but quite clearly "a backdoor", the organization explains. iMessage would then no longer offer end-to-end encryption. This would also set a dangerous precedent in which one user account is permitted to monitor another: iOS gains a feature that scans images sent via iMessage to child accounts belonging to a family group. On these accounts, Apple performs machine learning-based "client-side scanning" to identify sexually explicit imagery. In the event of a suspected hit, the parents are warned.
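To make the mechanism concrete, here is a minimal sketch of how such client-side scanning could look in principle. The classifier, the decision threshold, and the notification flow are assumptions for illustration only, not Apple's actual implementation:

```python
from dataclasses import dataclass

EXPLICIT_THRESHOLD = 0.9  # assumed decision threshold, not Apple's actual value


@dataclass
class Message:
    image_bytes: bytes
    is_child_account: bool          # recipient is a child account in a family group
    parental_alerts_enabled: bool   # parents opted in to warnings


def classify_image(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML model that scores an image for
    sexually explicit content; returns a score in [0, 1]."""
    return 0.0  # placeholder: a real model would run inference here


def handle_incoming_image(msg: Message) -> str:
    # Scanning applies only to child accounts in a family group;
    # all other traffic is delivered untouched.
    if not msg.is_child_account:
        return "delivered"

    if classify_image(msg.image_bytes) >= EXPLICIT_THRESHOLD:
        # The image is blurred and the child is warned; on a suspected
        # hit, the parents may additionally be notified.
        if msg.parental_alerts_enabled:
            return "blurred, child warned, parents notified"
        return "blurred, child warned"
    return "delivered"


if __name__ == "__main__":
    msg = Message(b"...", is_child_account=True, parental_alerts_enabled=True)
    print(handle_incoming_image(msg))  # "delivered" with the stub classifier
```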
Under the new photo rules, a database of hashes of known child sexual abuse material will also be stored on users' iPhones, the CDT explains. After a comparison carried out with unpublished algorithms and a subsequent human review, Apple sends, among other things, a report to the National Center for Missing and Exploited Children (NCMEC).
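The matching step can be sketched in the same hedged way: images queued for iCloud upload are hashed on the device and compared against the stored database. The perceptual-hash stand-in, the match threshold, and the review hook below are assumptions; as noted above, the actual comparison algorithms are unpublished:

```python
import hashlib

MATCH_THRESHOLD = 30  # assumed: escalation only after multiple matches


def perceptual_hash(image_bytes: bytes) -> bytes:
    """Stand-in for a perceptual hash, i.e. one designed so that
    visually similar images yield the same digest. A cryptographic
    hash is NOT perceptual; it is used here only to keep the sketch
    runnable."""
    return hashlib.sha256(image_bytes).digest()


def count_matches(upload_queue: list[bytes], known_hashes: set[bytes]) -> int:
    """Count images queued for iCloud upload whose hashes appear in
    the on-device database of known material."""
    return sum(perceptual_hash(img) in known_hashes for img in upload_queue)


def flag_for_human_review(upload_queue: list[bytes], known_hashes: set[bytes]) -> bool:
    # Only above the threshold is the account escalated to a human
    # review, which could then trigger a report to NCMEC.
    return count_matches(upload_queue, known_hashes) >= MATCH_THRESHOLD
```

The criticism quoted above follows from this structure: the scanning and matching run on the device itself, independently of which hash database is loaded onto it.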