Today, Apple confirmed it would “take additional time over the coming months to collect input and make improvements” before releasing features it announced in August that raised serious privacy concerns among activists, technologists, and even Apple’s own employees.
Designed with the intention of combating the distribution of child sexual abuse material (CSAM), one of the planned features would have introduced a system for comparing the hashes of all images set to be uploaded to iCloud against a database of hashes of known CSAM, stored within the device’s operating system.
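The core idea of such a system can be illustrated with a simplified sketch: hash the image, then check for membership in an on-device database of known hashes. Note this is a conceptual illustration only, using an ordinary cryptographic hash; Apple’s actual proposal relied on a perceptual hash (“NeuralHash”) and a blinded database queried via private set intersection, which is far more involved. All names below are hypothetical.

```python
import hashlib

# Hypothetical on-device database of hashes of known CSAM.
# (This entry is just the SHA-256 digest of b"test", used for illustration.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def flag_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image's hash matches an entry in the database,
    i.e., the image would be flagged before upload to cloud storage."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

One reason critics highlighted the abuse potential of this design is visible even in the sketch: the matching logic is entirely driven by the contents of the hash database, so whoever controls that database controls what gets flagged.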
In a meeting with Access Now on August 19, 2021, senior staff at Apple confirmed that these features were not scheduled for the initial launch of iOS 15.0, planned for the coming weeks, but were instead to be rolled out in a later iOS 15 software update over the next several months.
“Apple’s formal commitment to press pause on this feature is an important step toward restoring trust in its approach to privacy and security,” said Carolyn Tackett, Deputy Advocacy Director at Access Now. “We look forward to an even stronger commitment from Apple to leave this update on the shelf for good. Introducing this system into all iOS devices would severely undermine Apple’s reputation as a defender of privacy and leader on device security around the world, and it would open the door to extensive abuse that, once opened, would be nearly impossible for Apple to shut.”
“Child safety is extremely important, as is the work that authorities do to prevent the production of child sexual abuse materials (CSAM) and detect and eliminate their spread online. The private sector can do its part by strengthening the privacy and security of all users, including children, a vulnerable group whose rights and interests often are overlooked. End-to-end encryption contributes to ensuring such privacy and security, and any measures, such as those proposed by Apple, that circumvent such encryption, would ultimately do more harm than good, and be exploited for purposes far beyond those for which they were implemented,” said Isedua Oribhabor, Business and Human Rights Lead at Access Now.
“We welcome Apple’s decision to adopt a more open and transparent process for fully assessing these features’ significant impact on privacy, security, and free expression. This is what should have been done at the outset. Online safety is a goal that can only be achieved through sustained multistakeholder engagement that accounts for technological innovations’ immediate and long-term impact on human rights and for the unique threats vulnerable communities around the world are facing,” said Raman Jit Singh Chima, Global Cybersecurity Lead at Access Now.
We hope that Apple follows up on its decision to hit pause with a commitment to permanently remove this feature from its plans and to maintain its leadership on device security and privacy. We also hope to see sustained, in-depth consultation that includes engagement with civil society organizations, technical experts, privacy advocates, researchers, individuals impacted by its technologies, and vulnerable communities in particular, with a focus on empowering users and strengthening people’s privacy and security, without side-stepping end-to-end encryption.