
After criticism, Apple delays CSAM detection feature

In response to severe criticism, Apple has postponed its controversial CSAM detection feature. On Friday, the company announced that it would take additional time to collect feedback and improve the proposed child safety features. The company said:

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

How does the feature work?

In August, Apple announced a new technology that would allow it to scan photos as they were being uploaded to iCloud.

Once a user chooses to upload an image to iCloud, the iPhone calculates a hash value of the image and matches it against the hash values of known child sexual abuse images in a law enforcement database. Comparing these hash values reveals whether child sexual abuse material is present in a user’s iCloud storage.

These hash values are produced by a one-way function and cannot be used to reconstruct the images.
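To illustrate the general idea, here is a minimal Python sketch of hash-based matching. Apple’s actual system relies on a perceptual hashing algorithm (NeuralHash) combined with cryptographic techniques, so the SHA-256 comparison below is a simplification for illustration only; the file name and the contents of the hash set are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hash digests of known abuse images, standing in
# for the database of hashes supplied by NCMEC.
KNOWN_CSAM_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Return a hex digest for the image file.

    Real systems use a perceptual hash (such as Apple's NeuralHash) so that
    minor edits to an image still produce a matching value; a cryptographic
    hash like SHA-256 is used here only to keep the sketch simple.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_csam(path: Path) -> bool:
    """Check the image's hash against the known-hash database."""
    return image_hash(path) in KNOWN_CSAM_HASHES

if __name__ == "__main__":
    # Hypothetical file that is about to be uploaded to iCloud.
    photo = Path("photo_to_upload.jpg")
    if photo.exists():
        print(matches_known_csam(photo))
```

Because only hash values are compared, the device never needs to send the photo itself to be evaluated against the database.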

Apple’s system checks photos stored on the phone before they are uploaded, rather than checking them after they reach the iCloud servers. Once the technology finds a match, a human review takes place before Apple suspends the account and reports it to law enforcement. The database in question is provided by the National Center for Missing and Exploited Children (NCMEC).

Days after the announcement, Apple clarified that it would only scan for photographs flagged by clearing houses in multiple countries. It further clarified that the system would trigger a human review only after it detects 30 CSAM matches. The company also announced a second feature that scans images sent to children in its iMessage app; this system safeguards children by detecting sexually explicit material and notifying their parents.
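The 30-match threshold can be pictured as a simple per-account counter, as in the Python sketch below. The class, counter, and review-queue step are hypothetical stand-ins for Apple’s cryptographic “safety voucher” mechanism, under which matches only become readable to Apple once the threshold is crossed.

```python
# Sketch of the 30-match threshold that gates human review.
MATCH_THRESHOLD = 30  # number of CSAM matches required before review

class AccountMatchTracker:
    """Hypothetical per-account tracker for CSAM hash matches."""

    def __init__(self) -> None:
        self.match_count = 0
        self.flagged_for_review = False

    def record_match(self) -> None:
        """Record one hash match; flag the account once the threshold is hit."""
        self.match_count += 1
        if self.match_count >= MATCH_THRESHOLD and not self.flagged_for_review:
            self.flagged_for_review = True
            self.queue_for_human_review()

    def queue_for_human_review(self) -> None:
        # A human reviewer confirms the matches before the account is
        # suspended and a report is made to NCMEC.
        print("Account queued for human review")
```

The threshold is intended to keep occasional false matches from ever reaching a reviewer, since a single collision stays well below the reporting bar.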

Criticism of the Features

Both initiatives raised concerns among privacy advocates as well as within the company itself. Critics argue that governments could force Apple to use its scanning technology to look for other types of images, potentially exposing users to surveillance. Apple, however, has clarified that it will not comply with government requests for broader searches.

Following the criticism, the company announced that it would delay the CSAM detection feature. Matthew Green, a cybersecurity researcher at Johns Hopkins University who had earlier criticized the plan, called the delay “promising.”

Further, last week the Electronic Frontier Foundation delivered a petition protesting the technology. The organisation said:

“What Apple intends to do will create an enormous danger to our privacy and security. It will give ammunition to authoritarian governments wishing to expand the surveillance, and because the company has compromised security and privacy at the behest of governments in the past, it’s not a stretch to think they may do so again.”



Rajat Chawda

Rajat is a student at the Institute of Law, Nirma University. From a young age, he has been fascinated by technological advancements, and his interest in gadgets has helped him develop a keen interest in TMT laws during his journey as a law student. He is associated with Mylawrd to further engage himself and learn in this area.
