Recently, Apple announced its plan to scan U.S. customers’ phones and computers for child sex abuse images. However, employees at Apple and leading technology policy groups are skeptical, viewing the move as a notable turn away from the company’s privacy-focused culture.
Buckling under the strain
Apple’s employees are uneasy about the move. An internal Slack channel has been flooded with more than 800 messages voicing their concerns, Reuters reported. Many worry that repressive governments could exploit the NeuralHash technology for censorship or arrests. Others worry that Apple is damaging its leading reputation for protecting privacy.
The concerns come mainly from employees outside security and privacy roles. The pushback marks a shift for a company where a strict code of secrecy around new products is part of the corporate culture.
Technology policy groups’ reaction
Core security employees at Apple did not appear prominently in the Slack threads. Some of them said the solution is a reasonable response to pressure to crack down on illegal material. Meanwhile, two groups, the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT), released newly detailed objections to Apple’s plan. They argue that while the U.S. government cannot legally scan wide swaths of household equipment for contraband, or compel others to do so, Apple is doing it voluntarily, with potentially dire consequences.
The EFF said police and other agencies could invoke recent laws requiring “technical assistance” in investigating crimes, which might prove detrimental in the long run.
In an interview, CDT Project Director Emma Llanso said, “What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in! It seems so out of step from everything that they had previously been saying and doing.”
Apple’s justification and previous strong stance on data privacy
Apple has not responded to these objections in detail. However, it has said it will refuse requests from governments to use the system to check phones for anything other than illegal child abuse material.
Many people point to Apple’s stand against the FBI in 2016, when the company successfully fought a court order to develop a new tool to crack into a terrorism suspect’s iPhone. The company said at the time that such a tool would inevitably be used to break into other devices for other reasons.
Other critics’ concerns
Many critics see a fundamental problem with NeuralHash: the company is making cautious policy decisions today that governments could force it to change tomorrow. Now that the capability exists, such demands would be very difficult to resist, since any country’s legislature or courts could require that any one of those elements be expanded. Some of those nations, such as China, represent markets that are hard to refuse, and Apple is already making compromises there.