Apple on Thursday provided its fullest explanation yet for why it abandoned its controversial plan last year to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos.
Apple's statement, shared with Wired and reproduced below, came in response to child safety group Heat Initiative's demand that the company "detect, report, and remove" CSAM from iCloud and offer more tools for users to report such content to the company.
"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.
"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."
In August 2021, Apple announced plans for three new child safety features: a system to detect known CSAM images stored in iCloud Photos, a Communication Safety option that blurs sexually explicit photos in the Messages app, and child exploitation resources for Siri. Communication Safety launched in the U.S. with iOS 15.2 in December 2021 and has since expanded to the U.K., Canada, Australia, and New Zealand, and the Siri resources are also available. CSAM detection, however, never launched.
Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company postponed the feature based on "feedback from customers, advocacy groups, researchers, and others." The plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.
Apple's latest response to the issue comes at a time when the encryption debate has been reignited by the U.K. government, which is considering plans to amend surveillance legislation that would require tech companies to disable security features like end-to-end encryption without telling the public.
Apple says it will pull services including FaceTime and iMessage in the U.K. if the legislation is passed in its current form.
Top Rated Comments
The slippery slope commences. Child abuse is such an easy justification for putting censorship infrastructure in place.
Then came the clamor for "hate speech". Then Thought Crime.
No. Child abuse is illegal and only a very few people are involved. The wider privacy issue is far more important.
Just read 1984 (the Stephen Fry audiobook is best) to see the end result, which we can already see happening.
Yes, CSAM is the absolute worst of humanity, and people who cause/encourage it deserve to be punished. But not at the expense of ruining privacy for the entire rest of the human population.
It was a knee-jerk reaction that sounded good on paper, and I'm assuming Apple's corporate bureaucracy (like that of most companies) meant no one spoke up.
See 1984 and Brave New World. They treat them like instructions, not warnings.
First they came for the socialists, and I did not speak out—
Because I was not a socialist.
Then they came for the trade unionists, and I did not speak out—
Because I was not a trade unionist.
Then they came for the Jews, and I did not speak out—
Because I was not a Jew.
Then they came for me—and there was no one left to speak for me.
With the rise of populist-driven, one-trick-pony political movements, it is truly great to see Apple's stance. Privacy is vital, as is the right to free speech.