Apple explains its CSAM-scanning about-face: ‘Scanning for one type of content… opens the door for bulk surveillance’

Last December, after much opposition, including, voluminously, from MacDailyNews, Apple killed an effort to build an iCloud photo-scanning tool for detecting child sexual abuse material (CSAM) in the storage service and instead introduced Advanced Data Protection for iCloud, which uses end-to-end encryption to provide Apple’s highest level of cloud data security. With it, users can choose to further protect important iCloud data, including iCloud Backup, Photos, Notes, and more.


This week, a new child safety group known as Heat Initiative told Apple that it is organizing a campaign to demand that the company “detect, report, and remove” child sexual abuse material from iCloud and offer more tools for users to report CSAM to the company.

Today, in a rare move, Apple responded to Heat Initiative, outlining its reasons for abandoning the development of its iCloud CSAM scanning feature and instead focusing on a set of on-device tools and resources for users known collectively as “Communication Safety” features…

“Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” Erik Neuenschwander, Apple’s director of user privacy and child safety, wrote in the company’s response to Heat Initiative. He added… “Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit,” Neuenschwander wrote. “It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”
