
Apple scrubs controversial CSAM detection feature from webpage but says plans haven't changed

Apple has updated a webpage on its child protection features to remove all references to the controversial Child Sexual Abuse Material (CSAM) detection feature it first announced in August. The change, which was spotted by MacRumors, appears to have happened between December 10 and December 13. But despite the changes to its website, the company says its plans for the feature haven't changed.

Two of the three child safety features, which shipped earlier this week with iOS 15.2, are still present on the page, which is titled "Expanded Protections for Children." However, references to the more controversial CSAM detection, whose launch was delayed after protests from privacy advocates, have been removed.

When reached for comment, Apple spokesperson Shane Bauer said the company's position hasn't changed since September, when it first announced it would delay the launch of CSAM detection. In its September statement, the company said: "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

Importantly, Apple's statement does not say the feature has been canceled altogether, and documents explaining how the functionality works are still live on Apple's site.

Apple's CSAM detection feature was controversial when it was announced because it involves taking hashes of iCloud Photos images and comparing them against a database of hashes of known child sexual abuse imagery. Apple says this approach allows it to report users to the authorities if they are known to upload child abuse imagery, without compromising customer privacy more broadly. It also says that the encryption of user data is unaffected and that the analysis runs on the device itself.
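To make the general idea concrete, here is a minimal sketch of on-device hash matching, assuming a plain hex-encoded SHA-256 digest and a hypothetical loadKnownHashDatabase() helper. Apple's actual design is far more involved (it describes a perceptual "NeuralHash" and cryptographic protocols so that match results aren't revealed below a threshold), so treat this only as an illustration of "hash the photo and compare it against a database of known hashes."

```swift
import Foundation
import CryptoKit

// Simplified illustration only; NOT Apple's actual NeuralHash/PSI protocol.

// Hypothetical database of known-bad image hashes (hex strings).
// In Apple's described design, the database ships in a blinded form.
func loadKnownHashDatabase() -> Set<String> {
    return []  // placeholder for illustration
}

let knownHashes = loadKnownHashDatabase()

// Hex-encode a SHA-256 digest of the image bytes.
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// A photo is flagged only if its hash appears in the known-hash database;
// photos that don't match reveal nothing about their contents.
func isKnownMatch(photoData: Data) -> Bool {
    knownHashes.contains(hexDigest(of: photoData))
}
```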

But critics argue that the system risks undermining Apple's end-to-end encryption. Some have referred to it as a "backdoor" that governments around the world could pressure Apple to expand to cover content beyond CSAM. For its part, Apple has said it will "not accept any government request to expand it" beyond CSAM.

While the CSAM detection feature has yet to receive a new launch date, Apple has released two of the other child protection features it announced in August. One is designed to warn children when they receive images containing nudity in Messages, while the other provides additional information when searching for terms related to child abuse through Siri, Spotlight, or Safari Search. Both rolled out with iOS 15.2, which was released earlier this week and appears to have prompted Apple to update its webpage.
