Breaking News

Apple's controversial plan to scan devices for photos of child sex abuse

Apple's plan to use image-matching technology on users' devices and iCloud accounts to trace child sexual abuse photos has earned considerable praise from child protection groups. However, privacy and security researchers are uneasy with the plan.

"Apple's expanded protection for children is a game-changer," the Associated Press quoted John Clark, president and CEO of the US National Center for Missing and Exploited Children (NCMEC), as saying. "With so many people using Apple products, these new safety measures have lifesaving potential for children."

However, the New York Times cited Matthew D. Green, a cryptography professor at Johns Hopkins University, as an outright opponent of the plan. "They've been selling privacy to the world and making people trust their devices," he said. "But now they're basically capitulating to the worst possible demands of every government. I don't see how they're going to say no from here on out."

What Apple calls on-device hash matching will arrive later this year with iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.

How it will work
Announcing the technology on Thursday, Apple said it would initially be available only in the US, and it avoided using the term "scanning." Apple says its system performs on-device matching against a database of hashes of known child sexual abuse images, with the image hashes provided by NCMEC and other child protection organizations. Apple said it transforms this database into an unreadable set of hashes that is stored securely on users' devices.
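The idea of matching an image's hash against a database of known hashes can be illustrated with a toy sketch. Note the assumptions: Apple's real system uses a proprietary perceptual hash ("NeuralHash") and a blinded database, neither of which is public; here a plain cryptographic hash and an in-memory set stand in for both, and all image bytes are made up.

```python
import hashlib

# Hypothetical database of known-image hashes, standing in for the
# "unreadable set of hashes" shipped to the device. (Illustrative only;
# the real database is blinded and the real hash is perceptual.)
known_hashes = {
    hashlib.sha256(b"known-abuse-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """On-device check: does this image's hash appear in the database?"""
    return image_hash(image_bytes) in known_hashes

print(matches_database(b"known-abuse-image-bytes"))  # True
print(matches_database(b"holiday-photo-bytes"))      # False
```

A perceptual hash differs from SHA-256 in that visually similar images produce the same or nearby hashes, which is why it can match re-encoded or resized copies of a known image.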

Before an image taken on an Apple device is stored in iCloud Photos, a matching process is performed on the device against the database of known child abuse image hashes. "This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result," Apple says. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos together with the image.
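The key property of the safety voucher described above is that every upload produces an opaque, uniformly shaped blob, so no single voucher reveals whether its image matched. The sketch below is purely conceptual: it is not Apple's voucher format or its private set intersection protocol, and the field names and the XOR "sealing" are invented for illustration.

```python
import hashlib
import os

def make_voucher(image_bytes: bytes, matched: bool) -> dict:
    # The real system hides the match result using private set
    # intersection; here we just seal it under a random per-image key
    # that the server does not hold, to show the shape of the idea.
    key = os.urandom(32)
    sealed = bytes(b ^ k for b, k in zip(bytes([matched]), key))
    return {
        "image_id": hashlib.sha256(image_bytes).hexdigest(),
        "sealed_result": sealed.hex(),  # unreadable without the key
    }

v1 = make_voucher(b"photo-a", matched=True)
v2 = make_voucher(b"photo-b", matched=False)
# Vouchers have identical structure whether or not there was a match.
print(sorted(v1) == sorted(v2))  # True
```

Because matching and non-matching vouchers are indistinguishable on their own, the server learns nothing from any individual upload; the match results only become readable under the threshold mechanism described next.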

"Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM (child sexual abuse material) content," Apple says. "The threshold is set to provide an extremely high level of accuracy and ensures less than a one-in-one-trillion chance per year of incorrectly flagging a given account."
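Threshold secret sharing is a standard cryptographic idea: a secret is split into shares such that it can be reconstructed only when at least a threshold number of shares are combined. The classic construction is Shamir's scheme, sketched minimally below. This illustrates the principle Apple names, not Apple's implementation; the prime, threshold, and secret are arbitrary demo values.

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime, large enough for this demo

def make_shares(secret: int, threshold: int, n_shares: int):
    # Random polynomial of degree threshold-1 with the secret at x=0.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def recover(shares):
    # Lagrange interpolation at x=0 recovers the polynomial's constant term.
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

secret = 123456789
shares = make_shares(secret, threshold=3, n_shares=5)
print(recover(shares[:3]) == secret)  # True: threshold met
# With only 2 shares, interpolation yields garbage (except with
# negligible probability), so the secret stays hidden below threshold.
print(recover(shares[:2]) == secret)
```

In Apple's described system, the role of the "secret" is the decryption capability for the vouchers: until an account accumulates enough matching vouchers, Apple holds too few shares to decrypt any of them.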

"Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images," Apple says. "Apple then manually reviews each report to confirm there is a match, disables the user's account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged, they can file an appeal to have their account reinstated."
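The decision flow Apple describes can be summarized in a few lines. Everything here is hypothetical: Apple has not published the actual threshold value, and the function name, return strings, and numeric cutoff below are invented for illustration.

```python
THRESHOLD = 30  # assumed value for illustration; the real threshold is not public

def account_action(match_count: int, human_review_confirms: bool) -> str:
    """Sketch of the reporting flow: threshold, then manual review, then report."""
    if match_count <= THRESHOLD:
        return "no action"  # vouchers remain uninterpretable below threshold
    if not human_review_confirms:
        return "no action"  # manual review found no true match
    return "disable account and report to NCMEC"

print(account_action(5, True))    # no action
print(account_action(31, True))   # disable account and report to NCMEC
```

The human-review step sits between the cryptographic signal and any enforcement, which is how Apple says false positives are caught before an account is disabled.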

Apple says the approach lets it provide NCMEC and law enforcement with actionable information about known exploitative images of children while protecting user privacy. "Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM," it says.

Apple has published a technical description of how the system works on its website.

New tools in Messages
In addition, Apple said its Messages app will add new tools to warn children and their parents when they receive or send sexually explicit photos.

When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured that it is okay if they do not want to view the photo, Apple said. The child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos: the child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

Messages uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages themselves, the company said.
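The Messages feature amounts to a local classification step followed by a policy decision, which can be sketched as below. The classifier itself, the score threshold, and the function and field names are all hypothetical; Apple has not published its model or cutoffs.

```python
def handle_incoming_image(explicit_score: float, is_child_account: bool) -> dict:
    """Toy policy for the Messages feature.

    explicit_score: output of a hypothetical on-device classifier in [0, 1].
    """
    if is_child_account and explicit_score > 0.9:  # assumed cutoff
        return {"blur": True, "warn_child": True}
    return {"blur": False, "warn_child": False}

print(handle_incoming_image(0.95, is_child_account=True))
```

The point of the design is that both the classification and the decision run on the device, so the image and the verdict never have to leave it.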

TechCrunch notes that most cloud services, including Dropbox, Google and Microsoft, already scan user files for content that might violate their terms of service or be potentially illegal. Apple, by contrast, has long resisted scanning users' files in the cloud, giving users the option to encrypt their data before it reaches Apple's iCloud servers.
