Apple Officially Unveils “Expanded Protections for Children”

Let’s figure out how this new system works and whom, if anyone, it really threatens.

Apple has posted fairly detailed information about the new system on a dedicated page. What does it say?

1. Communication safety in Messages
The Messages app will get new tools to warn children and their parents when they receive or send sexually explicit photos.

When content of this kind is received, the photo will be blurred and the child will be warned, offered helpful resources, and reassured that it is okay not to view the photo. As an additional precaution, the child can also be told that, to keep them safe, their parents will be notified if they do view the photo.

Similar protections apply if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a notification if the child decides to send it anyway.

Messages uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit. The feature is designed so that Apple does not gain access to the messages themselves.
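
To make the idea concrete, here is a minimal sketch of what such an on-device check could look like. The `nudity_score` model and the `SENSITIVITY_THRESHOLD` value are invented for illustration; Apple has not published its actual model or API.

```python
from dataclasses import dataclass

# Assumed value for illustration only; the real classifier and its cutoff are not public.
SENSITIVITY_THRESHOLD = 0.9

def nudity_score(image_bytes: bytes) -> float:
    """Placeholder for the on-device ML model; returns a score in [0, 1]."""
    # A real implementation would run a trained image classifier entirely on the device.
    return 0.0

@dataclass
class ScreeningResult:
    is_sensitive: bool         # should the photo be blurred and the child warned?
    offer_parent_notice: bool  # should the "your parents will be notified" flow apply?

def screen_attachment(image_bytes: bytes, is_child_account: bool) -> ScreeningResult:
    sensitive = nudity_score(image_bytes) >= SENSITIVITY_THRESHOLD
    return ScreeningResult(
        is_sensitive=sensitive,
        offer_parent_notice=sensitive and is_child_account,
    )
```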

2. Detection of CSAM (Child Sexual Abuse Material)
Another major issue is the online dissemination of child sexual abuse material (CSAM). CSAM refers to content depicting sexually explicit activities involving a child.

To address this issue, the new technology will allow Apple to detect known CSAM images stored in iCloud Photos. This will allow Apple to report such cases to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive CSAM reporting center and works in collaboration with law enforcement agencies throughout the United States.

Apple’s method for detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching against a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
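
As a rough illustration of on-device hash matching, here is a minimal sketch. It uses an ordinary cryptographic hash and a plain set lookup; Apple’s actual system uses its own perceptual hashing and a blinded database, and the device never directly learns the result of the comparison, so treat this only as a conceptual simplification.

```python
import hashlib

def image_hash(image_bytes: bytes) -> bytes:
    """Stand-in for a perceptual hash. A real perceptual hash is robust to
    resizing, cropping and re-encoding; this cryptographic hash is not."""
    return hashlib.sha256(image_bytes).digest()

# Opaque set of hashes derived from the NCMEC database and shipped with the OS.
known_csam_hashes: set[bytes] = set()

def matches_known_csam(image_bytes: bytes) -> bool:
    # In Apple's real design this comparison is wrapped in cryptography so
    # that neither the device nor Apple learns the outcome directly.
    return image_hash(image_bytes) in known_csam_hashes
```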

Before an image is saved to iCloud Photos, the device matches that image against the known CSAM hashes. This matching is carried out using a cryptographic technique called private set intersection, which determines whether there is a match without revealing the result. The device generates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos together with the image.
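
Apple’s actual voucher format is defined by its private set intersection protocol and is not reproduced here, but conceptually each voucher carries two pieces, sketched below with invented field names.

```python
from dataclasses import dataclass

@dataclass
class SafetyVoucher:
    # Cryptographic header derived from the image hash; the server can only make
    # use of it if the hash corresponds to an entry in the blinded CSAM database.
    crypto_header: bytes
    # Payload encrypted under an account-level key; per Apple's description it
    # contains a secret share of that key plus a low-resolution "visual
    # derivative" of the image, used later for human review.
    encrypted_payload: bytes
```

The key point is that a single voucher on its own tells Apple nothing; its contents only become interpretable in combination with enough other matching vouchers, which is where threshold secret sharing comes in.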

Using another technique called threshold secret sharing, the system ensures that the contents of the safety vouchers cannot be interpreted by Apple until the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy: less than a one-in-one-trillion chance per year that a given account will be flagged incorrectly.
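
Here is a toy, self-contained illustration of the threshold idea using Shamir’s secret sharing over a prime field: an account-level key is split into shares, one share rides along in each matching voucher, and the key (and with it the vouchers’ contents) only becomes recoverable once at least `threshold` shares have accumulated. The parameters are invented, and Apple’s actual construction differs in detail.

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime used as the field modulus

def make_shares(secret: int, threshold: int, count: int) -> list[tuple[int, int]]:
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# One share would ride along in each matching voucher; with fewer than
# `threshold` shares the key is unrecoverable, with `threshold` or more
# it is recovered exactly.
account_key = 123456789
shares = make_shares(account_key, threshold=30, count=100)
assert reconstruct(shares[:30]) == account_key
```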

Only when the threshold is exceeded does the cryptography allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm the match, disables the user’s account, and sends a report to NCMEC. A user who believes their account was flagged by mistake can file an appeal to have it reinstated.

3. Expanded guidance in Siri and Search
Apple is also expanding guidance in Siri and Search, providing additional resources to help children and parents stay safe online and get help in unsafe situations. For example, users who ask Siri how to report CSAM or child exploitation will be pointed to resources explaining where and how to file a report.

Siri and Search are also being updated to intervene when users search for CSAM-related queries. These interventions will explain to users that interest in this topic is harmful and problematic, and point them to partner resources for getting help with the issue.

The Free News editors would like to draw your attention to several important points:

  • This is not about scanning all the photos stored on users’ devices. There is a database of hashes of already known images depicting child abuse, and a photo is only flagged if its hash matches an entry in that database.
  • If a hash match is found, the photo will indeed be passed to a human reviewer. But! Reviewers will not see the full image, only a low-resolution derivative sufficient to judge whether it matches a photo from the CSAM database.
  • The actual matching takes place on the user’s device, before the photos are uploaded to the iCloud library.
  • An account is only reported once the number of matching photos on it crosses a certain threshold.
  • The whole system will only start working with the release of iOS 15, and only in the US (for now). Moreover, if iCloud Photo Library is disabled, no matching is performed on the device at all (see the short end-to-end sketch below).
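
To tie these points together, here is a self-contained sketch of the overall client-side flow. Every function is an illustrative placeholder rather than an Apple API.

```python
def make_voucher(image_bytes: bytes) -> bytes:
    """Stand-in for deriving a safety voucher (see the voucher sketch above)."""
    return b"voucher"

def upload_to_icloud(image_bytes: bytes, voucher: bytes) -> None:
    """Stand-in for uploading the photo together with its voucher."""
    pass

def process_photo(image_bytes: bytes, icloud_photos_enabled: bool) -> None:
    # Nothing is matched or uploaded when iCloud Photo Library is disabled.
    if not icloud_photos_enabled:
        return
    voucher = make_voucher(image_bytes)  # the device never learns the match result
    upload_to_icloud(image_bytes, voucher)
```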

Afterword
Apple has long been known for zealously protecting user data. Several years ago, the company faced off against the FBI in court, and now Tim Cook is in open confrontation with the head of Facebook, Mark Zuckerberg. Of course, hashing technology is not perfect, but at the current stage of technological development nothing better has yet been invented to balance the protection of personal data against the need to step up the prosecution of those who abuse children.

Most importantly, Facebook, Dropbox, Google, Microsoft and other companies that provide cloud storage for user photos have long been scanning them against the same database that Apple is now plugging into. Dropbox reported 21,000 such hits in 2020, while Facebook reported a staggering 15 million back in 2019. The only difference is that all these companies scan photos in the cloud, whereas Apple is building the scanner into the operating system itself.

Apple has published a detailed description of the entire process (PDF, 2 MB).

Alexandr Ivanov, Web Developer and Editor
Alexandr Ivanov earned his Licentiate Engineer in Systems and Computer Engineering from the Free International University of Moldova. Since 2013, Alexandr has been working as a freelance web programmer.
