Apple does not intend to cancel its initiative to analyze photos on devices. For an iPhone owner's account to come under suspicion, the analysis system must register at least 30 matches between the user's photos and the samples of child-abuse imagery loaded into the database.
Apple has already held its fourth briefing on the planned rollout of a system for analyzing images uploaded to the cloud from user devices. The company admitted that it did a poor job of explaining the initiative, which drew a sharp backlash from the public. Even its own employees now believe the manufacturer is risking the reputation of a brand that has always stood guard over customer privacy.
Privacy concerns are groundless, Apple officials insist. User photos will be analyzed by comparing them with samples entered into a database. The samples will be collected from several sources: specialized non-profit organizations located in at least two countries. A defined group of people will be given the ability to audit the sample database, which, the company says, rules out the system being used for any other purpose.
Verification starts when the user tries to upload their photos to iCloud. Apple claims that an account falls under suspicion only if the system produces at least 30 positive matches. In that case, an additional verification procedure is carried out by the company's employees, and if the suspicions are confirmed, the appropriate authorities are notified.
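The threshold mechanism described above can be sketched in a few lines. This is a minimal illustration, not Apple's actual implementation (which reportedly relies on a perceptual hashing scheme and cryptographic matching); the hash values, function names, and exact comparison logic here are all hypothetical:

```python
# Hypothetical sketch of threshold-based flagging: an account is referred
# for human review only once the number of uploaded photos whose hashes
# match the sample database reaches the reported threshold of 30.

MATCH_THRESHOLD = 30  # the figure cited by Apple in the article

def count_matches(photo_hashes, known_hashes):
    """Count how many uploaded-photo hashes appear in the sample database."""
    return sum(1 for h in photo_hashes if h in known_hashes)

def should_flag(photo_hashes, known_hashes, threshold=MATCH_THRESHOLD):
    """Flag the account for manual review only at or above the threshold."""
    return count_matches(photo_hashes, known_hashes) >= threshold
```

The point of the threshold is the last line: a handful of accidental matches never triggers review on its own, which is how Apple argues false positives are kept from reaching human reviewers.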
Company representatives also clarified that this number may be lowered in the future as the content-analysis system is refined. Apple did not specify whether public criticism influenced the initiative, but noted that the project is still under development and much may change before release.