Following a report that the company was working on a tool that scans iPhones for child abuse images, Apple has published a post providing more details about its child safety efforts. With the release of iOS 15, watchOS 8 and macOS Monterey later this year, the company says it will introduce various child protection features in Messages, Photos and Siri.
For starters, the Messages app will include new notifications that warn children, as well as their parents, when they either send or receive sexually explicit photos. When someone sends an inappropriate picture to a child, the app will blur it and display several warnings. “It’s not your fault, but sensitive photos and videos can be used to hurt you,” reads one of the notices, according to a screenshot shared by Apple.
As an added precaution, the company says Messages can also notify parents if their child chooses to view a sensitive photo. “Similar protections are available if a child attempts to send sexually explicit photos,” Apple states. The company notes that the feature uses on-device machine learning to determine whether a photo is explicit, and that Apple itself does not have access to the messages. The feature will be available to accounts set up as families in iCloud.
Apple will also introduce new software tools in iOS and iPadOS that will allow the company to detect when someone uploads content to iCloud that shows children involved in sexually explicit acts. The company says it will use the technology to notify the National Center for Missing and Exploited Children (NCMEC), which in turn will work with law enforcement agencies across the United States. “Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind,” the company claims.
Rather than scanning photos after they are uploaded to the cloud, the system performs matching on the device against a database of known CSAM images provided by NCMEC and other organizations. The company says the database is stored in an unreadable form thanks to image hashing, which turns photos into a kind of digital fingerprint.
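Apple’s actual algorithm, NeuralHash, is a neural-network-based hash and is not public; the following toy sketch only illustrates the general idea behind image hashing: a photo is reduced to a short fingerprint that stays stable under small changes (such as brightness shifts), so two copies of the same image produce nearly identical hashes. The 8x8 grayscale representation and “average hash” method here are illustrative assumptions, not Apple’s scheme.

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints, 0-255) to a 64-bit int.

    Each bit records whether a pixel is brighter than the image's average,
    so the fingerprint survives uniform brightness changes.
    """
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

# Two near-identical "images": the second is slightly brightened.
img = [i * 4 for i in range(64)]
img_bright = [min(255, p + 3) for p in img]

d = hamming_distance(average_hash(img), average_hash(img_bright))
print(d)  # a small distance, suggesting the same underlying image
```

Matching then amounts to checking whether the distance between two fingerprints falls under some threshold, without ever comparing the photos pixel by pixel.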
A cryptographic technique called private set intersection allows Apple to determine whether there is a match without seeing the result of the process. When a match occurs, the iPhone or iPad creates a cryptographic safety voucher that encrypts the upload, along with additional information about it. Another technique, called threshold secret sharing, prevents the company from seeing the contents of those vouchers unless an account crosses a threshold of known CSAM content. “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” the company said.
Developing …
All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.