Apple reportedly plans to start scanning iPhones in the U.S. to find images of child abuse

Apple is reportedly planning an update that will allow it to scan iPhones for images of child sexual abuse. According to the Financial Times, the company has briefed security researchers on the "neuralMatch" system, which would "continuously scan photos stored on a US user's iPhone that have also been uploaded to their iCloud backup system."

The system would "proactively alert a team of human reviewers if it believes illegal imagery is detected," and those reviewers would contact law enforcement if the images are verified. The neuralMatch system, which was trained using a database from the National Center for Missing & Exploited Children, will initially be limited to iPhones in the United States, the report said.

The move would be something of an about-face for Apple, which has previously resisted law enforcement in defense of user privacy. The company famously clashed with the FBI in 2016 after it refused to unlock an iPhone belonging to the man behind the terrorist attack in San Bernardino. CEO Tim Cook said at the time that the government's request would have far-reaching consequences and could effectively create a back door for more government surveillance. (The FBI ultimately turned to an outside security firm to unlock the phone.)

Now, security researchers are raising similar concerns. While there is broad support for increasing efforts to combat child abuse, the researchers who spoke with the FT said the move could open the door for authoritarian regimes to spy on their citizens, since a system designed to detect one type of image could be extended to other types of content, such as terrorism-related material or anything else perceived as "anti-government."

At the same time, Apple and other companies have faced mounting pressure to find ways to cooperate with law enforcement. As the report points out, social media platforms and cloud storage providers like iCloud already have systems for detecting images of child sexual abuse, but extending such efforts to images stored on the device would mark a significant shift for the company.

Apple declined to comment to the FT, but the company could release more details about its plans "as early as this week."

Update 8/5 16:00 ET: Apple has confirmed plans to start testing a system that can detect images of child sexual abuse stored in iCloud Photos in the United States. "Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations," the company said in a statement. "Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices."
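
To make the hash-matching idea concrete, here is a minimal, hypothetical sketch in Swift of what checking an image against a local database of known hashes could look like. It is not Apple's actual NeuralHash/neuralMatch implementation: it substitutes a plain SHA-256 digest for Apple's perceptual hash, and the knownHashes set and file path are invented for illustration.

import Foundation
import CryptoKit

// Minimal illustrative sketch only; not Apple's NeuralHash/neuralMatch system.
// Assumptions: a plain SHA-256 digest stands in for Apple's perceptual hash, and
// knownHashes stands in for the (in reality, blinded and unreadable) on-device
// database of known CSAM image hashes described in Apple's statement.

// Hypothetical local database of known image hashes (hex-encoded placeholder).
let knownHashes: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000" // placeholder, not a real hash
]

// Compute a hex-encoded SHA-256 digest of an image file's raw bytes.
func imageHash(of fileURL: URL) throws -> String {
    let data = try Data(contentsOf: fileURL)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Return true if the image's hash appears in the local database.
// In the system Apple describes, matches would only surface for human review
// after crossing a threshold; this sketch just checks set membership.
func matchesKnownHash(_ fileURL: URL) -> Bool {
    guard let hash = try? imageHash(of: fileURL) else { return false }
    return knownHashes.contains(hash)
}

// Example usage with a hypothetical path:
// print(matchesKnownHash(URL(fileURLWithPath: "/tmp/photo.jpg")))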

The update will roll out later this year, alongside several other child safety features, including new parental controls that can detect explicit photos in children's Messages.



