WhatsApp says it will not scan your photos for child abuse

Photo: Carl Court/Staff (Getty Images)

Apple’s new tool for flagging potential child abuse in iPhone photos is already causing controversy. On Friday, just a day after it was announced, Will Cathcart, head of Facebook’s messaging app WhatsApp, said the company would refuse to adopt the software, arguing that it introduces a number of legal and privacy problems.

“I read the information that Apple released yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world,” Cathcart wrote on Twitter. “People have asked if we’ll adopt this system for WhatsApp. The answer is no.”

In a series of tweets, Cathcart elaborated on that concern, citing the potential for governments to co-opt the software for spying and the possibility that the software could violate users’ privacy without leaving a trace.

“Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out,” he wrote. “Why not? How will we know how often mistakes are violating people’s privacy?”

In announcing the software on Thursday, Apple said the update is slated for release in late 2021 as part of a suite of changes the company plans to implement to protect children from sexual predators. As Gizmodo previously reported, the proposed tool, which would use a “neural matching function” called NeuralHash to determine whether the images on a user’s device match known fingerprints of child sexual abuse material (CSAM), has already caused some alarm among security experts.
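To make the general mechanism concrete, here is a minimal sketch of perceptual-hash matching in Python. This is not Apple’s NeuralHash (which relies on a neural network); the simple average-hash function, the fingerprint database, and the match threshold below are hypothetical stand-ins, and the example assumes the Pillow imaging library is installed.

```python
# Toy "average hash" (aHash): similar images yield nearby bit strings,
# so matching is a Hamming-distance test rather than exact equality.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image, grayscale it, and emit one bit per pixel:
    1 if the pixel is brighter than the mean, else 0."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Hypothetical database of known fingerprints and a made-up threshold.
KNOWN_HASHES = {0x8F3C_1A2B_44D0_9E71}  # placeholder values
THRESHOLD = 5                           # bits of tolerated difference

def matches_known(path: str) -> bool:
    h = average_hash(path)
    return any(hamming_distance(h, k) <= THRESHOLD for k in KNOWN_HASHES)
```

Because the comparison tolerates a few differing bits, resized or re-encoded copies of a known image can still match, which is what distinguishes perceptual hashing from cryptographic hashing.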

In an August 4 tweet thread, Matthew Green, an associate professor at the Johns Hopkins Information Security Institute, warned that the tool could eventually become a precursor to “adding surveillance to encrypted messaging systems.”

“I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea,” Green tweeted. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple’s servers if too many appear.”

But according to Apple, Cathcart’s characterization of the software as something that “scans” users’ devices is not quite accurate. While a scan implies a result, the company said, the new software would only run a comparison, using the NeuralHash tool, of any images a given user chooses to upload to iCloud. The results of that comparison would be contained in a cryptographic safety voucher, essentially a bag of interpretable bits of data held on the device, and the contents of that voucher would need to be sent off the device to be read. In other words, Apple would not gather any data from individual users’ photo libraries as a result of such a scan, unless those libraries held troves of child sexual abuse material (CSAM).
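The threshold behavior described above, where vouchers stay unreadable until enough matches accumulate, is conceptually similar to threshold secret sharing. The following toy Python sketch is not Apple’s actual protocol; the field modulus, threshold, and key are invented. It shows how a server could be made unable to reconstruct a decryption key until a threshold number of shares, one per matching voucher, has arrived:

```python
# Toy threshold-disclosure sketch using Shamir secret sharing: each
# "match" releases one share of a decryption key; below the threshold
# the server learns nothing. Illustration only, not Apple's protocol.
import random

PRIME = 2**61 - 1   # prime field modulus (toy choice)
THRESHOLD = 3       # shares needed to reconstruct the key

def make_shares(secret: int, n: int, k: int = THRESHOLD):
    """Split `secret` into n Shamir shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)        # hypothetical per-account key
shares = make_shares(key, n=10)

# With fewer than THRESHOLD shares, reconstruction yields garbage;
# at or above the threshold, the key is recovered exactly.
assert reconstruct(shares[:THRESHOLD]) == key
```

A core property of this scheme is that any set of fewer than the threshold number of shares is consistent with every possible secret, so individual vouchers reveal nothing on their own.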

According to Apple, while there is some potential for misreads, the rate of accounts falsely flagged for manual review would be less than one in one trillion per year.
