Apple says it will not allow governments to co-opt CSAM detection tools



Photo: Giuseppe Cacace / AFP (Getty Images)

After much criticism, Apple has doubled down, defending its plans to launch controversial new tools aimed at identifying and reporting child sexual abuse material (CSAM) on its platforms.

Last week, the company announced several pending updates, outlining them in a blog post titled “Expanded Protections for Children.” These new features, which will roll out later this year with the release of iOS 15 and iPadOS 15, are designed to use algorithmic scanning to search for and identify child abuse material on user devices. One tool will scan photos on the device that are shared with iCloud for signs of CSAM, while another feature will scan iMessages sent to and from children’s accounts in an effort to stop minors from sharing or receiving messages that include sexually explicit images. We’ve taken a closer look at both features and the concerns around them here.

The company barely had time to announce its plans last week before it was met with a loud outcry from civil liberties organizations, which characterized the proposed changes as well-intentioned but ultimately a slide toward a dangerous erosion of personal privacy.

On Monday, Apple posted a response to the many concerns that have been raised. The company explicitly denied that its scanning tools could one day be repurposed to hunt for other kinds of material on users’ phones and computers besides CSAM. Critics have worried that a government (ours or someone else’s) could pressure Apple to add to or alter the new features, to turn them, for example, into a broader tool of law enforcement.

However, in a rare case of a corporation making a firm promise not to do something, Apple said definitively that it will not expand the scope of its scanning capabilities. According to the company:

Apple will refuse any such demands [from a government]. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands before to build and deploy government-mandated changes that degrade the privacy of users, and we have steadfastly refused those demands. We will continue to refuse them in the future.

During a follow-up question-and-answer session with reporters on Monday, Apple further clarified that the features are launching only in the U.S. for now. While some have raised concerns about whether a foreign government could corrupt or subvert these new tools to use them as a form of surveillance, Apple said Monday that it would carefully conduct country-by-country legal evaluations before releasing the tools overseas, to ensure there is no possibility of abuse.

Understandably, this whole thing has confused a lot of people, and questions remain about how these features will actually work and what they mean for your privacy and device autonomy. Here are a few points Apple has recently clarified:

  • Oddly enough, iCloud must be enabled for the CSAM detection feature to work at all. There has been some confusion on this point, but Apple is essentially only scanning content that is shared with its cloud system. Critics have pointed out that this makes it exceedingly easy for abusers to evade the dragnet Apple has set up, since all they would have to do to hide CSAM on their phone is opt out of iCloud. Apple said Monday it still believes the system will be effective.
  • Apple is not loading a database of child pornography onto your phone. Another thing the company was forced to clarify on Monday is that it will not actually download real CSAM to your device. Instead, it uses a database of “hashes”: digital fingerprints of specific, known child abuse images, represented as numerical code. That code will be loaded into the phone’s operating system, allowing images uploaded to the cloud to be automatically compared against the hashes in the database. If an image doesn’t match, however, Apple ignores it.
  • iCloud won’t just scan new photos; it plans to scan all the photos currently in its cloud system. In addition to scanning photos uploaded to iCloud in the future, Apple also plans to scan all the photos currently stored on its cloud servers. During Monday’s call with reporters, Apple reiterated that this was the case.
  • Apple claims the iMessage update does not share any information with Apple or with the police. According to Apple, the updated iMessage feature does not share any of your personal information with the company, nor does it alert law enforcement. Instead, it merely alerts a parent if their child has sent or received a texted image that Apple’s algorithm has deemed sexual in nature. “Apple never gains access to communications as a result of this feature in Messages. This feature does not share any information with Apple, NCMEC or law enforcement,” the company said. The feature is only available for accounts set up as families in iCloud, the company says.
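The hash-matching step described above can be sketched in a few lines of code. To be clear, this is a heavily simplified illustration: Apple’s actual system uses a proprietary perceptual hash (“NeuralHash”) and cryptographic protocols designed to keep both the database and non-matching images private, whereas this sketch uses an ordinary exact cryptographic hash and a plain in-memory set. All names and sample values here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Apple's system uses a perceptual "NeuralHash"; an exact SHA-256
    # digest is used here purely to illustrate the matching step.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints for known images (illustrative
# stand-in values only; the real database comes from NCMEC and others).
known_db = {fingerprint(b"known-image-1"), fingerprint(b"known-image-2")}

def matches_known_image(image_bytes: bytes) -> bool:
    # Only membership in the database matters: an image that does not
    # match any known fingerprint is simply ignored.
    return fingerprint(image_bytes) in known_db

print(matches_known_image(b"known-image-1"))   # True: fingerprint is in the database
print(matches_known_image(b"vacation-photo"))  # False: no match, so the image is ignored
```

The key property this models is that the system never needs the offending images themselves on the device, only their numerical fingerprints, and that anything without a database match produces no signal at all.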

Despite those assurances, privacy advocates and security experts are still not terribly impressed, and some are more than a little upset. In particular, on Monday, well-known security researcher Matthew Green posed a hypothetical scenario on Twitter that proved controversial enough to inspire a minor argument between Edward Snowden and former Facebook security chief Alex Stamos in the replies.

So, suffice it to say that a lot of people still have questions. We’re all in fairly unfamiliar, messy territory here. While it’s hard to fault the intent of Apple’s mission, the power of the technology it is deploying has caused alarm, to say the least.


