Apple’s privacy mythology doesn’t match reality

In 2021, Apple declared itself the world’s superhero of privacy. Its leadership insists that “privacy has been key to our work … from the very beginning” and that it is a “basic human right.” Its new advertising even boasts that privacy and the iPhone are one and the same. Last spring’s release of iOS 14.5, which lets users refuse to let apps track their online activity, showed something important: people choose privacy when they don’t have to struggle for control of their own data. Only about 25 percent of users now consent to tracking, where previously nearly 75 percent acquiesced to having their information feed targeted advertising. And with more privacy protections planned for iOS 15, due out next month, Apple continues to brand itself as a potential check on the growth of Facebook, the poster child of surveillance capitalism. Unfortunately, Apple’s privacy promises don’t show the full picture.

The company’s most alarming privacy failure may also be one of its most profitable offerings: iCloud. For years, the cloud storage service has helped lock hundreds of millions of Apple users into its ecosystem, an internet-enabled extension of your hard drive designed to effortlessly offload photos, movies, and other files to an unseen backup drive. Unfortunately, iCloud makes it almost as easy for the police to access all of those files.

Apple has insisted in the past that it will not weaken the security of its devices by installing a backdoor. But on older devices, that door is already built in. According to Apple’s law enforcement manual, anyone running iOS 7 or earlier is out of luck if their phone falls into the hands of police or ICE: with a simple warrant, Apple will unlock the phone. This may seem like par for the course in Silicon Valley, but most tech giants’ executives have not previously declared that unlocking orders for their devices would compromise “the data security of hundreds of millions of law-abiding people … by setting a dangerous precedent that threatens everyone’s civil liberties.” The unlocking service is possible only because of security flaws that were eventually fixed in later versions of the operating system.

Since 2015, Apple has drawn the ire of the FBI and the Department of Justice with each new round of security enhancements, building devices too secure for even Apple to break into. But the dirty little secret behind nearly all of Apple’s privacy promises is that a back door has been there all along. Whether it’s iPhone data from Apple’s latest devices or iMessage data the company constantly touts as “end-to-end encrypted,” all of it is vulnerable when backed up to iCloud.

Apple’s seemingly simple design decision to hold the encryption keys for iCloud has created complex consequences. The company doesn’t do this for your iPhone (despite government requests). It doesn’t do it for iMessage. Some of the benefits of making an exception for iCloud are clear: if Apple didn’t keep the keys, account holders who forgot their passwords would be out of luck, since truly secure cloud storage means the company itself is no more able to reset your password than a random attacker would be. But retaining that power also gives Apple the chilling ability to hand over your entire iCloud backup when ordered to.

iCloud data goes beyond photos and files; it also includes location data, such as from Find My or AirTags, Apple’s controversial new tracking devices. With a single court order, all of your Apple devices could be turned against you, transformed into a surveillance system. Apple could fix this, of course, and plenty of companies already offer secure file-sharing platforms. The Swiss firm Tresorit provides true end-to-end encryption for its cloud service: its users still see their files uploaded to the cloud in real time and synchronized across devices, but the encryption keys are held by the users, not by Tresorit. That means users who forget their password lose their files too. Yet as long as a service provider retains the power to recover or change your password, it retains the power to hand your information over to the police.
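To make that trade-off concrete, here is a minimal sketch in Python, using the widely available cryptography library, of client-side encryption in the style Tresorit describes. It is an illustrative toy, not Tresorit’s or Apple’s actual implementation: the key is derived from the user’s password on the user’s own device, so the provider only ever stores ciphertext it cannot open.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(password: str, salt: bytes) -> bytes:
    """Derive a symmetric key from the user's password on the *client*.

    The provider never sees the password or the key, only ciphertext.
    """
    kdf = PBKDF2HMAC(
        algorithm=hashes.SHA256(),
        length=32,
        salt=salt,
        iterations=600_000,
    )
    return base64.urlsafe_b64encode(kdf.derive(password.encode()))


def encrypt_for_upload(password: str, plaintext: bytes) -> tuple[bytes, bytes]:
    """Encrypt a file locally before it ever leaves the device."""
    salt = os.urandom(16)
    token = Fernet(derive_key(password, salt)).encrypt(plaintext)
    return salt, token  # the provider stores both, but can decrypt neither


def decrypt_after_download(password: str, salt: bytes, token: bytes) -> bytes:
    """Only someone who knows the password can recover the file.

    A forgotten password means the data is gone: the provider has no
    master key to fall back on, and nothing useful to hand to police.
    """
    return Fernet(derive_key(password, salt)).decrypt(token)


if __name__ == "__main__":
    salt, blob = encrypt_for_upload("correct horse battery staple", b"vacation photos")
    print(decrypt_after_download("correct horse battery staple", salt, blob))
```

The design choice is stark: because there is no server-side master key, there is nothing for the provider to reset, nothing for a hacker to steal in bulk, and nothing to surrender under a court order. Apple chose the opposite trade for iCloud.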

And the threat is only growing. Under its new suite of content moderation tools, Apple will scan iCloud uploads and iMessage communications for suspected child sexual abuse material (CSAM). Where the company once searched only photos uploaded to iCloud for suspected CSAM, the new tools can turn any photo or text you send or receive against you. Suppressing CSAM is a noble goal, but the consequences could be catastrophic for those wrongly accused when the AI fails. And even when the software works as intended, it could be deadly. As Harvard Law School instructor Kendra Albert noted on Twitter, these “features are going to get queer kids kicked out of their homes, beaten, or worse.” Software launched in the name of “child safety” could become a lethal threat to LGBTQ+ children outed to homophobic and transphobic parents. Just as frightening, tools used today to track CSAM could be retrained tomorrow to flag political and religious content.
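For readers wondering what “scanning” means mechanically, below is a deliberately simplified Python sketch of matching images against a database of known fingerprints. Apple’s actual design is far more elaborate (a perceptual “NeuralHash” compared on-device through a cryptographic matching protocol), so the plain SHA-256, the placeholder database, and the threshold value here are illustrative assumptions only. The point is that the system’s judgment reduces to fuzzy fingerprint matches, which is exactly where false accusations can creep in.

```python
import hashlib
from pathlib import Path

# Toy stand-in for a database of known-bad image fingerprints. Apple's real
# system uses perceptual "NeuralHash" values, not SHA-256; the single entry
# below is just the SHA-256 of an empty file, used as a placeholder.
KNOWN_HASH_DB: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

# Illustrative threshold: accounts are flagged only after multiple matches.
MATCH_THRESHOLD = 30


def fingerprint(image_path: Path) -> str:
    """Hash the raw bytes of an image.

    A perceptual hash would instead tolerate resizing and re-encoding --
    and that fuzziness is precisely what makes collisions and false
    positives possible in real systems.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()


def scan_library(image_paths: list[Path]) -> bool:
    """Return True if enough images match the database to flag the account."""
    matches = sum(1 for p in image_paths if fingerprint(p) in KNOWN_HASH_DB)
    return matches >= MATCH_THRESHOLD
```

The gap between this toy and the real thing is the danger: a perceptual hash must tolerate cropping and re-encoding, and anything fuzzy enough to do that can also be fooled, and the hash list itself can be quietly swapped for one that targets different content entirely.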




