Sep 18, 2014 · 4 minutes

In the wake of criticism related to Labor Day Weekend's celebrity photo leak, Apple has published a detailed look at its built-in security tools, the government's requests for customer data, and its thinking on how much information should be shared with other parties.

The information contained within is much more detailed than what was offered during Tim Cook's interview with Charlie Rose, and the company's efforts to increase awareness of security issues and of governmental limitations on so-called "transparency reports" are commendable. But it seems like Apple is still offering consumers a rose-tinted look at the efficacy of its security measures.

Consider the company's claims that it "doesn’t scan your communications, and we wouldn’t be able to comply with a wiretap order even if we wanted to." That would be great... if its mobile operating system developed specifically for vehicles didn't include a handy feature that can determine where you might want to travel based on your personal information. As I wrote when that vehicle-specific software was first announced in March,

Apple is driving deeper into Surveillance Valley. The company yesterday announced CarPlay, a tool that allows drivers to interact with its mapping, messaging, and music services via their car’s built-in controls. The tool’s flagship feature is its ability to “predict where you most likely want to go using addresses from your email, text messages, contacts, and calendars.”

Basically, this means that Apple will sift through your digital communications and personal data to save you the agony of entering an address. In doing so, the company might just create a service that will allow it to escape from cartographic purgatory — all it has to do is convince you that your privacy is less important than your laziness. So it's clear that Apple is either lying about its ability to scan and use its customers' data or it lied about one of the features of its vehicle platform. It can't have it both ways.

There are also concerns about the encryption Apple uses to secure iMessages. Though the company says it couldn't hand over its customers' messages even if it wanted to, researchers have cast doubt on the encryption methods used to secure those communications, as the Guardian first reported in October 2013. Apple has disputed those claims, but given that encrypting messages is apparently much harder than it seems, some caution is needed.
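The researchers' core worry is structural: because Apple runs the directory that hands out recipients' public keys, a sender has no independent way to verify which key they're actually encrypting to. The toy sketch below illustrates that concern; the directory, names, and string-based "encryption" are entirely illustrative, not Apple's actual protocol or API.

```python
# Toy model of a centrally operated key directory (all names are hypothetical).
directory = {"alice": "alice-public-key"}

def send(recipient: str) -> str:
    """The sender encrypts to whatever public key the directory returns --
    there is no out-of-band fingerprint check to catch a substitution."""
    return f"encrypted-to:{directory[recipient]}"

# Honest case: the message is encrypted to Alice's real key.
msg = send("alice")
print(msg)  # encrypted-to:alice-public-key

# If the directory operator is compromised or compelled, it can quietly
# swap in another key, and the sender's client behaves identically.
directory["alice"] = "interceptor-public-key"
intercepted = send("alice")
print(intercepted)  # encrypted-to:interceptor-public-key
```

The point isn't that the ciphertext is weak; it's that whoever controls key distribution can redirect "end-to-end" encrypted traffic without either party noticing.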

The company says that it's addressed those problems by ensuring it doesn't have the key used to decrypt its customers' personal information, but it can still provide the government with data backed up to its iCloud service. When the option of backing data up to iCloud is among the first things someone sees when they purchase an iPhone or update their devices to iOS 8, that small loophole suddenly seems much bigger.
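The distinction driving that loophole can be sketched in a few lines: in a true end-to-end model the provider stores only ciphertext, while in an escrowed-backup model the provider holds a key alongside the data and can therefore comply with a request. This is a minimal illustration, not Apple's actual implementation; the toy XOR cipher and all variable names are assumptions for demonstration.

```python
import hashlib
import secrets

def toy_cipher(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR against a SHA-256-derived keystream.
    For illustration only -- do not use for real cryptography."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    # XOR is its own inverse, so this both encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, stream))

message = b"meet at noon"

# End-to-end model: the key lives only on the user's devices.
device_key = secrets.token_bytes(32)
provider_stores = {"blob": toy_cipher(device_key, message)}  # no key here
# A data request yields only unreadable ciphertext.

# Backup model: a key is escrowed next to the backup blob.
backup_key = secrets.token_bytes(32)
provider_escrow = {"blob": toy_cipher(backup_key, message), "key": backup_key}
# Now the provider *can* decrypt on demand:
recovered = toy_cipher(provider_escrow["key"], provider_escrow["blob"])
print(recovered)  # b'meet at noon'
```

The same data is "encrypted" in both cases; what changes is who can produce the plaintext, which is exactly why an on-by-default iCloud backup undercuts a no-wiretap promise.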

Apple also trumpets its thumbprint scanner, which has been expanded for use in everything from its App Store to individual applications and the upcoming Apple Pay service, as an extra layer of security between would-be hackers and a consumer's iPhone. That's true now, but it might not be seen as such a boon in the future, as David Sirota explained at NSFWCORP:

Think about it in practical terms. Whereas in today's password-based system you can protect yourself after a security breach with a simple password change, in tomorrow's biometric-based system, you have far fewer - if any - ways to protect yourself after a security breach. That's because you cannot so easily change your fingers, your eyes or your face. They are basically permanent. Yes, it's true - security-wise, those biological characteristics may (and I stress "may") be less vulnerable to a hack than a password. But if and when they are hacked in a society reorganized around biometric security systems, those systems allow for far less damage control than does a password-based system. In effect, your physical identity is stolen - and you can't get it back.

In light of this, there's a simple question: is avoiding the few seconds it takes to punch in a code really worth the risk? I'd say two seconds and some finger flicks isn't actually a hassle, whether it's punching in an iPhone passcode, typing in a login password, or entering an ATM PIN. More importantly, it sure isn't a very high price to keep your body from becoming someone else's permanent skeleton key to your whole life.

It's heartening to see Apple take such a comprehensive approach to informing its customers about its privacy policies and security practices. The new section of its website offers more information than the transparency reports most companies have relied on, and Cook has taken a strong public stance against the government's attempts to gather so much personal data. The updates to iOS 8, which are supposed to make it next to impossible for data to be compromised, are also welcome, even though Apple's track record with even basic security features is mixed.

But this is still an incomplete look at what Apple does to protect its customers' data. Until these issues are clarified, and until the company learns to respond to fears about its iffy security tools before making its biggest product announcement since the original iPad in 2010, Apple should still be questioned and criticized. This is our data, and until we take an active role in holding companies responsible for their actions, we're never going to be secure.

[illustration by Brad Jonas for Pando]