Mar 11, 2014 · 4 minutes

On Saturday, after weeks of condemning Apple for failing to implement basic security standards and preparing to drive deeper into "Surveillance Valley," I bought an iPhone. That might seem hypocritical -- and, in some sense, it is -- but I have good reasons.

I purchased the iPhone knowing that doing so would require placing my trust in an inept company that would have access to my communications, location, and thumbprint. Unless the company reveals that iPhones provide a direct line to the National Security Agency, which I doubt, I think that many of the privacy implications of owning one of Apple's devices are clear.

The same can't be said of the Android smartphone I returned. It allowed every app I downloaded to access a treasure trove of personal information without clearly explaining what exactly they were gathering and why they needed to collect it. Following the removal of App Ops, an experimental feature that let Android users revoke individual permissions, a move condemned by the Electronic Frontier Foundation, using these apps was an all-or-nothing proposition.

But the iPhone offers more granular control over what information apps can access. I have prevented many apps from monitoring my location, sifting through my communications, or peering into my contacts list. I have also flipped the switch that limits the access offered to advertisers, set a passcode on sensitive apps, and avoided reportedly compromised software. Even though I can't trust Apple to keep my device secure, I know that I can place more faith in the company than I would in Google or any of the manufacturers modifying its platform.

Still, the thumbprint scanner gave me pause. As David Sirota explained at NSFWCORP in September, access to biometric information shouldn't be offered lightly:

Sure, it might be no big deal to accept the vulnerabilities of a fingerprint scan if it gave you access to your iPhone and nothing else. In that situation, only your phone would be susceptible to biometric hacks -- and your information would be no less secure than it would be had you used a hackable password for that phone. So sure, right now, you can probably use your new iPhone's fingerprint scanner without much worry.

However, when the success of the iPhone inevitably leads to a future in which lots of different technologies in your life are locked and unlocked by a finite number of biometrics, then far more than your phone is at risk. The scale of such biometric security systems would mean your whole life could be held hostage because the locks and keys have been fundamentally changed. Apple is only the beginning. Other companies will introduce products that gather information about our bodies, and they will probably do so with the promise of added convenience. Anyone worried about having their personal information stolen, or about a compromised thumbprint unlocking their most sensitive data, should be wary of Apple's built-in scanner.

I ended up using the scanner anyway -- not because I trust Apple, and not because I don't understand the risks associated with storing biometric info on a potentially insecure device, but because I don't think that Apple or any other company would build a fingerprint scanner into one of their products and not use it at some point. The capability is there, which means that someone, at some point, will probably flip the switch and collect that information. I'd prefer for it to be done with my consent so I can at least reap the benefits before I curse the day I decided not to burn my fingerprints off before playing the latest "Flappy Bird" clone.

Consider it an extension of security expert Bruce Schneier's metaphor contrasting surveillance by dogs and surveillance by computers:

When you're watched by a dog, you know that what you're doing will go no further than the dog. The dog can't remember the details of what you've done. The dog can't tell anyone else. When you're watched by a computer, that's not true. You might be told that the computer isn't saving a copy of the video, but you have no assurance that that's true. You might be told that the computer won't alert a person if it perceives something of interest, but you can't know if that's true. You do know that the computer is making decisions based on what it receives, and you have no way of confirming that no human being will access that decision.

Apple can't be trusted not to collect my personal information without my consent, to secure my communications, or to protect private data accessed through one of its products. You can usually tell when a dog is about to bite you -- the same can't be said of smartphones and PCs.

Buying an iPhone was a risk. Installing applications that allow me to monitor my daily activity was a risk. Basically, connecting to the Internet in any way is a risk, and the only thing I can do about it is try to be mindful of how I use those technologies and take advantage of the few security and privacy measures afforded to me. The iPhone, in addition to having more apps and fewer problems, offers more control than the Android smartphone I left behind.

[Image via jtjdt]