
The iPhone security model is broken ... can it be fixed?

User data is the new gold rush, and it's so easy to find and mine.
Written by Adrian Kingsley-Hughes, Senior Contributing Editor

I like my iPhone. A lot. But I've now gotten to the point where I feel that the security model Apple chose to implement in iOS is broken, and it's hard to see how it can be fixed in any useful or meaningful way.

How is the security model broken? Well, it's broken because the apps you install on your iDevices can access your address book and send that data back to company servers without you knowing it's going on. Mobile social network Path was caught doing just that, and I'm sure it's not the only one that's been up to this trick. User data is the new gold rush, and it's so easy to find and mine.

Note: Mac OS X offers developers easy access to the address book, and Apple hasn't done anything about this since the issue surfaced in 2006.

Now I'm going to assume for a moment that there are legitimate reasons for accessing users' address books and copying them, but what cannot be justified is doing this without user consent (and by consent I don't mean a small snippet of legalese buried in an ocean of legalese). Harvesting data without clearly informing the user of what's going to be done and what will happen to that data is at best a very bad business practice, and at worst it's malware-like behavior and a massive breach of trust.

So what should happen? Well, I have several ideas, but I must admit that I'm not in love with any of them.

  • Apple could ban apps that access the address book. This would be easy to do, as Apple controls which APIs developers can use and checks for developers breaking the rules. But while Apple could do this easily, it's not an ideal solution, because some apps have legitimate reasons for accessing this data.
  • Apple could restrict how much data apps can access. The problem with this is that it doesn't give users much control. It's too black-box and too opaque.
  • Apple could put policies in place to force apps to use encryption when transmitting the data, but personally I'm more concerned about what happens to that data after transmission than during transmission.
  • Apple could put a mechanism in place similar to that for Location Services, where apps would have to ask permission and users could revoke that permission from the app later. Of all the options this seems like the best, but it does carry a danger: it could eventually mean that iOS users are faced with endless dialog boxes and a torrent of questions each time they install apps. This sort of security hasn't worked on any platform previously, and I'm not convinced that it would work on iOS.
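To make the last option concrete, here's a minimal sketch (in Python, with hypothetical names; this is not Apple's actual API) of the Location Services-style model described above: an app must ask before touching the address book, the user answers once, and that grant can be revoked later from a settings screen.

```python
class PermissionDenied(Exception):
    """Raised when an app tries to read data without a standing grant."""
    pass


class PermissionStore:
    """Tracks per-app consent for a protected resource (e.g. contacts)."""

    def __init__(self):
        self._grants = {}  # app_id -> bool (user's answer)

    def request(self, app_id, prompt_user):
        # Only prompt the first time; remember the answer afterwards,
        # so users aren't drowned in repeated dialog boxes.
        if app_id not in self._grants:
            self._grants[app_id] = bool(prompt_user(app_id))
        return self._grants[app_id]

    def revoke(self, app_id):
        # The user can withdraw consent later from a settings panel.
        self._grants[app_id] = False

    def is_granted(self, app_id):
        # Default-deny: no recorded answer means no access.
        return self._grants.get(app_id, False)


def read_contacts(store, app_id, contacts):
    """Gatekeeper: the OS hands over data only if consent is on file."""
    if not store.is_granted(app_id):
        raise PermissionDenied(f"{app_id} may not read contacts")
    return list(contacts)


# Usage: the user taps "Allow" once, then later revokes access.
store = PermissionStore()
store.request("com.example.socialapp", prompt_user=lambda app: True)
print(read_contacts(store, "com.example.socialapp", ["Alice", "Bob"]))
store.revoke("com.example.socialapp")
# Any further read now raises PermissionDenied.
```

The key design choices are default-deny (an app with no recorded answer gets nothing) and revocability (consent isn't a one-time gate but a standing setting the user can flip off) -- which is exactly what distinguishes this option from a buried line of legalese.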

As I said, none of these solutions are ideal, but in light of recent developments, it's clear that Apple can't just allow apps to have unfettered access to data stored on iOS devices. We're already sliding down a very slippery slope.

The best option in my opinion is to put the users in charge, but I see there being a giant gulf between giving the users choice, and the users making an informed choice. On platforms like Windows (and even on Mac) throwing endless dialog boxes at users quickly creates a fatigue where people don't really read (or even pay attention to) the information being put in front of them. Security turns from being a useful feature into something that's just standing between them and doing what they want.

These aren't new problems, but they're exaggerated by post-PC devices. In the PC world there's a huge amount of diversity when it comes to software. People could have their contacts in one (or many) of dozens of places (Outlook, Thunderbird, in the cloud, in a Notepad file ...), but on a device like the iPhone there's one place ... the Contacts app. Also, as our devices become more personal, they will contain more and more personal data (names, email addresses, phone numbers, postal addresses, and so on).

All this makes the data easy pickings ... and this data is valuable stuff, so there are people who will grab it.

The move to the post-PC world is putting our personal data at risk, and no one has come up with a solution that protects us from the bad guys.
