"To help make it easier to understand what an app will have access to, the Play Store has recently made improvements to how permissions are displayed," writes Google, explaining its new "simplified" permissions model for Android applications. "This information can help you make an informed decision more easily on whether you would like to install the app."
Except that it doesn't. It's broken. It's more like a dumbing down than a simplification. And it deliberately introduces a way for apps to change their permissions without notifying the user and gaining specific consent. It's just wrong.
Google's heart, if it has one, may have been in the right place when they started work on this. App permissions are notoriously hard to understand, even for technically aware users — something that Facebook seems to have turned into a business model. If users are confronted with a long list of options to review, they won't read it, let alone understand it — which means they can't really be said to be giving informed consent.
Google has therefore created 13 "permissions groups", one for each broad area of device functionality — location, SMS, camera and microphone, device ID and call information, and so on. While you can drill down into each category and see which of the more fine-grained permissions an app is requesting, Google clearly intends that most users will just look at the top-level permissions groups and make their decision on that basis.
And that's the problem.
"Once you've allowed an app to access a permissions group, the app may use any of the individual permissions that are part of that group. You won't need to manually approve individual permissions updates that belong to a permissions group you've already accepted," Google writes.
So great. It's now possible for an app's permissions to change during an automatic update without the user being informed.
Let's say an app asks for your coarse network-based location data when first installed, and you install it on that basis. Later, it could up that to access your fine-grained GPS-based location without you being told.
Or an app might initially ask for permission to receive SMS messages, but later up that to being able to edit and send them invisibly.
They're just two quick examples. Please don your tinfoil hat and spend a few moments thinking up more ways to cause mischief by upping an app's permissions after the fact.
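In manifest terms, both of those escalations stay inside a single permissions group. A sketch of what such an update might quietly add — the permission names are real Android permissions, but the before/after framing is my hypothetical:

```xml
<!-- Hypothetical update escalating within already-approved groups. -->

<!-- Location group: installed with coarse location only... -->
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<!-- ...the update adds fine-grained GPS location, with no new prompt. -->
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />

<!-- SMS group: installed with receive only... -->
<uses-permission android:name="android.permission.RECEIVE_SMS" />
<!-- ...the update adds the ability to send messages, with no new prompt. -->
<uses-permission android:name="android.permission.SEND_SMS" />
```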
How about this idea. You start by creating some stupid novelty app. Something like the embarrassingly stupid Yo, which does nothing more than send the word "Yo" to other people with the app installed. Somehow Yo has already scored $1.2 million in investment. You ride the viral wave, then wait for people to get bored and forget about it. But the app is still there, running some background communication process. It gradually ups its permissions in successive automatic updates. Six months later, you've got an instant mobile botnet.
However, Google says it has systems to scan apps and "evaluate some of the permission requests that were previously displayed in the primary permissions screen, flagging and removing apps with potentially harmful code". So presumably they work perfectly and this scenario is just my fantasy, right?
Now of course you could always turn off automatic updates, and manually review every app's permissions at every update. But if the whole point of this change was truly to make things simpler — and I have no reason to doubt Google's word — then it's failed. The alleged simplicity requires you to give up informed consent.
Google's problem is that they're looking at app permissions from the app developers' point of view, in terms of how things work under the hood, not that of the users and their interactions with the world.
There's a permissions group called "Identity", for example, which contains access to user accounts and their contact cards. But over in the "Device ID & call information" group, there's access to device IDs (such as the IMEI) and the phone number. I'd put money on the average user considering them to be identity information too, given that smart devices are personal devices, no matter where they sit in the operating system architecture.
Which brings me to one of my pet gripes.
This entire model for app permissions is useless. It's based on an old-fashioned systems administrators' view of file systems, not on the data and its uses.
This model assumes, for instance, that being able to read data from a device's storage is less dangerous than writing to storage. But from a privacy standpoint, it's often the other way around.
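Android's own storage permissions make the point. Writing to external storage has required a declared permission since the early API levels, but a read permission only appeared in Android 4.1 — for years before that, any app could read your storage without asking at all. Both declarations, as they'd appear in a manifest (real permission names):

```xml
<!-- Write access: a declared permission from the start. -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Read access: only became a declarable permission in Android 4.1;
     before that, reading external storage needed no permission. -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
```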
I don't mind a social camera app taking a picture now, when I press the button, and then sharing it to the network and writing it to storage. But I'd rather it didn't rummage through the other photos on my phone. I don't mind an app accepting an inbound message and adding an entry to my calendar, but it has no business looking at what's already there, and certainly no business exfiltrating that data over the network. And I don't mind an app looking up the phone number of a specific contact when I want to message them, but let's not do a Snapchat and steal my entire address book, OK?
There's also a problem with the all-or-nothing approach to app permission requests. If I want to use just some of the app's functionality, I still have to give it everything it wants — or do without.
Apple's iOS lets apps ask me for specific permissions as they're needed. Can this app have access to your contacts? Can it know your rough location? I answer as I wish, and developers who are used to writing in this environment are encouraged to provide a graceful degradation of functionality if I refuse.
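The pattern Apple encourages looks roughly like this: ask for the narrow permission at the moment the feature needs it, and keep the app useful if the user says no. A minimal sketch using iOS's real CoreLocation authorization API — the fallback behaviour and class name are my invention:

```swift
import CoreLocation

final class NearbyFeature: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        // Ask only when the user invokes the feature,
        // and only for while-in-use access, not "always".
        manager.requestWhenInUseAuthorization()
    }

    func locationManager(_ manager: CLLocationManager,
                         didChangeAuthorization status: CLAuthorizationStatus) {
        switch status {
        case .authorizedWhenInUse, .authorizedAlways:
            manager.startUpdatingLocation()  // full feature
        case .denied, .restricted:
            showManualCityPicker()           // graceful degradation: app still works
        default:
            break                            // .notDetermined: wait for the user's answer
        }
    }

    private func showManualCityPicker() {
        // Hypothetical fallback UI: let the user type a location instead.
    }
}
```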
Android's all-or-nothing approach might have been suitable in the industrial age, when the only way to scale up was to make everything identical. But it's time for this dull approach and the privacy-dull applications it produces to go.
"Google continues to actively look for new ways to improve how permissions work for users," the company writes. Excellent idea. How about talking to some users outside the confines of the app developer miniverse, find out how they think about privacy, do a bit of privacy engineering, and work that back into Android?