It's high time that app permissions were overhauled

App developers are creating a honeypot of big data and personal information due to the telemetry found in many mobile apps. It's little wonder that the NSA went after it.
Written by Chris Duckett, Contributor

Smurfs references aside, the news at the start of this week that the US National Security Agency (NSA) and the UK Government Communications Headquarters (GCHQ) possessed an arsenal of tools targeting information from leaky apps should not be surprising.

Anyone who has been involved in developing an app, especially a game, for the various social networks or mobile platforms can tell you why the intelligence agencies went after apps — telemetric data.

Successful app makers are sitting upon a veritable treasure trove of user information and behaviour.

Every button you touch, each swipe you make, and most definitely the paths taken to make an in-app purchase are recorded and scrutinised.

The list of reasons is simple enough to understand: Freemium app writers need to push their users towards purchases; design teams want to know whether their latest design is improving user experience; and coding teams would like to know how widespread an issue is across their userbase.

But in the process of collecting this telemetry, an awful lot of data that is of little immediate use to app developers is also collected, and much of this data is personal.

Why should the purveyor of a Scrabble-influenced app bother itself with knowing whether I may or may not be a 32-year-old white female residing in Sydney's eastern suburbs? Given the sheer amount of telemetry being returned from these apps, any decisions on the product's direction are going to be made empirically from the data collected, not from the inferred tastes of a 32-year-old.
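To make that concrete, here is a purely hypothetical sketch, written in Swift, of what a single telemetry event from a freemium game might look like. Every type, field, and value below is invented for illustration and describes no real app's schema, but it shows how the behavioural data a developer actually acts on tends to travel in the same payload as identifying attributes that serve no product decision.

```swift
import Foundation

// Hypothetical schema: no real app's telemetry is being described here.
struct TelemetryEvent: Codable {
    // The behavioural part that design and coding teams act on
    let eventName: String            // e.g. "tap_buy_button"
    let screen: String               // e.g. "store_front"
    let purchaseFunnelStep: Int      // position in the in-app purchase flow
    let timestamp: Date

    // The personal part that rides along simply because it is available
    let deviceId: String             // persistent device identifier
    let approximateLocation: String  // e.g. "Sydney, NSW"
    let ageBracket: String           // e.g. "30-34"
    let gender: String?
}

// One swipe or tap becomes one more row in the developer's data silo.
let event = TelemetryEvent(
    eventName: "tap_buy_button",
    screen: "store_front",
    purchaseFunnelStep: 3,
    timestamp: Date(),
    deviceId: "A1B2-C3D4-E5F6",
    approximateLocation: "Sydney, NSW",
    ageBracket: "30-34",
    gender: "female"
)
let payload = try? JSONEncoder().encode(event)   // shipped off to an analytics backend
```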

Quite often for app makers, the allure of gaining as much information as possible on one's userbase is too strong, and most platforms make it far too easy to obtain.

Apple does the best job of app permissions by ensuring that permissions for services such as location are requested at run time, not at installation time, and that apps will not descend into a fiery pile of rubble and error dialogues should the user deny the permission escalation.
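As an illustration of that pattern, the Swift sketch below requests location access with Apple's CoreLocation framework only when a feature actually needs it, and degrades gracefully on denial. The class name and the UI hook are hypothetical, and the specific calls reflect iOS releases newer than the platforms discussed in this column, so treat it as indicative of the approach rather than a description of any particular SDK.

```swift
import CoreLocation

// Hypothetical feature class; only the CoreLocation calls are real API.
final class NearbyPlayersFeature: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Called only when the user taps "find nearby players", not at install time.
    func findNearbyPlayers() {
        switch manager.authorizationStatus {
        case .notDetermined:
            manager.requestWhenInUseAuthorization()  // triggers the run-time prompt
        case .authorizedWhenInUse, .authorizedAlways:
            manager.requestLocation()
        case .denied, .restricted:
            // No fiery pile of rubble: the feature switches off, the app keeps working.
            showLocationUnavailableMessage()
        @unknown default:
            break
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Use the location for the feature that asked for it, and nothing more.
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        showLocationUnavailableMessage()
    }

    private func showLocationUnavailableMessage() {
        // Hypothetical UI hook: explain that the feature is unavailable without the permission.
    }
}
```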

Android does a fair job of detailing what permissions an app wants, but its all-or-nothing approach to permissions, coupled with the reduced oversight when submitting apps to its Play Store, means that it is far too easy for apps to gain access to information and systems beyond the minimum they require to function properly.

And as for Facebook, so long as an app avoids sending copious amounts of notifications or timeline updates, it might as well be a personal data buffet lunch.

A tweet from Stephen Wilson, principal analyst at Constellation Research, sums up the situation nicely:

"In the furore about #NSA tapping gaming and ad network data, do people question why games and ad co.s have sooo much data to tap?"

Clearly few do, as despite repeated incidents and warnings, people continue to ignore permission dialogs and allow app developers to siphon off more personal data than they rightly should have.

Perhaps the knowledge that intelligence agencies have taken a sudden interest in the personal data honeypots found all over the internet will be the catalyst needed to break the cycle of bad habits that we find ourselves in.

It's one of those instances where people's lower tolerance for governments obtaining their information, compared with private enterprises doing the same, could actually be helpful.

App developers need to realise that they have been on an extended binge session of heavy permission use, and mobile operating system vendors and social networks need to acknowledge that they have been the enablers of this situation.

If the rhetoric from the industry giants, such as likening the US government to an advanced persistent threat, is to hold any water, then action must surely follow, and a user's right to privacy should be paramount.

Deploying HTTPS by default and encrypting datacentre links is a start, but it is a long way from the end of the task.

For users to be properly protected in an age of app-centric usage, the bar needs to be raised for an app to gain access to location services, contact details, or an address book.

Currently, these rate but a mention on a bulleted list of permissions presented to the user at installation, but they need to be treated for what they really are: A privilege escalation. People rightly balk at the idea of HTML5 websites looking to access webcams, but a camera permission for a cloud storage app? Sure, why not, let's just throw in some GPS and address book access to this file uploader while we're at it.

Permissions need to be spelled out for what they actually allow, not only at installation (access to your gallery means that a nefarious app could upload your camera roll while it is running), but also the first time that a permission is invoked.

If a user taps a picture icon and an alert appears asking to grant the app access to the camera, it is unlikely that the app is about to misbehave; but if the user is scrolling through a list of Tumblr posts and the app wants access to the phone's picture gallery, then chances are that something is awry.
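A rough Swift sketch of that "ask at the moment of use" behaviour, using Apple's Photos framework, might look like the following. The helpers openImagePicker and showWhyWeNeedPhotosExplanation are hypothetical placeholders, and the framework calls come from iOS releases newer than this column; the point is simply that the gallery permission is requested only when the user taps an attach-photo button, so the prompt arrives with context.

```swift
import Foundation
import Photos

// Hypothetical handler for an "attach photo" button.
func attachPhotoTapped() {
    switch PHPhotoLibrary.authorizationStatus() {
    case .notDetermined:
        // The user just asked to attach a photo, so the prompt makes sense in context.
        PHPhotoLibrary.requestAuthorization { status in
            DispatchQueue.main.async {
                if status == .authorized || status == .limited {
                    openImagePicker()
                } else {
                    showWhyWeNeedPhotosExplanation()
                }
            }
        }
    case .authorized, .limited:
        openImagePicker()
    case .denied, .restricted:
        // The user has said no; respect that and keep the rest of the app usable.
        showWhyWeNeedPhotosExplanation()
    @unknown default:
        break
    }
}

// Hypothetical UI hooks, left empty for brevity.
func openImagePicker() {}
func showWhyWeNeedPhotosExplanation() {}
```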

The perfect scenario would be not only the ability to grant an app this sort of granular permission, but also to allow the app one-time use of a permission and, should the user need it, the ability to quickly revoke any permissions already given.

On Android, a user is able to decide whether to open a web link in Chrome one time and Firefox another, but any permissions that these apps are granted last until the app is uninstalled.

Putting a little more control on the way apps behave could go a long way, and it is a chance for the industry to change itself before government-mandated regulation rolls in. Europe has already taken a look at the cookies that analytics relies upon, and the UN voted last month that online privacy rights should mirror those found offline. A couple more user data leaks, and who knows what sort of knee-jerk data protection laws could be headed to a jurisdiction near you, hitting multinationals and startups alike.

Now that we know intelligence agencies are snooping around big data silos, there are two ways to remedy the situation: either the vendors and app creators get together and work out how to address these issues, or they educate users about the privacy impact that their use of apps entails.

Given what we know about users' behaviour, I hope that we start that conversation on improving user privacy soon.
