
iPhone, AI and big data: Here's how Apple plans to protect your privacy

Can Apple take the 'big brother' out of big data?
Written by Steve Ranger, Global News Director

The iPhone will provide facial recognition so users can sort pictures of different people into albums.

Image: CNET

Artificial intelligence and big data are white-hot technologies, but both need to analyse vast amounts of data to work effectively. Now Apple is trying to find out whether it can use both without compromising its tough stance on protecting users' privacy.

At its Worldwide Developers Conference (WWDC) in San Francisco, Apple announced a number of initiatives around machine learning and data analytics.

Apple said it will use a deep learning technology called long short-term memory (LSTM) to let its QuickType keyboard offer more intelligent suggestions during conversations. For example, if a friend asks where you are in a chat, the keyboard can automatically offer up your location from Maps.
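
To give a feel for what an LSTM actually computes, here is a minimal sketch of a single LSTM cell step with scalar state. The weights are made-up numbers and this is nothing like the scale or architecture of Apple's QuickType model, but the gating is the core idea: decide how much old context to keep, how much of the new input to store, and how much of that stored memory to expose.

```swift
import Foundation

// A toy LSTM cell operating on scalars; a real model uses vectors, matrices
// and trained parameters. All weights below are hypothetical.
struct LSTMCell {
    // (input weight, recurrent weight, bias) per gate -- illustrative values only.
    var wf = (0.5, 0.4, 0.1)   // forget gate
    var wi = (0.6, 0.3, 0.0)   // input gate
    var wo = (0.7, 0.2, 0.0)   // output gate
    var wc = (0.9, 0.1, 0.0)   // candidate cell value

    func sigmoid(_ x: Double) -> Double { 1.0 / (1.0 + exp(-x)) }

    // One time step: current input x plus previous (hidden, cell) state
    // produces the new (hidden, cell) state.
    func step(x: Double, h: Double, c: Double) -> (h: Double, c: Double) {
        let f = sigmoid(wf.0 * x + wf.1 * h + wf.2)   // how much old memory to keep
        let i = sigmoid(wi.0 * x + wi.1 * h + wi.2)   // how much new input to write
        let o = sigmoid(wo.0 * x + wo.1 * h + wo.2)   // how much memory to expose
        let cTilde = tanh(wc.0 * x + wc.1 * h + wc.2) // candidate memory content
        let cNew = f * c + i * cTilde                 // long-term (cell) state
        let hNew = o * tanh(cNew)                     // short-term (hidden) state
        return (hNew, cNew)
    }
}

// Usage: feed a short sequence of toy, scalar-encoded tokens through the cell.
let cell = LSTMCell()
var state = (h: 0.0, c: 0.0)
for token in [0.2, 0.8, -0.3] {
    state = cell.step(x: token, h: state.h, c: state.c)
}
print("hidden:", state.h, "cell:", state.c)
```

The long-term cell state is what lets the network carry context across many words of a conversation, which is why LSTMs suit next-word and suggestion tasks.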

It is also using deep learning and computer vision to give the iPhone facial recognition, so users can sort pictures of different people into albums. It's applying the same kind of technology to object and scene recognition as well -- doing 11 billion computations per photo to understand what is in each image -- which can then be used to search the photo library later. Apple said it is also using artificial intelligence to analyse a user's photo library and cluster images by location, people or scene into a new 'Memories' tab.
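
As a rough illustration of on-device image analysis, the sketch below uses Core Image's CIDetector to find faces in a photo entirely on the device. It is not Apple's actual Photos pipeline, and grouping faces into per-person albums would need an additional recognition and clustering step on top of detection; the file path is a placeholder.

```swift
import CoreImage

// Detect faces in an image locally, with no data sent to a server.
func detectFaces(in imageURL: URL) -> [CIFaceFeature] {
    guard let image = CIImage(contentsOf: imageURL) else { return [] }
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])
    let features = detector?.features(in: image) ?? []
    return features.compactMap { $0 as? CIFaceFeature }
}

// Usage: print where each face was found ("photo.jpg" is a placeholder path).
for face in detectFaces(in: URL(fileURLWithPath: "photo.jpg")) {
    print("Face at \(face.bounds)")
}
```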

Many other web services -- like those offered by Google and Facebook -- already offer this sort of functionality, but usually at the price of letting those companies mine your data to target more ads at you. Rather than using a cloud service to do this processing, Apple said the analysis will be done on the handset itself.

Craig Federighi, Apple's senior vice president of software engineering, said at the event: "When it comes to performing advanced deep learning and artificial intelligence analysis of your data we're doing it on-device... keeping your personal data under your control."

Apple has been very explicit about making privacy one of the defining characteristics of its business, championing the use of end-to-end encryption and fiercely resisting requests from the FBI to unlock an iPhone as part of its investigation into the San Bernardino shootings. Federighi noted that Apple's apps, including FaceTime, Messages and HomeKit, all use end-to-end encryption by default, which means that only the sender and the recipient (and not Apple or law enforcement) are able to read any messages.
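
To make the "only the sender and the recipient" idea concrete, here is a minimal sketch of end-to-end encryption using Apple's CryptoKit framework, used here purely for illustration; it is not how iMessage or FaceTime actually implement their protocols. Sender and recipient each derive the same session key from a Curve25519 key agreement, so a server that merely relays the ciphertext never holds anything that can decrypt it.

```swift
import CryptoKit
import Foundation

// Derive a shared symmetric key from my private key and the other party's public key.
func sessionKey(mine: Curve25519.KeyAgreement.PrivateKey,
                theirs: Curve25519.KeyAgreement.PublicKey) throws -> SymmetricKey {
    let secret = try mine.sharedSecretFromKeyAgreement(with: theirs)
    return secret.hkdfDerivedSymmetricKey(using: SHA256.self,
                                          salt: Data(),
                                          sharedInfo: Data(),
                                          outputByteCount: 32)
}

do {
    let sender = Curve25519.KeyAgreement.PrivateKey()
    let recipient = Curve25519.KeyAgreement.PrivateKey()

    // Sender encrypts with a key derived from their private key and the recipient's public key.
    let plaintext = Data("Meet at 6?".utf8)
    let sealed = try AES.GCM.seal(plaintext,
                                  using: sessionKey(mine: sender, theirs: recipient.publicKey))

    // Recipient derives the identical key the other way round and decrypts.
    let opened = try AES.GCM.open(sealed,
                                  using: sessionKey(mine: recipient, theirs: sender.publicKey))
    print(String(decoding: opened, as: UTF8.self))   // "Meet at 6?"
} catch {
    print("crypto error:", error)
}
```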

And while Siri, Maps and News do send data to Apple's servers, Federighi said that "when you do searches of the internet for a route in Maps or search for information in Spotlight we don't build any user profiles" -- unlike many other internet companies, which harvest such data to sell more precisely targeted advertising.

Apple also said at the event that it will use something called 'differential privacy' to allow it to analyse customer data for trends without being able to identify any particular individual: for example, to spot trending words that need to be added to the QuickType keyboard suggestions.

Federighi said differential privacy uses techniques such as hashing, sub-sampling and noise injection "to enable this sort of crowdsourced learning while keeping the information of each individual user completely private".

Starting with iOS 10, Apple will use differential privacy "to help discover the usage patterns of a large number of users without compromising individual privacy". It said that the technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes.

Big databases can often make privacy harder to maintain: even with names or email addresses stripped out, it's still relatively easy to pinpoint data from a specific individual, especially if the data is combined with outside information that allows the hidden details to be reverse-engineered. Differentially private algorithms are designed so that only general trends, and not any particular individual's data, can be pulled out.
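
To make that concrete, here is a minimal sketch of one differential-privacy building block, randomized response, a simple form of noise injection: each device flips its answer at random often enough that no single report reveals anything, yet the aggregate can still be unbiased. The word being counted and the parameters are hypothetical, and Apple's production system layers hashing and sub-sampling on top of noise injection like this.

```swift
import Foundation

// Privacy budget epsilon = ln(3) gives a 75% chance of reporting the truth.
let epsilon = log(3.0)
let keepTruthProbability = exp(epsilon) / (exp(epsilon) + 1.0)

// Each user perturbs their true answer locally, before it ever leaves the device.
func privatizedReport(truth: Bool) -> Bool {
    return Double.random(in: 0..<1) < keepTruthProbability ? truth : !truth
}

// The server only sees noisy reports, but can unbias the aggregate to
// estimate how many users really typed the word.
func estimateTrueCount(reports: [Bool]) -> Double {
    let n = Double(reports.count)
    let yes = Double(reports.filter { $0 }.count)
    let p = keepTruthProbability
    return (yes - (1.0 - p) * n) / (2.0 * p - 1.0)
}

// Simulated usage: 10,000 users, 30% of whom actually typed the trending word.
let truths = (0..<10_000).map { _ in Double.random(in: 0..<1) < 0.3 }
let reports = truths.map(privatizedReport)
print("estimated users:", estimateTrueCount(reports: reports))
```

The estimate lands close to 3,000 on average, even though any individual report is wrong a quarter of the time, which is exactly the trade-off differential privacy formalises.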

Apple's business model is based largely on selling hardware and apps rather than on advertising driven by what users do with those devices, which is how Google makes its money from Android. The focus on privacy is therefore also a competitive advantage, and one that its rival may find harder to follow.

It's an interesting take, but there are still plenty of unanswered questions here, one of the more prosaic being: is all that additional local processing going to kill your iPhone's battery life? And will anyone in the Android camp see this as a good reason to switch to Apple? If they do, we could see more tech companies trying out this new approach -- there are links to more academic research on differential privacy below.

Read more on differential privacy

More on LSTM

More on WWDC
