
Developers, ask your users about data privacy

The idea that the internet generations don't care about privacy is a myth — so talk to them and find out the truth before collecting their personal data.
Written by Stilgherrian, Contributor

Earlier this month, McAfee's chief privacy officer Michelle Dennedy told me a remarkable story from Startupland that shouldn't have been remarkable at all. That contradiction points to a serious shortfall of ethics right across the startup industry. Yes, another one.

Dennedy's 17-year-old nephew Traver was doing a summer internship with a smartphone development company, and wanted to build an app to support his long-distance running. He faced the usual problems of any garage-level startup. How would he do privacy and security? They're hard problems, and potentially expensive.

The solution was to clarify, up front, how the app would handle privacy.

"We said, 'What would you like to do? What would you build if you built an app?' And he said, 'I'm a long-distance runner, and I'd like to keep in contact with my teammates, and I'd like my coach to track me.' And I said, 'OK, who don't you want to see this data?' 'Well, the other team'," Dennedy told ZDNet.

The next step was to work that into the app's engineering specifications — Unified Modelling Language (UML) diagrams, business association documentation, metadata modelling, and all that software engineering stuff.

"What we've done is architect out a system using who, what, why, when, where, and how for an app," Dennedy said. The result was a 10-page document.

"We didn't code it in two days, but it was two days and maybe a pizza as an investment, and we have something that has a data system of record. We understand the controls that need to be put into place, and have some ideas of how you control them, and some different strategies about how to keep the bad guys out and the good guys in. That's a 17-year-old kid and a pizza."

That right there, ladies and gentlemen, is an indictment of the entire industry.

An app developer asking potential users how they want the app to work shouldn't be a fresh and interesting story. It should be perfectly normal, the way things are always done.

Everyone who collects personal data always says that their users' privacy is their number one concern. But if they haven't even bothered to ask users how they want privacy to work, let alone worked those privacy issues into their software development processes, then their "concern" is a lie.

That right there, ladies and gentlemen, is the serious shortfall of ethics I'm talking about.

Dennedy, who's been working in privacy for "more years than I should admit" — much of it with young people — reckons many ideas about privacy are wrong.

"The myth of old was that people of the younger generations don't care about privacy, the middle people are just kinda busy and agnostic, and maybe the older people care, but they're just so scared to get on email they can't do anything anyway," she said.

"Kids in particular really care about data privacy. They may not call it that, and they're certainly willing to share all sorts of stuff that we adults are not really excited about them sharing. But they don't have mortgages, and they don't own property, so saying 'Mom and dad aren't home, I'm having a party' doesn't feel like a safety issue to them, it seems social. But they will never tell you who their crush is."

Somehow, Startupland seems to have forgotten that teenagers have one of the best motivations in the world for maintaining their privacy: they want to keep stuff from their parents. They also need to create a private space in which to experiment with their identity as they grow from child into adult.

Those needs haven't changed with the invention of the internet, but we're seeing the creation of new cultural norms of behaviour to reflect the new digital social environments. That takes time, but as those norms emerge, their social rules and restrictions and sharing structures can be built into the infrastructure.

Dennedy is co-author, with family members Jonathan Fox and Thomas Finneran, of The Privacy Engineer's Manifesto: Getting from Policy to Code to QA to Value, the ebook version of which is free. The story of the long-distance running app forms chapter eight.

Now Silicon Valley does tend to try solving social problems by throwing technology at them — when all you have is a hammer, as they say — but Dennedy says that "privacy engineering" isn't like that.

"Oftentimes what you find is that [privacy] is the realm of the lawyer, or the risk manager if you're lucky, or maybe the odd finance guy will wander into the cave every now and again. Then you go and you talk to the people who are slinging code, or buying services or software or techniques, or going to the cloud and dreaming up technical stuff, and they say to you, 'Kinda leave us in our cave over here, and go write your little policies, they're so cute, and then maybe at the end of it — maybe — you get to write some terms and conditions to get me out of my obligations.'"

You recognise that scenario, right? It's another of those ethical shortfalls, where the rules that society has agreed to operate by are seen as just another inconvenience to be avoided.

Privacy engineering is the process of turning various policies, from privacy laws to the business's plans for its data, into something that programmers can work with — indeed, something they'll want to work with, because it's now an engineering problem. It's also something that quality assurance (QA) processes can deal with.
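
That last point can also be made concrete. If the privacy rules exist as an explicit artefact like the sketch above, QA can assert them on every build. A hypothetical test in Python might look like this, assuming the earlier sketch were saved as a module called running_app_privacy; the names are assumptions, not an API from the book.

# Hypothetical QA check: the requirement "the other team must not see this data"
# becomes a test that fails the build if anyone widens the visibility rules.
from running_app_privacy import RECORD_OF_DATA, Role, may_view

def test_rival_team_sees_nothing():
    for item in RECORD_OF_DATA:
        assert not may_view(Role.RIVAL, item), f"rival role can view {item.name}"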

I've previously called for more actual engineering in Startupland, please. Given that Startupland runs on personal data, applying engineering to privacy seems a great place to start — as does finding out what users actually want.

Remember, most people don't actually think like a 20-something programmer.
