
Culture clashes on privacy: Silicon Valley, the government and you

Attitudes to commercial and government information-gathering reveal contrasting levels of trust and very different privacy trade-offs.
Written by Mary Branscombe, Contributor

Try a thought experiment. A large entity is collecting a lot of information about what you like, buy, write or say (and to whom). How do you feel about that?

If it's a commercial company like Google or Facebook, the number of us using the service suggests we feel pretty happy about it.

You might say we're getting a free service in exchange, or that our behaviour is making it better — like Amazon's ability to suggest books you'll like — or that you trust the company to look after your data. If scanning your email to show you better ads gets you a better spam filter, some people view it as a fair exchange.

But if it's the government, and what's on offer in return is allegedly better security? Very much not OK.

For the record, I wrote part of this post in a check-in queue as a TSA agent tackled the threat posed to national security by the 3.5oz jar of raspberry chipotle jelly I forgot to put in checked luggage, by confiscating it. I'm not surprised that the NSA is taking traffic data from US mobile operators — not because I don't care, but because society has been heading down this route since at least 2001. And I'm uncomfortable enough with the information social networks have on me that there are things I routinely lie about, like my age, and I tag myself in random photos to mess with the facial recognition algorithms.

It's interesting how much more we trust a commercial company that builds its business on collecting your data than we trust the government. Do we make different trade-offs for private and state ends? Do we feel safer giving data to private companies because the government can protect us from the consequences by regulation?

The scope of what a government can do with the information is obviously far wider and more disturbing. But corporations already control a wide range of things that used to be the province of government, with levels of regulation and oversight that have signally failed to keep the financial industry in line, for example.

It's an interesting sociological question why we trust private companies so much, but it's not the only relevant culture clash in the whole NSA/Prism situation.

I find it hard to believe in Prism as originally described — not just because the PowerPoint suggests that distributed cloud services have central servers to tap. For another thing, these are geeks and engineers we're talking about. They're not generally fans of government regulation. If they were dragooned into this kind of mass surveillance, someone would have broken ranks and leaked by now.

You can argue about who you believe, but the culture clash between the Silicon Valley mindset and the intelligence community (what the Silicon Valley counterculture used to refer to as the 'military-industrial complex') is still significant.

There was an amusing but telling example at the DefCon conference a couple of years ago when an NSA spokesperson sounded bemused by the fashion sense of his white-hat hacker colleagues at the agency. Their hair colours changed so often he didn't always recognise them and "it's a bonus if they're wearing shoes".

Yes, plenty of ex-hackers work for the security services, but Silicon Valley is full of the developers who don't. The philosophical and cultural differences, and underlying values, go a long way beyond fashion choices — as do the priorities.

If you work for the intelligence services, you don't get to talk about your work for the rest of your working life, and you don't get to publish any research you've done as part of it. Depending on your security clearance, you might not get to publish research on anything else without their approval either.

If you're a fan of patents, for the benefits they bring your company or you personally (at some companies developers get a bonus for each patent they file), you can't file patents. And if you're a fan of open source, well, you certainly can't contribute to open-source projects. You lose out on the sharing and back-and-forth that's so much a part of developer life in Silicon Valley (and to a certain extent, in Redmond, Seattle and Silicon Roundabout as well).

And if you wake up in the middle of the night with an idea about how to solve the programming problem that's been nagging at you for a week? You have to get up and drive to a secure facility before you can try it out because you can't bring your work home, by which time your inspiration might be gone. Sounds trivial? For many developers, that's a deeply frustrating prospect.

Then there are the things that developers and architects care most about. If you're designing a large cloud service, you're not thinking about making it easy for the government to mine. You might not even think about documenting your setup and interfaces until you have the service running well. But you're certainly thinking about performance. Having to scrape through the data and pass it on to the government, or setting up a direct connection that lets the government scan all that content? Most software architects would be howling in protest at the performance impact before they ever consider their stance on civil liberties: they care about shaving milliseconds off response times, and you want them to put in a backdoor that lets you go in and collect data that could randomly slow the system down at any time? No chance.

Besides, having a direct connection to the service isn't necessary. A government can always grab the information from the network operators. It might not be as efficient in terms of access, but it could be more practical and it lets you do a different kind of analysis.

If you're running a massive data gathering system, what are you trying to find out?

Do you want to read through every message, or do you want to find out who is talking to whom — especially when they're using multiple channels? Certain apps tag their packets: VoIP apps, for example, mark packets to request a higher quality of service, which makes them easy to grab. Find out who is in a phone conversation you care about and subpoena just their email messages for more details.
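
To make that concrete, here's a rough sketch of the kind of filtering involved. It's written in Python with the scapy packet library, and it assumes the standard Expedited Forwarding DSCP marking that VoIP traffic commonly uses to request better quality of service; nothing here is Prism-specific, it's just an illustration of how easy tagged packets are to pick out of a capture.

```python
# A rough sketch, not anything from this article: pick call traffic out of a
# capture purely by its QoS marking and log only the metadata (who talks to
# whom), ignoring the content entirely.
from scapy.all import sniff, IP

DSCP_EF = 46  # "Expedited Forwarding" DSCP value, commonly used by VoIP apps

def is_voip_marked(pkt):
    # The DSCP value sits in the top six bits of the IP Type-of-Service byte.
    return IP in pkt and (pkt[IP].tos >> 2) == DSCP_EF

def log_endpoints(pkt):
    # Record only who is talking to whom; the payload is never inspected.
    print(f"{pkt[IP].src} -> {pkt[IP].dst}")

# Needs root privileges; grabs 100 packets from the default interface.
sniff(lfilter=is_voip_marked, prn=log_endpoints, count=100)
```

The point isn't the capture itself but what gets kept: source and destination addresses are enough to build the who-talks-to-whom graph, with no need to store or decode the calls themselves.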

That's much cheaper (if the reported Prism budget is accurate, it would make this the most efficient government IT project ever) and you can get away with storing a lot less data.

It also makes the haystack in which your needle resides much smaller — and that way, NSA operatives spend much less time wading through irrelevant cat pictures. Which might be less fun but far more useful.
