Let's posit that everything you do is watched -- or at least that it could be. One response, probably the first that occurs to most people, is to stop doing things: retreat to a bunker, seal yourself off, and avoid all contact with the outside world. Since that is unworkable in most cases, in Obfuscation: A User's Guide for Privacy and Protest, Finn Brunton and Helen Nissenbaum consider the alternative: surround yourself with so much noise that watchers cannot distinguish your signal. As the authors admit, this approach only buys you time while your watchers develop counter-tactics. But for many purposes even a few minutes is enough.
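The noise-over-signal idea can be made concrete with a toy sketch. The snippet below, a hypothetical illustration rather than anything from the book, mixes one genuine search query into a shuffled batch of decoys, so that an observer logging the batch cannot tell which query reflects the user's real interest; the decoy vocabulary and function names are invented for this example.

```python
import random

# Invented decoy vocabulary for illustration only.
DECOY_TERMS = [
    "weather forecast", "football scores", "pasta recipes",
    "used cars", "local news", "movie times", "guitar chords",
]

def obfuscate(real_query, n_decoys=6, rng=None):
    """Return the real query hidden among n_decoys decoy queries.

    An observer sees the whole batch; only the user knows which
    entry was genuine. This buys time, not permanent cover.
    """
    rng = rng or random.Random()
    batch = rng.sample(DECOY_TERMS, n_decoys) + [real_query]
    rng.shuffle(batch)  # remove positional clues
    return batch

batch = obfuscate("symptoms of measles", rng=random.Random(42))
print(batch)
```

As the authors would stress, a watcher who profiles the decoy vocabulary over time can start filtering it out, which is exactly why obfuscation is framed as a delaying tactic rather than a cure.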
Brunton is an assistant professor of media, culture, and communication at New York University; Nissenbaum is director of NYU's Information Law Institute and is well-known for her work on privacy. Brunton's name was new to me, but he is the author of the 2013 book Spam: A Shadow History of the Internet, a title we're sorry to have missed.
Not a privacy manual
Obfuscation is a book that requires thought; it's a framework for considering how you might apply strategies to your particular situation rather than a manual for specific tools. Tor is used here as an example of obfuscation, rather than as an end in itself. The specific choices you make depend, the authors stress, on your threat model -- that is, the answers in your particular situation to questions like 'what is at risk?', 'who is watching?', 'what is their goal?', and 'what task needs to be accomplished?'.
Like any technique, obfuscation can be used for good, bad, or both at the same time. So much depends on your point of view. In one scam cited by the authors, a man planning a robbery advertised on Craigslist promising good money for maintenance work, specifying the uniform applicants should wear and the time they should arrive at the bank. Result: on the day of the robbery, the arriving police could not immediately pick out the man matching the witnesses' description from among the 14 similarly dressed men on the scene, giving the robber enough time to escape.
Here's an example Brunton and Nissenbaum do not cite: in 1994, the Church of Scientology flooded the Usenet newsgroup alt.religion.scientology with thousands of approved postings, seeking to drown out the critics -- perhaps the earliest instance of a strategy now used frequently by all and sundry on Twitter, a platform the authors do discuss.
Clashes over the ethics of these tactics typically come down to point of view, say the authors. If you're an outsider disapproving of a foreign government's behaviour, the obfuscation tactics used by a group protesting that government are smart and justifiable. If you're that foreign government, they're the work of terrorists. But despite what government spokespeople touting increased surveillance like to say, everyone has something to hide because, as Malkia Cyril said last month at the Computers, Freedom, and Privacy conference, "You all have lives."