In 1993, Electronic Frontier Foundation co-founder John Gilmore famously observed that "The Internet perceives censorship as damage, and routes around it." Now it's 2015, and University of Maryland law professor Frank Pasquale has added an addendum: "The same could be said about privacy."
In The Black Box Society: The Secret Algorithms That Control Money and Information, Pasquale explores the myriad ways each of us is turned into a pile of data, profiled, and scored by systems we are often barely aware of. The most obvious, which applies to Britons as well as Americans, is the all-too-familiar credit scoring that everyone from banks to phone companies uses when deciding whether you are worthy of their services. But, as Pasquale documents, scoring algorithms are used in many other contexts, such as deciding who should be placed on no-fly lists.
Because it is the oldest and most pervasive of what Pasquale calls "reputation systems", credit scoring is also the most regulated. Yet regulation, he says, has not improved it. TV advertising and everyday experience lead all of us to internalize the standards of the finance industry, which often profits more from failure than from success: the person who pays off their credit card bill in full every month is far less profitable than the one who keeps a running balance accruing interest. But what happens when such techniques are applied to health, giving each of us a "body score"?
Most of The Black Box Society translates well beyond Pasquale's USA. Granted, in countries with national health services the connections among credit, health, and employment are slightly less sinister: outside the US, a positive test for diabetes is likely to have less of a negative impact on one's credit score or resulting employability, for example. However, the material documenting the way intelligence agencies scoop up and use information from commercial databases should be enough to alarm anyone anywhere. Worse, as security agencies come to rely on large data-driven businesses for data they cannot afford to collect themselves, regulators will be increasingly powerless to intervene. Just as some banks were "too big to fail", Pasquale suggests some data companies will become "too important to surveillance" for governments to take the risk of alienating them. In countries where politicians depend on large corporate donors to fund their re-election campaigns, this may be doubly true.
Pasquale takes particular aim at the secrecy surrounding all these data-driven systems. Society, he argues, is becoming less transparent and less accountable. Where vendors pitch big-data systems as more objective and less prone to prejudice than human judgment, Pasquale sees vectors for laundering discrimination: victims of past financial prejudice, for example, may carry loans at much higher rates of interest, which translate into more late payments, which in turn show up as lower credit scores now. That is bad enough for individuals, but Pasquale argues that on a societal level it invites the kind of "algorithmic failure" and subsequent public bail-outs we saw in 2008.
Pasquale says the book's particular focus on Silicon Valley and Wall Street reflects the blurring of the traditional distinction between market and state, which lies at the heart of the "black box society". As the UK's May 2015 election looms, keeping that distinction might be something to vote for.