Tools and Weapons, book review: Tech companies, governments and smart regulation
Technology is increasingly behind both global crises and breakthroughs, obliging the companies that create it to take responsibility for security, privacy and the impact of social networks and AI. That needs to go hand-in-hand with intelligent regulation, says Microsoft.
Why are technology companies talking so much these days about political and social issues like climate change, immigration and LGBTQIA rights? It's not just because millennial employees expect that from their employers, although it's clear they increasingly want the companies they work for to act on their concerns about the world as well as their own working conditions. It's because, as Microsoft president and chief legal officer Brad Smith puts it in his new book Tools and Weapons: The Promise and the Peril of the Digital Age, technology changes the world. "When your technology changes the world, you have a responsibility to help address the world that you have helped create."
For Microsoft, it's not just a sense of responsibility for the impact its technology has on the world. It's also the belief that the lessons it learned from the DoJ and EU investigations and the consent decrees that followed have something to teach everyone involved in creating technology: that neither government nor technology vendors can address the impact of technology alone and that the future of regulation needs to involve both.
The tension between government -- often in the form of law enforcement -- and the technology industry runs right through the book. Technology, the authors (Brad Smith and Carol Ann Browne) point out, is where the balance of privacy and safety comes to a head, and they jump straight into the deep end by starting with the Snowden revelations about PRISM and the tapping of digital communications worldwide. Privacy isn't dying a quiet death, they say: data is more like air than oil (it's everywhere), but it's also like nuclear waste -- the Cambridge Analytica scandal is termed the Three Mile Island of privacy.
Microsoft's strong backing for GDPR turns out to be extremely pragmatic. The company had long wanted to have a unified information architecture but couldn't get engineers in different divisions to agree: GDPR forced that, and Microsoft was keen to see Canada and California align with GDPR so it wouldn't have to support multiple standards specifying very similar things in slightly different ways.
There's also a lesson in how Microsoft approaches the realpolitik of political donations: the company supported the proposition that created the California privacy law (with the caveat of wanting it not to diverge from GDPR) and sent privacy experts to help with the drafting of the eventual bill. But Microsoft also donated $150,000 to the campaign opposing it "to stay connected with the rest of the industry", justifying this on the grounds that it was a tiny fraction of the $50 million the campaign needed to raise.
Microsoft clearly sees value in staying engaged and participating rather than withdrawing, calling for regulation of technologies like facial recognition rather than voluntary boycotts by individual companies. "No one elected us...it's undemocratic to want us to police the government," Smith points out.
Chapters on rural broadband and immigration are both hopeful and disturbing. A third of America probably doesn't have broadband -- the counting is done so badly that it's impossible to be sure, but Microsoft's initiatives with local carriers to use White Spaces spectrum freed up from analogue TV are starting to bear fruit. The discussion of immigration, diversity and inclusion runs from controversial governmental decisions to coding initiatives and Microsoft's push for better investment in housing and transport in Seattle.
The book is packed with detail and intelligent argument, but this is one of the chapters where there's room for a little more self-examination. There's no discussion of the challenges Microsoft faces with diversity (despite strong and visible commitments to improving it, the company faces a class action suit, and its first female technical fellow and corporate vice president recently left), although the authors do at least conclude that there's more work to do. And while books by technology leaders usually have a ghost writer, they're rarely acknowledged as a co-author, so it's nice to see Carol Ann Browne get the same billing as Smith.
Promises, threats and expectations
Tools and Weapons is a detailed look at both the promise and the threat of technology by someone who actually understands both: there's no airy technological optimism, but equally no trivialisation of the downsides of technology -- whether that's yesterday's technology like the car or tomorrow's facial recognition. The title refers to the way any technology, even a broom, can be useful or dangerous, as summed up in Einstein's warning to the League of Nations:
"Technology advances, he cautioned, 'could have made human life carefree and happy if the development of the organizing power of man had been able to keep step with his technical advances'. Instead, 'the hardly bought achievements of the machine age in the hands of our generation are as dangerous as a razor in the hands of a three-year-old child'."
Microsoft's understanding of the impact of technology on society is clear in the book, but occasionally seems recently won. The responsibility of running a worldwide consumer email service is described as something that was 'cast on' Microsoft rather than clearly understood when the company decided to offer it. Similarly, blaming 'pranksters' for turning the Tay bot racist rather underestimates the culture war being waged on some online platforms.
If you're looking for a useful definition of AI and how machine learning differs from earlier approaches, rather than the usual buzzwords and optimism, Tools and Weapons manages to be both precise and understandable. The book also covers the potential impact on jobs and the economy, as well as the enormous potential of open data for medical advancement. Microsoft is clear that regulation is coming here. Smith sets out the principles the company follows today and explores how they could guide regulation, in what might be a blueprint for industry and government working together. That said, the descriptions of working with government in the rest of the book may make that feel a touch optimistic.
As you'd expect from a lawyer, the occasional criticism of governments and other technology companies for dropping the ball on issues of privacy is extremely polite, as is the exquisitely subtle warning to China that accompanies a nuanced discussion of the difference in philosophies. "It can move forward without privacy protection for data within its borders, or it can strengthen economic connections with Europe with the inevitable data flows this will require. But it will become more difficult to do both."
There are countries where Microsoft will not put data centres and countries it has refused to sell facial recognition to, Smith notes, without saying which ones. Hints, like the deliberate decision not to offer Hotmail in China because of the risk to human rights, could lead the reader to wonder whether either of those is China. But he also mentions Microsoft talking a California police force out of using facial recognition on photos of everyone they pull over.
There are some fascinating glimpses at events of the last decade from the inside, from failing to track Daniel Pearl's kidnappers but successfully finding the Charlie Hebdo attackers, to the internal discussions about how to handle the blog post that (incorrectly) suggested ICE was using Microsoft facial recognition, to CEO Satya Nadella deciding that the XP fixes for WannaCry needed to be free for everyone, and working with both the Obama and Trump White Houses.
The obscure history you pick up along the way is mostly American, even if that history is used to illustrate international issues. But you also see Microsoft get an education in the different attitudes to privacy and security across countries. Plus there are fascinating nuggets, like the visiting professor (also a Facebook lawyer) who inspired Max Schrems to take his landmark privacy case to the EU and get the Safe Harbour agreement struck down, by saying that you could do whatever you wanted in the EU because the penalty for breaking privacy law was so trivial it was as if the law wasn't enforced.
Microsoft-watchers also get some details about how the company works internally (especially in the extensive footnotes), including the increasing engagement with employees who have views about company policies, whether that's on AI or political contributions. "Leading a company is becoming more like leading a university," the authors note. And while the company will engage publicly on issues, it won't be vocal on every issue: there needs to be some vital connection to Microsoft, whether that's the impact on customers and their use of technology, or on employees at work and in their wider community, or on the business and its shareholders.
Tools and Weapons paints a picture of a company that has grown up more than others in the technology industry, because of the consent decree and because of technology battles lost -- "We were on the wrong side of history," Smith says of open source. This is the lawyer whose job application to become Microsoft general counsel in 2002 was a single PowerPoint slide giving Gates and Ballmer the solution to their antitrust battles with the US government: "It's time to make peace". He expands that here with: "It takes more bravery and persistence to compromise than to keep fighting". Other companies might be on the verge of learning similar lessons themselves, but they don't have to wait to learn them the hard way.
The book finishes by advising governments to step up and be more active and assertive in regulating technology companies, as well as issuing a clarion call for those companies to get over themselves and get out of their own way by accepting their social responsibilities. The technology industry should stop worrying about over-regulation, start helping with intelligent regulation and make sure not to undermine the democratic freedoms and human rights from which it's benefited so much. Because either way, regulation is coming.