As IT consultants and futurists who fear that, in the past, they have avoided difficult questions in their enthusiasm for technology, Paul Roehrig and Ben Pring are trying to distil the entire modern world into a somewhat simplistic formula: that the economic incentives for some kinds of technology are out of balance, and that's dragging everything down.
"Once cool disruptive 'tech rock stars' are being exposed as nothing more than the latest robber barons", they say. The security of cars, pacemakers and elections is poor (although driverless technology is apparently "working very well"), while democracy, privacy and being polite to other people are all going out of fashion.
Decrying the loss of civility, blaming social media echo chambers rather than societal inequities, and talking about income inequality as if it were produced only by technology rather than by socioeconomic systems all suggest that technology is somehow created outside society, rather than being all-too-intimately enmeshed with it. Some interesting questions about the role of technology in society are obscured by the authors' enthusiasm for new technology like quantum computing, and by the dystopian fantasies they entertain about the impact of the technology we already have.
Treating Amazon, Apple, Facebook, Google and Microsoft as if they all have the identical business model of "snorkel[ing] code from every move we make" simply because they have stock market valuations that outweigh most other companies ignores the different impacts they have, and the different issues that will need to be addressed in dealing with them.
The authors rightly point out that widely used technologies are developed in relatively few countries, which may be driving a global power shift. But there's no discussion of what it means if tech giants gain some of the powers of nation states, or how bytes might have a different impact from bullets in terms of how their influence is applied.
There's no mention of Russia or ransomware in the book at all (except for noting that Ukraine attracts an unusual level of cyberattacks), and no analysis of where the line of separation might fall between the Chinese government, whose approach Roehrig and Pring dub 'surveillance communism', and Chinese technology companies.
The usual misunderstanding of the original Luddites -- who were protesting not the machinery itself but the business models of the mill owners who refused to share the fruits of improved productivity with workers, and targeted their destruction appropriately -- actually undermines the point the authors try to make about the drivers of modern Luddism: inequality and exclusion caused by the irresponsible deployment of technology.
Cyber war & social tech addiction
Suggesting we're already engaged in a cyber war, given the current level of attacks, ransomware and nation-state hacking, would be more plausible if the authors didn't maintain that Advanced Persistent Threats (APTs) are "technologically very advanced" when they often target very basic security mistakes and long-patched vulnerabilities. Talking about how poorly security is implemented across government, industry and society isn't nearly as exciting as talking about Stuxnet and hackers in basements, but it would paint a truer picture of the issues.
Despite admitting there's "no solid causal link between tech and our aching heads yet", the authors spend a chapter calling smartphones and social media "digital fentanyl", suggesting that social technology is an addiction that's destroying a generation of children and claiming tech is changing how our minds work. Evolutionary psychology combines with nostalgia for the days when commuters were staring at newspapers rather than phones, resulting in the usual suggestions about limiting your screen time. After the last 18 months, asserting that community, faith and friendship can't be found online is as unhelpful as the latest 'technology rock stars' announcing that there's an app for mindfulness. It might also be more useful to explain how Elon Musk's Neuralink isn't actually that revolutionary compared to existing medical devices than to announce that it's the equivalent of Theranos.
In the middle of all this, there's a fictional account of a naïve and inflammatory startup that will confirm the prejudices of everyone who dislikes Facebook without ringing true to anyone with actual startup experience.
Similarly, the book ends with a poorly conceived 'debate' between the two authors -- about whether we shouldn't just turn this whole disturbing internet social media thing off -- that would get roundly ratioed if they performed it on social media. It may be intended to satirise the kind of inconsequential arguments often found online, since it's formatted as if it were a sequence of texts or private messages (without noting the irony), but a more comprehensive chapter would be welcome. The potted history of guns in Japan is mildly interesting, but it ends the book on a strangely flat note that makes you long for the substance of an expert explaining their field in a Twitter thread.
Manifesto, or wish-list?
What you would hope would be the meat of the book -- a manifesto for 'taming the machines' -- is more of a wish-list. You'll probably skim past the actual suggestions for how to tackle the very real problems Roehrig and Pring are rightly concerned about in the introduction, unless you're used to the way executive reports put the actionable items right at the beginning. The suggestions range from the sensible (legislation for data portability and audits of algorithms) to the knee-jerk (overriding anonymity on social media, doing away with Section 230 and requiring a 'driver's licence' to get on social media at the age of 18).
The discussion of the complex and difficult task of regulating technology is probably the most realistic part of the book. However, it's disappointing that the authors' obvious concern and desire to provoke a reaction leads them to focus more on listing the harms that technology has already created, rather than digging further into the "many types of law, policy and regulation: net neutrality, privacy, patent and IP law, taxation, data protection, industry regulation, AI ethics, labor laws, health data laws, job licensure [and] sharing economy regulation".
It might be harder to enliven these critical but "mind-numbingly dull" issues than to point out that Facebook makes a lot of money and that it's hard to stop your family accessing TikTok. But doing so would make for a more meaningful discussion about 'Taming the Machines'.