In praise of inefficient technology and creative mistakes

Summary: When you automate a system designed around messy human beings, the resulting efficiency can make everyone unhappy


Automation makes things faster, cheaper and more efficient; that's great if you want the same thing every time and if you want to free up people from the boring bits (creating invoices, grinding out components, testing code for bugs) for the creative parts (working out a new business idea, designing the car the components go in, coming up with the idea for what the code will do).

Automation was the basis of the industrial revolution and the way technology has transformed business. But when inhuman efficiency appears in systems that have grown up around messy, inefficient humans, the side effects are often painful.

That's why technology and privacy are so often opposed. If I say something in the local bar or on the train, people might overhear — but that doesn't stop people having very personal conversations there.

I've been unable to avoid overhearing about people's illnesses, relationship issues and job interviews. Sit in the coffee shop of a hotel where a conference is going on and you can almost guarantee you'll be able to listen in on a few interviews.

It's not usually seen as a privacy issue because it's not very efficient. As long as you look around to make sure your boss isn't behind you at the bar while you're chatting away about your plan to replace him, you don't usually have to worry about what you say coming back to bite you.

Even someone eavesdropping and typing it into Twitter doesn't mean your conversation will necessarily reach anyone you'd be embarrassed to have know about it.

Humans aren't efficient enough to hear everything, or remember it perfectly. But technology is. Whether it's an ill-considered comment about your job on Facebook, a picture that ends up on a revenge site or full-scale NSA surveillance, when automated efficiency hits human behaviour there are unintended consequences.

The same is true in capitalism, as Bill Janeway points out in his excellent book Doing Capitalism in the Innovation Economy. When you make a company or a process efficient, not all of the waste you take out is actual, no-use-to-anyone Keynesian waste; some of it is Schumpeterian waste, the side effects of which are useful developments like the internet, Teflon frying pans and Ethernet.

Those came out of the 'wasteful' investments at ARPA and NASA and Xerox PARC. 3M didn't set out to create Post-it notes; an inventor had a glue that didn't work very well and came up with another use for it.

An overwhelming proportion of startups fail, but along the way enough of them act like tiny R&D teams that we get new developments. At the other end of the scale are research labs like MSR and HP Labs, which look ten or fifteen years ahead and appear to go in hundreds of different directions, many of which don't end up as products.

You wouldn't have called IBM's chess-playing computers a sensible investment, but Deep Blue was part of the path to Watson and machine learning APIs that any developer could use (because not every developer is going to be able to work out how to do good machine learning algorithms on their own).

A perfectly efficient system is sterile and implacable. It doesn't allow space for creative mistakes, for waste that turns out to be useful, or for the funny ways people feel and react. We're spending a lot of time making things efficient and not that much time thinking about what that means for the humans involved.


Topics: Privacy, Emerging Tech

Mary Branscombe

About Mary Branscombe

Mary Branscombe is a freelance tech journalist. Mary has been a technology writer for nearly two decades, covering everything from early versions of Windows and Office to the first smartphones, the arrival of the web and most things inbetween.

  • Well spoken, madam

    The best meditation on the issue of privacy and computing that I've seen in a while.
    John L. Ries
  • PARC is an altogether different story

    Most of what PARC worked on ended up as a home run for someone at some point, sometimes Xerox, sometimes others.

    There's the famous things, like Laser printers and the GUI. But there's also ethernet, and page description languages, the ancestor of PDF and XPS files.... John Warnock helped develop InterPress, a PDL similar to PostScript and the granddaddy of PDF. He eventually went off and founded Adobe in order to maximize the value of the insights he gained at PARC.
    • Ethernet was designed for laser printers

You needed it to move the big pages to the printer fast enough. Basic research then, as it always does, goes on to enable far wider innovations.
  • Taylorism run amok

The religion of efficiency comes from Frederick Taylor, who tried to eliminate all wasted motion from assembly lines. His failure was in not understanding that a certain amount of "inefficiency" is good for many different reasons. Micro-efficiency, paradoxically, can often hurt macro efficiency.
    • I suppose we should thank Mr. Taylor...

      ...for inspiring the modern workers' compensation system, which provides a disincentive to increasing efficiency in ways that increase the risk of injury to workers.

      But yes, it's very easy to take efficiency too far. And scientific approaches to management often fail to ask the right questions, resulting in plausible, but wrong answers.
      John L. Ries
  • Efficiency And the Long View

    Don't usually go together. If something isn't paying off NOW, it should be abandoned, seems to be the philosophy, and today's hyper-reactive always-connected culture only facilitates that more - people react to information as soon as they have it, and companies try to accommodate that.

    Say what you will about transparency, but when customers know a lot about what's going on, they try to exert influence on processes they have - at best - an extremely shaky understanding of. And we let them, in the name of immediate results.
    luke mayson