For software developers and designers, "there is a tendency, often subconscious, to be so enamored of the challenge of automating activities and tasks that the human element is ignored. Or, perhaps worse, the human element is used as the escape hatch -- automate whatever can be automated and leave the rest to people."
This is a point raised by Don Norman, founder and director of the Design Lab at the University of California, San Diego and former Apple and HP executive, in an article published in Research-Technology Management.
We develop and design systems with the intention of wringing out human judgment, but we may be losing too much of it in the process. Norman's words of caution are also captured in a recent YouTube video, in which he states that "we want to design technology to be a collaborator, a team worker, with people. And yet, we still think that people are somehow deficient, and we have to replace them with machines."
One of the early criticisms of large-scale enterprise resource planning systems was that they forced people to abandon their own processes and ways of working, pressing them into the ERP system's mold, for better or for worse. That challenge continues to this day with the rise of artificial intelligence, machine learning, robotic process automation, and preset cloud services.
There is also the ongoing assumption by many executives that purchasing expensive systems and dropping them into their enterprises will suddenly convert creaky, calcified organizations into lean, mean, agile operations. Before automation can work its magic, managers and employees need to be part of a forward-looking organization that encourages innovation, autonomy, learning, and open exchanges of ideas.
Everyone has been working hard in recent years to automate as many systems and processes as possible, while simultaneously sabotaging the whole effort. That's because the stated intent of design has been to amplify human work, even as designers try to write humans out of the process. That's counterproductive, Norman asserts.
Norman says that what many systems developers and designers aim to eliminate are "biases," meaning expectations, perceptions, and approaches to accomplishing tasks. Such biases are simply "the way we work." For example, he states, "we have a recency bias -- we tend to be biased by what happened to us most recently. That makes good sense, actually. The most recent events are the ones most likely to be repeated or to impact what is happening."
Instead of trying to wring out all this human judgment, developers and designers should endeavor to amplify it, and build systems around that. Design technology "to take over the parts of a task that people are bad at. Let people decide upon high-level goals and constraints," Norman states. "Rely on machines to supplement the truly astonishing capabilities of humans." If human judgment is swayed too far by recent or unique events, the system can then intervene and say, "here's a suggestion," or "have you considered this other possibility," Norman says. "Human judgments and human decision-making, which most of the time, is extremely powerful, and most of the time, is correct."
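As a loose illustration of this collaborative pattern (a hypothetical sketch, not anything Norman prescribes; the function and data names are invented), a system might surface a suggestion when a decision looks skewed toward only the most recent option, while always leaving the final call to the person:

```python
# Hypothetical sketch of a human-in-the-loop "suggestion" pattern:
# the machine flags a possible recency bias but never overrides the person.

def review_decision(choice, history):
    """Return the human's choice unchanged, plus an optional machine suggestion.

    `history` is a list of past options, oldest first. If the chosen option
    appears only at the very end of the history, the system gently prompts
    reconsideration instead of substituting its own decision.
    """
    suggestion = None
    if history and choice == history[-1] and choice not in history[:-1]:
        suggestion = ("Here's a suggestion: this option has only appeared "
                      "recently. Have you considered earlier alternatives?")
    return choice, suggestion  # the human decision is always returned intact

# Usage: the person decides; the system only comments.
decision, note = review_decision("vendor_c", ["vendor_a", "vendor_b", "vendor_c"])
print(decision)
if note:
    print(note)
```

The design choice mirrors the quote above: the machine's role is limited to "here's a suggestion," and the human's choice passes through untouched whether or not a prompt is raised.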
He urges that developers and designers endeavor to strike a balance between machine efficiency and human creativity:
"Machines process information very quickly, never get bored, and reliably do the things they are designed to do. People excel at tasks requiring the exercise of creativity, a response to unexpected situations, or general attentiveness to the entire surrounding environment. A truly powerful automation approach takes these different strengths into account to create a superior, collaborative system."