No, I am not talking about a new ride at Disney World. I am talking about head-scratching research from my new book, Silicon Collar. It's about how slowly technology truly evolves and then how slowly societies absorb it. We are all excited about "digital transformation," but in reality, we still live in a pretty analog world.
First, some background. For the book, I interviewed executives at more than 50 work settings across industries about how automation technologies (machine learning, robotics, wearables, etc.) are transforming jobs. The practitioners I interviewed talked about using technology to improve productivity and product quality. They were pragmatic and generally optimistic. I also found a contrasting sense of pessimism, indeed panic, in the academic, analyst, and economist world about machines causing "jobless futures."
Why are so many smart people so pessimistic? Being a technology enthusiast for decades, I wondered what I might be missing. I found inspiration in something I had heard from Bill Joy, one of the co-founders of Sun Microsystems. Joy, who was once described by Fortune as the "Edison of the Internet," had this guidance: "If you cannot solve a problem, make the problem bigger. If you draw a bigger circle, you start to see several systems you can work on."
In the case of my book, drawing a bigger circle meant looking at how technology has gradually rolled out over the last century, not just in the last few years. And that is where I found all kinds of counter-intuitive data about our continuing analog lifestyles. Here are a few of the examples I included in the book:
Luther Simjian, a prolific inventor, first convinced some New York City banks in 1960 to try out his Bankograph. That was the predecessor of the modern-day ATM. Six decades later, and even with mobile banking taking off, who would have thought we would still have branch banking with human tellers? We still have over 90,000 bank branches across the U.S., employing over half a million tellers and other employees.
The UPC code and scanner were patented in 1952. I describe in the book the two decades it took the grocery industry to start to widely adopt it. The end result was not a loss of checkout jobs. It improved inventory control and led to an explosion of SKUs - Kellogg brought out hundreds of cereal varieties, Campbell tens of soup varieties. Grocery sales grew, and jobs expanded. Even today, with self-checkout kiosks in many stores, those jobs have not disappeared.
We have been predicting the death of "snail mail" for decades as email, texting, Skype, Facetime, and social media have become our preferred methods of communicating with each other. Yet, reports of the demise of the U.S. Postal Service have been decidedly premature. Unbelievably, its business has gone up - it sorts half the world's paper-based mail and packages and keeps over half a million workers employed.
Bob Brown envisioned the e-book reader in his 1930 book, The Readies. In the decades since, a series of readers has come from PARC, Sony, and others. Recently, the Amazon Kindle has allowed the digital format to really take off. Yet, printed books (and related printing, shipping, and other jobs) still account for two-thirds of all books sold in the U.S. and an even higher percentage overseas.
With digital voice mailboxes and most of us doing our own word processing and travel arrangements, who needs secretaries or administrative assistants? Well, according to the Bureau of Labor Statistics, that category still employs nearly four million workers in the U.S.
Think travel agents have gone the way of the dodo? The BLS reports we still have 75,000 in the U.S. And around the globe many more continue to make a decent living in a world where over 40% of travel reservations are not made on the Internet.
You could argue the next wave of technologies will be different. Let's explore some of those next.
Machines have been taking over driving for a long time now. Chrysler first introduced cruise control to the masses in 1958, and in 1992, Mitsubishi introduced the LIDAR-based distance detection system, a building block of many of today's driverless prototypes. In 1999, Mercedes introduced Distronic assistive cruise control to the world. The DARPA-funded Grand Challenges for driverless cars were first held in 2004. Back in 1965, Playboy highlighted "Bye-Bye Stick Shift." Now, over 50 years later, more than half of the light cars sold outside the U.S. still have manual transmissions. Think how long it will take the world to adapt to driverless cars.
We are even more excited about machine learning. We forget in 1950 Alan Turing defined his famous test to measure a machine's ability to exhibit intelligent behavior equivalent to that of a human. In 1959, we got excited when Allen Newell and his colleagues coded the General Problem Solver. In 1968, Stanley Kubrick sent our minds into overdrive with HAL in his movie, 2001: A Space Odyssey. We applauded when IBM's Deep Blue supercomputer beat Grandmaster Garry Kasparov at chess in 1997. We were impressed in 2011 when IBM's Watson beat human champions at Jeopardy! and again in 2016 when Google's AlphaGo showed it had mastered Go, the ancient board game. Currently, we are so excited about Amazon's Echo digital assistant/home automation hub and its ability to recognize the human voice, that we are saying a machine has finally passed the Turing Test. Almost. Yann LeCun, director of AI research at Facebook, has commented, "Despite these astonishing advances, we are a long way from machines that are as intelligent as humans--or even rats. So far, we've seen only 5% of what AI can do."
Ditto with robotics. The first humanoid robot appeared in Japan in 1928. It could do simple motions like move a pen with its right hand. Today, Japan is the leading maker and consumer of robots, accounting for half of the world's production. Naturally, it has the world's largest concentration of robot engineers. Yet, these world-leading experts have tried for five years following the Tohoku earthquake to use robots to clean the radiation at the Fukushima nuclear plant. So far, all the robots sent into the reactors have failed to return.
Apple did a remarkable job selling the iPhone around the globe. That success has led us to talk about how product adoption curves are accelerating. Actually, the iPhone was a significant aberration.
One of the most innovative companies in the world, with over 55,000 products, 3M keeps a close eye on new product introductions. It coined the term New Product Vitality Index (NPVI) to measure the percentage of 3M sales coming from products launched in the previous five years. In 2015, it reported its NPVI as 33.3%. Bear in mind, this is one heck of an innovative company, and even a product launched five years ago is considered "new." For the average company, customer adoption of its new products is considerably slower. And even Apple will struggle to repeat the success of the iPhone with other products.
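For readers who want the metric spelled out: NPVI is simply revenue from products launched within the trailing five-year window, divided by total revenue. Here is a minimal sketch of that arithmetic; the product catalog and sales figures are hypothetical illustrations, not 3M's actual data.

```python
from datetime import date

def npvi(products, as_of, window_years=5):
    """New Product Vitality Index: the share of total sales coming
    from products launched within the last `window_years` years."""
    cutoff = date(as_of.year - window_years, as_of.month, as_of.day)
    total_sales = sum(p["sales"] for p in products)
    new_sales = sum(p["sales"] for p in products if p["launched"] >= cutoff)
    return new_sales / total_sales

# Hypothetical catalog: launch date and annual sales (in $M) per product
catalog = [
    {"launched": date(2013, 6, 1), "sales": 200},   # inside the 5-year window
    {"launched": date(2012, 1, 15), "sales": 100},  # inside the window
    {"launched": date(2004, 3, 1), "sales": 600},   # legacy product
]

# 300 of 900 in "new" sales -> NPVI of one third
print(round(npvi(catalog, as_of=date(2015, 12, 31)), 3))  # 0.333
```

The point of the metric is the denominator: a high NPVI does not just require launching products, it requires customers to shift real revenue onto them within five years, which is exactly where adoption lags show up.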
Rug weavers in Turkey, Gaelic teachers in Ireland, glass blowers in Murano, Italy, Shakespearean theater around the world, artisan wares on sites like Etsy and Novica - artisans thriving in the midst of automation. Mankind has a long history of cherishing both the handmade and the machine-made. We will never get tired of the human touch, no matter how imperfect it may seem compared to the work of machines. More importantly, when man and machine collaborate for excellence, people line up for blocks to pay tribute.
Societies conspire to stay analog. And in many ways that's not too bad. It keeps humans employed far longer than the doomsayers predict.
Vinnie Mirchandani, founder of Deal Architect, is a former analyst (with Gartner), outsourcing executive (with PwC, now part of IBM), and entrepreneur. He has personally helped clients evaluate and negotiate technology contracts valued in excess of $10 billion and has advised many executives on innovative IT strategies. Silicon Collar is his sixth book. Email him at email@example.com and follow him on Twitter @dealarchitect.