
Will speed kill the next tech revolution?

It took decades to develop and popularise our now-familiar desktop user interfaces. Do we still have the patience to develop the next interface revolution?
Written by Stilgherrian, Contributor

The idea of a constantly accelerating pace of change isn't exactly new. Theorists were talking about it in the 1930s, but the phenomenon began in the industrial revolution, or even earlier. Now, in the second decade of the 21st century, the idea is so deeply rooted in our culture that we take it for granted.

Accelerating change is also reflected in the way we build new technology, and in the companies that deliver it. The waterfall model of the 1970s, with software release cycles measured in months or years, is being replaced by 21st-century agile methods, with cycles of just a few weeks.

Most tech companies still have three-year financial plans, because the bean counters demand them, but the executives are really working to quarterly good-news reporting targets. Increasingly, they're responding to data from real-time dashboards and a stock market being hit with high-frequency trading algorithms that buy and sell shares in fractions of a second.

Given all this, it's no surprise that hackfests have become so popular. In just two or three days, often with no sleep but plenty of stimulants, ideas are formulated, prototyped, and judged — sometimes by potential investors.

And need I mention that in the tech media, we journalists and commentators are expected to report on and analyse developments within hours or even just minutes? Or, even worse, as a live blog? What does this say about the quality of the "considered analysis" you read?

Now, none of this is news. But I wanted to emphasise how pervasive this faster-faster attitude is in our industry — because I think it has the potential to stifle truly revolutionary innovation, rather than the everyday incremental kind.

Google's Larry Page was on the right track when he and co-founder Sergey Brin took back control of the company in 2011, aiming to reintroduce a longer-term focus and break the habit of giving Wall Street a high-dollar treat every quarter.

"We have always tried to concentrate on the long term, and to place bets on technology we believe will have a significant impact over time," Page told investors in his 2012 report.

"It's hard to imagine now, but when we started Google, most people thought search was a solved problem and that there was no money to be made apart from some banner advertising. We felt the exact opposite: That search quality was very poor, and that awesome user experiences would clearly make money."

The same happened when Google launched Chrome in 2008. Who needed yet another web browser? When Gmail was launched in 2004, people thought webmail was a toy. Google's purchase of YouTube in 2006 was greeted with scepticism, Page said.

Google isn't the only company that manages to look ahead a few years, of course, but this isn't the place to start making a list. Even so, I think Google and companies like it are still only half right.

Truly revolutionary technology doesn't take years to develop. It takes decades.

Earlier this month, computing pioneer Alan Kay spoke to Time about his original early-1970s vision for personal computing, Project Dynabook. He also discussed the work at Xerox's Palo Alto Research Center (PARC) that created the familiar Mac and PC desktop we see today — bitmapped graphics, overlapping windows, drop-down menus, icons, file folders, mouse control, and so on — as well as Ethernet networking.

We've been using this technology for more than two decades, since Apple's Macintosh in 1984 and Microsoft's Windows 3.1 in 1992, but it took a long time to get there.

"It took 12+ years of funding to create personal computing and pervasive networking, and this only happened because there was a wise and good funder (ARPA-IPTO). If we include commercialisation, this took a little more than 20 years (from 1962 to 1984, when the Mac appeared). It's important to realise that no one knew how difficult a problem this was, but it was seen as doable and the funder hung in there," Kay said.

And even then, Kay's Dynabook vision has never been fulfilled. "Dynabook [was intended] to be able to simulate all existing media in an editable/authorable form in a highly portable networked (including wireless) form ... For all media, the original intent was 'symmetric authoring and consuming'. Isn't it crystal clear that this last and most important service is quite lacking in today's computing for the general public?"

Kay believes the next generation of interfaces will include "helpful agents" running on sophisticated artificial intelligence (AI). A mega-Siri, if you like. "AI is a difficult problem, but solvable in important ways ... It's likely that 'good AI' is a 15- to 20-year problem at this point. But the only way to find out is to set up a national effort and hang in there with top people."

But will we ever see that kind of effort in the age of the hackfest and quarterly earnings calls?
