
Predicting the next decade of tech: From the cloud to disappearing computers and the rise of robots

Making short-term decisions about technology investment is relatively easy; trying to work out how IT will develop over the next decade is much harder.
Written by Steve Ranger, Global News Director

For an industry run according to logic and rationality, at least outwardly, the tech world seems to have a surprising weakness for hype and the "next big thing".

Perhaps that's because, unlike -- say -- in sales or HR, where innovation is defined by new management strategies, tech investment is very product driven. Buying a new piece of hardware or software often carries the potential for a "disruptive" breakthrough in productivity or some other essential business metric. Tech suppliers therefore have a vested interest in promoting their products as vigorously as possible: the level of spending on marketing and customer acquisition by some fast-growing tech companies would turn many consumer brands green with envy.

As a result, CIOs are tempted by an ever-changing array of tech buzzwords (cloud, wearables and the Internet of Things [IoT] are prominent in the recent crop) through which they must sift in order to find the concepts that are a good fit for their organisations, and that match their budgets, timescales and appetite for risk. Short-term decisions are relatively straightforward, but the further you look ahead, the harder it becomes to predict the winners.

Tech innovation in a one-to-three year timeframe

Despite all the temptations, the technologies that CIOs are looking at deploying in the near future are relatively uncontroversial -- pretty safe bets, in fact. According to TechRepublic's own research, top CIO investment priorities over the next three years include security, mobile, big data and cloud. Fashionable technologies like 3D printing and wearables find themselves at the bottom of the list.

A separate survey from Deloitte reported similar findings: many of the technologies that CIOs are piloting and planning to implement in the near future are ones that have been around for quite some time -- business analytics, mobile apps, social media and big data tools, for example. Augmented reality and gamification were seen as low-priority technologies.

This reflects the priorities of most CIOs, who tend to focus on reliability over disruption: in TechRepublic's research, "protecting/securing networks and data" trumps "changing business requirements" for understandably risk-wary tech chiefs.

Another major factor here is money: few CIOs have a big budget for bets on blue-skies innovation projects, even if they wanted to make them. (And many no doubt remember the excesses of the dotcom years, and are keen to avoid making that mistake again.)

According to the research by Deloitte, less than 10 percent of the tech budget is ring-fenced for technology innovation (and CIOs that do spend more on innovation tend to be in smaller, less conservative companies). There's another complication: CIOs increasingly don't control the budget dedicated to innovation, as it is handed over to other business units (such as marketing or digital) that are considered to have a more entrepreneurial outlook.

CIOs tend to cite their boss's conservative attitude to risk as the biggest constraint on making riskier IT investments for innovation and growth. And although CIOs claim to be willing to take risks with IT investments, that attitude does not appear to be matched by their current project portfolios.

Another part of the problem is that it's very hard to measure the return on some of these technologies. Managers have been used to measuring the benefits of new technologies using a standard return-on-investment measure that tracks some very obvious costs -- headcount or spending on new hardware, for example. But defining the return on a social media project or an IoT trial is much more slippery.
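To make that contrast concrete, here is a minimal sketch -- in Python, with invented project names and figures purely for illustration -- of why the classic return-on-investment formula works well when costs and gains are tangible, but tells you much less when the "gain" of an IoT trial or social media project is itself a guess.

    # Illustrative only: the project names and figures below are invented.
    def roi(gain, cost):
        """Classic return on investment: (gain - cost) / cost."""
        return (gain - cost) / cost

    # A hardware refresh has countable inputs: kit bought, headcount saved.
    print(f"Hardware refresh ROI: {roi(gain=180_000, cost=120_000):.0%}")  # 50%

    # For an IoT trial, the 'gain' term is the slippery part: customer
    # insight or future optionality don't reduce cleanly to one cash
    # figure, so the formula is only as good as the estimate fed into it.
    estimated_gain = 90_000  # a guess, not a measurement
    print(f"IoT trial ROI (from an estimate): {roi(estimated_gain, 75_000):.0%}")  # 20%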

Tech investment: A medium-term view

If CIO investment plans remain conservative and hobbled by a limited budget in the short term, you have to look a little further out to see where the next big thing in tech might come from.

One place to look is in what's probably the best-known set of predictions about the future of IT: Gartner's Hype Cycle for Emerging Technologies, which tries to assess the potential of new technologies while taking into account the expectations surrounding them.

The chart grades technologies not only by how far they are from mainstream adoption, but also on the level of hype surrounding them, and as such it demonstrates what the analysts argue is a fundamental truth: that we can't help getting excited about new technology, but that we also rapidly get turned off when we realize how hard it can be to deploy successfully. The exotically-named Peak of Inflated Expectations is commonly followed by the Trough of Disillusionment, before technologies finally make it up the Slope of Enlightenment to the Plateau of Productivity.

"It was a pattern we were seeing with pretty much all technologies -- that up-and-down of expectations, disillusionment and eventual productivity," says Jackie Fenn, vice-president and Gartner fellow, who has been working on the project since the first hype cycle was published 20 years ago, which she says is an example of the human reaction to any novelty.

"It's not really about the technologies themselves, it's about how we respond to anything new. You see it with management trends, you see it with projects. I've had people tell me it applies to their personal lives -- that pattern of the initial wave of enthusiasm, then the realisation that this is much harder than we thought, and then eventually coming to terms with what it takes to make something work."

The 2014 Gartner Hype Cycle for Emerging Technologies. Image: Gartner

According to Gartner's 2014 list, the technologies expected to reach the Plateau of Productivity (where they become widely adopted) within the next two years include speech recognition and in-memory analytics.

Technologies that might take two to five years until mainstream adoption include 3D scanners, NFC and cloud computing. Cloud is currently entering Gartner's Trough of Disillusionment, where early enthusiasm is overtaken by the grim reality of making this stuff work: "there are many signs of fatigue, rampant cloudwashing and disillusionment (for example, highly visible failures)," Gartner notes.

When you look at a 5-10-year horizon, the predictions include virtual reality, cryptocurrencies and wearable user interfaces.

Working out when the technologies will make the grade, and thus how CIOs should time their investments, seems to be the biggest challenge. Several of the technologies on Gartner's first-ever hype cycle back in 1995 -- including speech recognition and virtual reality -- are still on the 2014 chart, without yet having made it to primetime.

The original 1995 Hype Cycle for Emerging Technologies. Image: Gartner

These sorts of user interface technologies have taken a long time to mature, says Fenn. For example, voice recognition started to appear in very structured call centre applications, while the latest incarnation is something like Siri -- "but it's still not a completely mainstream interface," she says.

Nearly all technologies go through the same rollercoaster ride, because our response to new concepts remains the same, says Fenn. "It's an innate psychological reaction -- we get excited when there's something new. Partly it's the wiring of our brains that attracts us -- we want to keep going around the first part of the cycle where new technologies are interesting and engaging; the second half tends to be the hard work, so it's easier to get distracted."

But even if they can't escape the hype cycle, CIOs can use concepts like this to manage their own impulses: if a company's investment strategy means it's consistently adopting new technologies when they are most hyped (remember a few years back when every CEO had to blog?), then it may be time to reassess, even if peer pressure among CIOs makes that difficult.

Says Fenn: "There is that pressure, that if you're not doing it you just don't get it -- and it's a very real pressure. Look at where [new technology] adds value and if it really doesn't, then sometimes it's fine to be a later adopter and let others learn the hard lessons if it's something that's really not critical to you."

The trick, she says, is not to force-fit innovation, but to continually experiment and not always expect to be right.

Looking further out, the technologies labelled as "more than 10 years" from mainstream adoption on Gartner's hype cycle are the rather sci-fi-inflected ones: holographic displays, quantum computing and human augmentation. The full list is a surprisingly entertaining romp through the relatively near future of technology, from the rather mundane to the completely exotic. "Employers will need to weigh the value of human augmentation against the growing capabilities of robot workers, particularly as robots may involve fewer ethical and legal minefields than augmentation," notes Gartner.

Where the futurists roam

Beyond the 10-year horizon, you're very much into the realm where the tech futurists roam.

Steve Brown, a futurist at chip-maker Intel, argues that three mega-trends will shape the future of computing over the next decade. "They are really simple -- it's small, big and natural," he says.

"Small" is the consequence of Moore's Law, which will continue the trend towards small, low-power devices, making the rise of wearables and the IoT more likely. "Big" refers to the ongoing growth in raw computing power, while "natural" is the process by which everyday objects are imbued with some level of computing power.

"Computing was a destination: you had to go somewhere to compute -- a room that had a giant whirring computer in it that you worshipped, and you were lucky to get in there. Then you had the era where you could carry computing with you," says Brown.

"The next era is where the computing just blends into the world around us, and once you can do that, and instrument the world, you can essentially make everything smart -- you can turn anything into a computer. Once you do that, profoundly interesting things happen," argues Brown.

With this level of computing power comes a new set of problems for executives, says Brown. The challenge for CIOs and enterprise architects: once they can make everything smart, what do they want to use it for? "In the future you have all these big philosophical questions that you have to answer before you make a deployment," he says.

Brown envisages a world of ubiquitous processing power, where robots are able to see and understand the world around them.

"Autonomous machines are going to change everything," he claims. "The challenge for enterprise is how humans will work alongside machines -- whether that's a physical machine or an algorithm -- and what's the best way to take a task and split it into the innately human piece and the bit that can be optimized in some way by being automated."

The pace of technological development is accelerating: where we used to have a decade to make these decisions, these things are going to hit us faster and faster, argues Brown. All of which means we need to make better decisions about how to use new technology -- and will face harder questions about privacy and security.

"If we use this technology, will it make us better humans? Which means we all have to decide ahead of time what do we consider to be better humans? At the enterprise level, what do we stand for? How do we want to do business?".

Not just about the hardware and software

For many organizations there's a big stumbling block in the way of this bright future -- their own staff and their ways of working. Figuring out what to invest in may be a lot easier than persuading staff, and whole organisations, to change how they operate.

"What we really need to figure out is the relationship between humans and technology, because right now humans get technology massively wrong," says Dave Coplin, chief envisioning officer for Microsoft (a firmly tongue-in-cheek job title, he assures me).

Coplin argues that most of us tend to use new technology to do things the way we've always done them, when the point of new technology is to enable us to do things fundamentally differently. The concept of productivity is a classic example: "We've got to pick apart what productivity means. Unfortunately most people think process is productivity -- the better I can do the processes, the more productive I am. That leads us to focus on the wrong point, because actually productivity is about leading to better outcomes." Three-quarters of workers think a productive day in the office is clearing their inbox, he notes.

Developing a better relationship with technology is necessary because of the huge changes ahead, argues Coplin: "What happens when technology starts to disappear into the background; what happens when every surface has the capability to have contextual information displayed on it based on what's happening around it, and who is looking at it? This is the kind of world we're heading into -- a world of predictive data that will throw up all sorts of ethical issues. If we don't get the humans ready for that change we'll never be able to make the most of it."

Nicola Millard, a futurologist at telecoms giant BT, echoes these ideas, arguing that CIOs have to consider not just changes to the technology ahead of them, but also changes to the workers: a longer working life requires workplace technologies that appeal to new recruits as well as to staff working into their 70s and beyond. It also means rethinking the workplace: "The open-plan office is a distraction machine," she says -- but can you be innovative in a grey cubicle? Workers using tablets might prefer "perch points" to desks, while those using gesture control may need more space. Even the role of the manager may change -- becoming less about traditional command and control, and more about being a "party host" who finds the right mix of skills to get the job done.

In the longer term, not only will the technology change profoundly, but the workers and managers themselves will also need to upgrade their thinking.
