
The next wave of IT: Where do we go from here?

Everyone's talking about digital transformation, but what do they really mean?
Written by Simon Bisson, Contributor

One recurring theme in the conversations I've been having over the last few months is the idea of "digital transformation".

Yes, it's a buzzword, but it's an interesting one as it encapsulates a lot of different ideas.

As I hear different explanations from different companies, I've come to think of it as a useful piece of shorthand. Beneath the various definitions is a consistent theme: it's as much about getting people to think about the current shift in the underpinnings of modern IT as it is about the business impact of those changes.

Having some form of shorthand is important, as there's a lot happening, and it makes sense to wrap things up in a way that's easier to understand. So what is this shift?

Like most of the changes in IT over the last four decades or so, it's a combination of things, building on previous generations of technology and taking advantage of the world we've already built.

The first wave of change was the arrival of the mainframe, and with it the beginning of our ability to automate business processes and services.

It was followed by the PC and the inter-office network, giving us the tools we needed to build client-server applications that split software into user interfaces, business logic, and data. That in turn led us into a world where the arrival of a global network and the web browser gave birth to today's mobile computing landscape, where we dip into compute wherever we are, taking advantage of distributed n-tier architectures.

If you look at today's IT landscape you can quickly see where we've come from, but now it's also possible to see where we're going, and the trajectories that various parts of the industry are taking to get there.

Those trajectories are best seen in the things I've been writing about for the last few years: microservices, APIs, cloud architectures, containers, the internet of things, cross-platform mobile development, serverless computing, and the rise of low-code development techniques. Above all, though, there's one thread that ties this wave of change together: the cloud as a way of thinking about infrastructure.

More than 70 years ago, back in 1943, IBM's then-president Thomas J. Watson is said to have told an audience that there was "a world market for maybe five computers". While there's no evidence he ever actually said it, it's still held up as one of the worst predictions of the future.

After all, aren't there computers everywhere now?

Whoever said it, they were in a world where computers were immense devices, powered by vacuum tubes and designed to solve the very largest of problems. From that point of view, today's ubiquitous computers are just extensions of a new planetary computer, one we're building around hyperscale clouds in data centres all over the world.

And if we look at the world through that new lens there are only five computers that really matter: Amazon's AWS, Google's Cloud, Microsoft's Azure, Facebook, and (probably) Baidu.

Everything else is, or very soon will be, just an extension or a reflection of those hyperscale systems into our pockets, onto our desks, and into our own private clouds.

That's where things get very interesting, as it's a place where the ways we've built software in the past start breaking down.

The tightly coupled, procedural, synchronous computing models we've been using for decades don't stop working overnight, but where they do still work, they become inefficient.
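To see why, it helps to contrast the two styles side by side. The sketch below is illustrative only: the order-processing names and in-memory queue are hypothetical stand-ins for remote services and a real message broker, but they show how a loosely coupled, event-driven design stops a caller from blocking on every downstream dependency.

```python
import asyncio
import time

# --- Tightly coupled, synchronous style -------------------------------------
# Each step blocks the caller, and every downstream service has to be up and
# reachable for the whole request to succeed.

def charge_card(order):                  # stand-in for a remote payment call
    time.sleep(0.1)
    return {"order": order, "paid": True}

def book_courier(payment):               # stand-in for a remote shipping call
    time.sleep(0.1)
    return {"order": payment["order"], "shipped": True}

def place_order_sync(order):
    payment = charge_card(order)         # the caller waits here...
    return book_courier(payment)         # ...and again here

# --- Loosely coupled, event-driven style -------------------------------------
# The caller simply records that something happened; independent workers pick
# the event up whenever they have capacity, which is what lets a scheduler
# pack work onto shared, elastic infrastructure.

async def place_order_async(order, queue):
    await queue.put({"event": "order.placed", "order": order})
    return "accepted"                    # respond at once; the work happens later

async def order_worker(queue):
    while True:
        event = await queue.get()
        await asyncio.sleep(0.1)         # simulate payment and shipping work
        print("processed", event["order"])
        queue.task_done()

async def main():
    queue = asyncio.Queue()
    worker = asyncio.create_task(order_worker(queue))
    for order in ("A-1", "A-2", "A-3"):
        print(await place_order_async(order, queue))
    await queue.join()                   # wait for the worker to drain the queue
    worker.cancel()

if __name__ == "__main__":
    print(place_order_sync("A-0"))
    asyncio.run(main())
```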

The new models are also associated with a new set of endpoints: not just the familiar PCs and smartphones, but also wearable devices, wall screens, and a whole host of IoT hardware, from Amazon's Echo to Apple's Watch to the screens in your car.

One aspect of this shift is that it no longer matters where an application is running. Thanks to virtualised userlands via containers, the same code can run on a phone, on a PC, or on a cloud server - and now it can also run in the network, thanks to container support in the latest core routers and switches. We've virtualised not just compute and storage; virtualised networks are at the heart of our modern clouds. User interfaces can take advantage of flexible web technologies, bringing responsive design to all our platforms.
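As a small, purely illustrative sketch of that placement independence (the service, port, and backend variable here are hypothetical), the same few lines can run unchanged on a laptop, in a container on a server, or behind a cloud load balancer, because everything that varies is supplied by the environment rather than baked into the code:

```python
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

# The only thing that changes between a laptop, a container on a PC, a cloud
# VM, or a container running inside network hardware is the environment the
# process is handed - the code itself stays identical.
PORT = int(os.environ.get("PORT", "8080"))
BACKEND = os.environ.get("BACKEND_URL", "http://localhost:9000")

class Health(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"status": "ok", "backend": BACKEND}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Bind to whatever address and port the surrounding environment provides.
    HTTPServer(("0.0.0.0", PORT), Health).serve_forever()
```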

Much of what we think of as the cloud is only a stepping stone to this future. Yes, Infrastructure as a Service is an important tool for extracting our applications from our data centers, but it's still just the n-tier model we've used for the last fifteen years or so.

You can look at announcements from many of the major software vendors and cloud providers, along with their various hires, as indicators of where we're going to end up. So it's not surprising to see both Amazon and Microsoft investing in datacenter-scale operating systems, and using them as a tool to move developers from working with virtual infrastructure to orchestrated collections of containers alongside serverless compute instances.

It's an approach that makes a lot of sense; it means that cloud data centres run at utilisations that were unimaginable a few years ago, wasting as little power as possible. You only have to look at the fifth-generation systems being built today: where in the past we would have built clusters at a machine or perhaps a rack level, these are designed at the level of an entire data centre.

Put it all together and you're looking at a computing world that hides the computers. Yes, we'll have massive amounts of computational power in our pockets and on our desks, but it will only be a fraction of the available power. Instead we'll hand things over to ephemeral serverless compute running on those five huge global clouds, fulfilling the old IT dream of using only the compute and storage we need, and only when we need it.
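To make "only when we need it" concrete, a serverless function is little more than a handler the platform invokes when an event arrives, billed only for the time it runs. The sketch below loosely follows the Python convention used by AWS Lambda-style platforms; the event fields and greeting logic are hypothetical placeholders:

```python
# A minimal serverless-style handler: there is no server process to manage or
# keep warm. The platform calls handler() only when an event arrives - an HTTP
# request, a queue message, a file landing in object storage, a timer - and
# charges only for the time the call takes.

def handler(event, context):
    # 'event' carries whatever triggered the invocation; the fields used here
    # are hypothetical.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": f"hello, {name}",
    }
```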

That's why the idea of digital transformation is so pervasive: we're in the process of restructuring the infrastructure on which we've built our applications and our businesses, and that means changing the way we build code in order to take advantage of the new capabilities we're being given. Businesses will need to rethink how they use technology to make the most of those capabilities, and users will expect to see the benefits of these changes.

Microsoft CEO Satya Nadella began his tenure by talking about "ubiquitous computing and ambient intelligence", a complex concept that wraps much of this model into five words.

Computing everywhere and in everything requires a very different way of thinking, and one that needs to accept the assistance of the machine learning technologies we're building in the hyperscale cloud. That also means thinking in new ways about how we can use these technologies in our businesses.

It's a big step from a traditional application architecture to one that can be used to deliver the benefits of a digital transformation. But it's a step we need to take, if only because our users are already starting to live in that world. Now we need to deliver the future they're expecting.
