I’ve been thinking a lot about the whole "post-PC" phenomenon, and have come to the conclusion that there’s no such thing.
We’ve always had a continuum of devices, a similarly wide range of use cases, and a continuing evolution of form factors and device types driven by changes in the underlying silicon technologies. The PC is just a stage on the road to something fundamentally more interesting and more powerful. It’s a journey that started with the punched card, and is now moving on with the tablet and the smartphone.
What’s more important than what we have in our hands and on our desks is what we do with those devices, and how we handle workloads. It turns out that it’s actually a good idea to centralise processor-intensive applications and provide them with thin user interfaces – but then again, that’s something we’ve known since the days of the IBM mainframe and CICS. Client-server, n-tier, and the web are just alternative approaches to delivering the same functionality. The cloud, especially platform-as-a-service systems like Salesforce.com’s Heroku and Microsoft’s Azure, is just another step on the same road.
And what’s the destination?
Where we’re heading is a world where silicon is everywhere, giving us a massive distributed computing fabric that will stretch from almost invisible sensor motes and machine-to-machine communications networks, to smartphones and tablets, to laptops and desktops, and to servers and the cloud and beyond. It’s a network that will link personal monitoring devices to televisions and game consoles to electricity meters, bringing together all the components of a digital world and putting them into the background. Yes, we’ll still have devices with keyboards, touchscreens and the like – it’s just that the computing fabric they’re part of is both everywhere and nowhere, vanished into the background.
It’s the future that’s usually thought of as ubiquitous computing, or "ubicomp".
Let me fire up the ZDNet UK time machine to take us back to Silicon Valley in the early 1990s. You might want to pick up a copy of the influential 1991 Scientific American special issue on Communications, Computers and Networks for some reading along the way…
It was in that issue that I first read a work by one of my computing heroes: Mark Weiser’s seminal essay The Computer For The 21st Century. Weiser was the face of Xerox PARC’s ubicomp research, trying to understand how this trend would affect how we interact with a vastly expanded computing infrastructure. It was that work that came up with three distinct interaction models: the tab, the pad and the wall.
Weiser’s vision was a simple one. Tabs would be everywhere: devices that extended computing to simple clusters of sensors and basic interfaces, linking the physical and the virtual. Meanwhile, we’d do most of our work on pads – simple, almost disposable screens with little computing power of their own that just tapped into the machines in the background. Larger-scale interactions would take place with walls: display surfaces that let users interact using everything from gestures to pens, collaborating and sharing information. People would carry on working the way they always have, just mediating it through additional layers of almost invisible technology.
(I always felt that Apple had missed a trick when it didn’t rename the iPhone the iTab as it launched the iPad – or that it didn’t call the Apple TV the iWall. And isn’t it all rather reminiscent of Microsoft’s "three screens + cloud" vision?)
Is the ubiquitous computing world post-PC? You could call it that, but the tools and technologies we’ve used all along are still there, waiting for us to call them up when we need them. It’s like calling today’s machines post-mainframe. The mainframes are still there; we just interact with them differently. That’s why we’re seeing changes in the way we interact with our PCs. Apple’s Lion introduces more natural gestures for the world of the pad, while Microsoft’s Kinect starts to open up a wider world of natural user interfaces that seem tailor-made for walls. And for the smaller tab-like devices, there are already more M2M device communication addresses allocated than there are human beings.
Everything is changing – just not in the way we might expect. In tomorrow’s ubiquitous computing world, what we’ve thought of as computers will fade away, disappearing into that background and ending up everywhere.
It’s not so much post-PC, as post-computer. Ubicomp. You're going to hear a lot more about it soon.