Why the PC of 2019 won't be that different from the PC of today (or the PC of yesteryear)

As Confucius said many moons ago, 'study the past if you would define the future.' So, to figure out what computing might be like in ten years or so, look at what computing was like ten or so years ago and at the changes we've seen since.
Written by Adrian Kingsley-Hughes, Senior Contributing Editor

Over on Tech Broiler, my blogging buddies Jason Perlow and Scott Raymond have been spending time gazing into their crystal balls and have come up with a vision of what personal computing might be like in 2019. It's a great read, and if Perlow and Raymond's vision holds true, we've got some great stuff to look forward to.

But I'm a little more of a realist. While things will change dramatically over the course of ten years or so, I think Perlow and Raymond's vision of a 'Blade Runner' system is a lot further away than 2019, 2021 or even 2025.

Why? Well, as Confucius said many moons ago, 'study the past if you would define the future.' So, to figure out what computing might be like in ten years or so, look at what computing was like ten or so years ago and at the changes we've seen since.

So, what was the PC of 2001 like? Well, not a lot different really. Sure, the PC of today is faster, has more storage, better graphics and so on, but the concept of a laptop or desktop PC (or the tablet, for that matter) hasn't changed that much. Even 'fringe' devices such as Apple's iMac or the Mac mini are still recognizable as personal computers: on/off button, slot-loading optical drive, screen, keyboard and mouse. No hand-waving UI à la Minority Report, no neural plugs, no mind control.

Even if you go back a couple of decades and look at systems running DOS in the early 90s, they're not all that different from a PC today. MHz have been replaced by GHz, MBs by GBs and TBs, and flatscreens have replaced the hefty CRTs, but the concept hasn't changed much at all. We're still using air cooling, hard drives with moving parts, optical drives, x86 architecture, keyboards and mice ...

You might argue that things have moved on more outside of the personal computer, and I'd likely agree with you. The smartphone is one example. While you could take a PC from 2011 back to 1992 and people would no doubt be impressed, if you took something like an iPhone back to 1992 you'd likely be burned at the stake for witchcraft. The smartphone represents serious miniaturization (of both the hardware and the software) that resulted in something quite different. Groundbreaking stuff does come along, just not all that often.

Here are my predictions about what technology in 2019 will be like:

  • Smaller, faster, more storage - Goes without saying!
  • A shift away from the desktop towards smaller, more portable devices.
  • While I like the idea of modular systems that you can plug this CPU and that GPU into, I think systems will move towards single-board designs, much like tablets, smartphones and iMacs/Mac minis. We're already at the stage where power and performance specs don't matter much to the average buyer, and storage will be next.
  • As much as I'd like to see the aging x86 architecture taken out back and blown away with both barrels, there's too much market inertia for that to happen before 2019. Look at how attached we are to 16-/32-bit and similar legacy. Computing accumulates legacy, and it's too disruptive (not to mention expensive) to get rid of.
  • Virtualized OSes becoming the standard ... maybe ... it needs to happen, but remember we're only looking at three or four Windows OS releases into the future ... is that enough time for such a dramatic shift?

The techie geek side of me loves the vision of the 'Blade Runner' system portrayed by Perlow and Raymond, but the realist side of me recognizes that the pace of change is quite slow on the whole. Sure, if your metric is performance or capacity, then we've come a long way in ten years, but big revolutionary leaps don't come about often enough for us to be at the stage of having a 'Blade Runner' system by 2019.

What do you think?
