
Dual core - a lesson from history?

We all know that dual-core computing has been a smashing success, because Intel and friends delight in telling us so. Indeed, lots and lots of dual-core processors have been sold.
Written by Rupert Goodwins, Contributor

However, I doubt many people sat down and decided that a dual-core chip best met their needs. They just bought a new computer. The laptop I'm writing this on is dual-core, but apart from having two CPU usage trails in the system meter you'd be hard pressed to tell.

Thing is, for most of the history of desktop computing, the performance of the processor was a limiting factor. Go back far enough, to the time of the 8-bit processors, and there were plenty of times when there was a huge price differential between, say, a 4MHz Z80A and the next serious step up, like a 16MHz 68000. It made a lot of sense to put two cheap processors in a single design - the equivalent of two cores - instead of taking the hit of the bigger chip.

Almost to a man, those designs failed to thrive. There were multi-processor systems that had some success, but they tended to be specialist or multi-user boxes (I have fond memories of an HM Systems Minstrel that ran development stuff for the curious crowd of programmers at Amstrad. All S100 slave processors, TurboDOS and PCW8256 terminals. What larks). Even the designs that had two different processors to run a choice of OS were hard-pushed to show an advantage - DEC Rainbow, anyone? No, didn't think so.

I don't think anyone had ever bothered to make a triple- or quad-processor general-purpose PC, until now.

The reason was, of course, that the software to actually use such configurations efficiently didn't exist, and there was no way to confer any advantage on programs that only expected to see a single processor. On the desktop, that hasn't changed very much between then and now. (The examples that get quoted to me most often, transcoding and other media manipulation, are more susceptible to greater intelligence in content origination and display than to offloading translation to a point in the distribution chain. It's certainly far more economical that way, and basic economics do have a tendency to make themselves felt.)
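To make that concrete, here's a minimal sketch in Python (the prime-counting chore, chunk sizes and worker count are arbitrary stand-ins, not anything from the column): a CPU-bound job run in a single process occupies one core however many are fitted, and only an explicit rewrite to farm the work out lets the other cores contribute.

```python
# Minimal sketch: extra cores only help once the software is written to use them.
import time
from multiprocessing import Pool

def count_primes(limit):
    """Deliberately naive CPU-bound work: count the primes below `limit`."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    chunks = [50_000] * 4  # four equal lumps of work

    start = time.perf_counter()
    single = sum(count_primes(c) for c in chunks)  # one process: one core, whatever the chip offers
    print(f"single process: {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with Pool(processes=4) as pool:                # only this rewrite lets four cores join in
        parallel = sum(pool.map(count_primes, chunks))
    print(f"four processes: {time.perf_counter() - start:.2f}s")

    assert single == parallel  # same answer, different use of the hardware
```

On a quad-core machine with enough work per chunk, the second run should finish in something like a quarter of the time; on a single-core machine it buys nothing, which is rather the point.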

Which means, I think, that the whole dual-triple-quad-many core story will fade away, and become as marginal an issue as how many integer arithmetic units there are in any particular PC. (I think there are around ten in my laptop, but I don't know enough about the graphics chip architecture to say for sure). We'll have to evolve a new way to talk about the goodness of processors: I suspect it'll be something to do with simultaneous thread capacity, but the market's open for ideas around data or instruction bandwidth.
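For what it's worth, a crude version of that thread-capacity number is already there for the asking; a minimal sketch, again in Python, with "simultaneous thread capacity" being my own gloss rather than an established metric:

```python
# Minimal sketch: ask the OS how many hardware threads it can schedule at once
# (logical processors, so SMT/Hyper-Threading siblings are counted too).
import os

print(f"simultaneous hardware threads: {os.cpu_count()}")
```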

Whatever the measure, it has to relate to actual advantages for the user. Core count, like CPU count, ain't it.
