
Progress without parallel

The success of commodity computing threatens to disguise problems ahead
Written by Leader, Contributor

In 1981, Seymour Cray was hard at work on the Cray 2 supercomputer architecture. Larded with the most exotic hardware imaginable, it would be capable of a billion floating-point operations a second — the legendary gigaflops. Also in 1981, IBM launched the PC. With its very expensive optional maths co-processor chip, it could just about manage 30 kiloflops — some 33,000 times slower, at a cost of around $2,000 (£1,000), roughly a ten-thousandth of the Cray's price. It would have been rank insanity to compare the two.

Scratch the surface of the latest Cray supercomputer — planned to be a million times faster than its antecedent and cheaper in real terms — and you'll find a few thousand bog-standard PC processor chips, each carrying the genetic code of its long-distant ancestor. As Cray's uncertain fortunes from that day to this have shown, great engineering is important, but great economics is essential. Commodity silicon can effortlessly absorb all the best ideas and churn them out, cheap as chips. It's now rank insanity to build a high-performance computer any other way.

This will not last. Supercomputer designers are already talking about an approaching train wreck in high-performance computing. Very few sorts of problem can be solved by breaking them down into tens or hundreds of thousands of parallel processes. They are important and useful problems, to be sure, and we need to solve them, but they are very far from everything we need to do.

Back in the daily grind of mundane enterprise IT, we too are moving into an era of parallel processing, with dual and quad-core designs signposting the way. We are not moving into a parallel era of software capable of making general-purpose use of such hardware, nor even a consensus that it's the right way to go.

At the highest levels of Intel's strategic thinking, there is still a huge debate about whether the future is best fought with an army of identical cores or a complex mixture of specialised components. With either approach, new ways of thinking in software are desperately needed if we are to avoid an evolutionary dead-end — but nobody yet knows which approach, if either, will keep progress alive.

So enjoy the golden age of PC technology. We have astonishingly powerful computers at stupendously low cost: there is no comparison with the hardware of 25 years ago that does not inspire awe, and that's as true for supercomputers as for desktops. But be prepared for turbulence ahead — and to revise your definition of rank insanity one more time.
