Like the medieval Vatican chasing a holy relic, Intel has sent out a declaration to all the lands: "Deliver unto us the original magazine containing Gordon Moore's famous law, and we shall shower you with gold". Unlike Intel, the Holy Church didn't use eBay, but that appears to be today's preferred channel for items of religious revelation.
This is more appropriate than at first appears. Moore's Law, which celebrates its 40th birthday next week, is not the scientific axiom so many people believe it to be. Like a strange pattern on a toasted cheese sandwich, one may read into it what one wishes; like many a favourite scriptural passage, it is often misquoted. You may be forgiven for thinking it predicts a doubling of computing power every eighteen months — or two years, or whatever — but Gordon Moore described nothing of the sort.
Moore said that every year, the number of components on the most cost-effective chip would double. A decade later, he amended that to every two years. He said nothing about the maximum possible number of components, nor about how those components related to the amorphous concept of 'power'. Most importantly, he didn't question the underlying assumption that more equals better.
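The arithmetic behind that observation is simple compounding, and a minimal sketch makes the scale plain. The starting count of 64 components and the 1965 base year below are illustrative assumptions, not historical data; only the two-year doubling period comes from Moore's amended formulation:

```python
def components(year, base_year=1965, base_count=64, doubling_years=2):
    """Project a component count under a doubling every `doubling_years`.

    base_count and base_year are hypothetical values chosen for
    illustration; this models a trend, not a law of nature.
    """
    return base_count * 2 ** ((year - base_year) // doubling_years)

# Forty years of two-year doublings multiplies the count by 2**20,
# i.e. roughly a million-fold.
for year in (1965, 1975, 1985, 2005):
    print(year, components(year))
```

The point the numbers make is the one the column makes: the exponent does all the work, and nothing in the formula says anything about what those components are for.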
Throughout its life, Moore's Law has been rejigged, mostly to justify retrospectively what actually happened. It has been used as a compelling yet inaccurate attempt to grant the semiconductor industry a divine right of prosperity through physics. That ignores a more fundamental and genuine law: jigsaws aside, people don't buy things on the basis of the number of pieces inside. They buy on the benefit.
We are leaving the era when more transistors made a better computer, at least for most of us. We want — we need — smarter computers, but that is down to the software bods, who live in a mirror world. The old saying has it that Gordon Moore giveth and Bill Gates taketh away: ever more capable computers have led to software that is slower, bulkier and buggier than ever before.
Gordon Moore's formulation was visionary, valuable and surprisingly accurate. It was not and is not a law of nature, and pretending otherwise is a dangerous way to run a chip industry. It is also increasingly irrelevant. What we need now is an equivalent guiding insight for the software and systems we have built on Moore's foundation. That's always assuming there is such a thing: otherwise, in our increasingly complex and hard-to-manage IT world, we might as well pray to a sandwich.