Gordon Moore: How I came up with the Law

On the fortieth anniversary of Moore's Law, the author dialled in from Hawaii to tell us exactly why it came about - and identify the person responsible for the popular misconception that it refers to computing power
Written by Matt Loney, Contributor

Intel co-founder and former chief executive Gordon Moore, who became famous for the eponymous law that describes how the number of transistors on a processor will increase, took time out on Wednesday to explain how he came up with his famous law, and what it means for the future:

"This week is the fortieth anniversary of the publication of the article that became the basis for what became known as Moore's Law. It is an article I wrote in response to request to predict what would happen in the next ten years, from 1965 to 1975.

1965 was very early days for integrated circuits. They were mostly used in military applications where there were no cost concerns. The principal themes I wanted to get across [in the article] were that integrated circuits were the route to inexpensive circuits, and this would happen because the systems would get much more complex. At the time I wrote the article, integrated circuits had 30 components in them, and we had one in the lab that had 60 components.

I looked back and saw that we had doubled the number every year. I took this and extrapolated it for ten years to say the number of components would go from 60 to 60,000 on a chip. I frankly didn't expect it to be at all precise. But in fact it turned out to be much more precise than it had any good reason to be, and a colleague dubbed it Moore's Law.

So in 1975 I looked back at why this was happening, and I saw three reasons: first, there was higher density on the chips; second, we were making bigger chips as processing improved; and third, we were squeezing the waste space out of chips.

We had squeezed all the waste space out by 1975, so I said we could lose that factor, and that the number of components on a chip would double every couple of years instead of every year.

Amazingly we have stayed on that trend for 40 years, and we have got to the point where participants realise they have to move along at at least that rate. If you fall behind a generation, your costs suffer. It is a self-fulfilling prophecy that the industry recognises it has to go that fast. I have been amazed that we've been able to keep it going so long. I never expected to predict so far into the future.

The nature of exponentials is that if you extrapolate them far enough you always get a disaster. I can see the next two or three generations of technology proceeding, and expect we have another ten or 20 years before we hit a limit. But by then engineers will have a budget of billions of transistors to do their designs, so there will be plenty of room for innovation to continue."
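Moore's two projections above reduce to simple compound doubling. A minimal sketch of the arithmetic (the 60-component starting point, the annual doubling to 1975 and the later two-year doubling are from his account; the code itself is purely illustrative):

```python
# Moore's 1965 extrapolation: ~60 components per chip, doubling every year.
components_1965 = 60
components_1975 = components_1965 * 2 ** 10  # ten annual doublings
print(components_1975)                       # 61,440 -- close to the 60,000 he predicted

# His 1975 revision: doubling every couple of years instead of every year.
def components(years_elapsed, start=60, doubling_period_years=2):
    """Components per chip after `years_elapsed`, under the revised law."""
    return start * 2 ** (years_elapsed / doubling_period_years)

print(round(components(40)))                 # 40 years of two-year doublings
```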

Q: Moore's Law is often misquoted as referring to computing power doubling every 18 months. Who is to blame for that?
A: A lot of people have applied the term to anything that increases exponentially. I think the thing about power came from an Intel fellow called David House who saw that the complexity of the circuits was doubling every two years but that they were becoming more powerful at the same time, so he figured that computing power was increasing faster. He's the one that deserves credit for that.
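House's reasoning, as Moore describes it, combines two growth rates: transistor counts doubling every two years, plus per-transistor speed gains on top. A hedged sketch of that arithmetic (the 1.26x speed factor is an assumption chosen to show how a combined 18-month doubling could fall out; it is not a figure from the interview):

```python
import math

# Transistor counts double every 24 months (Moore's revised law).
count_doubling_months = 24
# Assumed additional per-transistor speed-up over the same 24 months.
speed_gain = 1.26

# Combined performance growth per 24 months: 2x count * 1.26x speed = 2.52x,
# so the performance doubling time is 24 * ln(2) / ln(2.52) months.
performance_doubling = count_doubling_months * math.log(2) / math.log(2 * speed_gain)
print(round(performance_doubling))  # ~18 months
```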

What are your thoughts on nanotechnology?
I'm a sceptic when it comes to ideas about nanotechnology replacing the integrated circuit. Integrated circuit technology is the result of an accumulated research and development budget of well over $100bn. Nanotechnology is a broad field with many applications, but I am sceptical whether it will replace the more standard silicon technology. There is a huge difference between making one tiny transistor and connecting a billion of them to do something useful.

What do you think will happen in the next 40 years?
I think the technologies that will develop in that period will be mind-boggling. You just have to look back at how things were in the mid-60s and compare with today to get some idea. I don't think anything will be slowing down in that time period.

I re-read my 1965 article a year or so ago and in it I predicted home computers, but had no idea what they would look like. When I was chief executive at Intel I remember one engineer coming along saying we could build a home computer. I said, gee, that's nice, but what would you use it for? All he could think of was housewives putting recipes on it, which I didn't see at the time as reason enough to do it.

What is your take on computer interfaces?
I would like a much simpler interface, though I don't know what it would look like. The capability of computers keeps growing and the number of applications running keeps increasing. The people building the interfaces keep adding to that complexity. It's not for lack of effort, but the software people are losing ground.

Is the race between chip makers to compete on speeds constructive?
I'm sure we have much more capable chips today because of the competition than we would have had if there had only been one company working on them all this time. Competition overall is very effective for making rapid progress.

Intel is now headed for the first time by a man without a PhD. Will that change the company?
No. I have often quipped about Andy Grove when he became a management guru that he had finally gotten over his PhD. Paul Otellini — I've been mispronouncing his name for 30 years I found out recently — has really got into the technology. You do pick up a lot through osmosis.

How close is Intel to device yields of 100 percent?
We have got amazingly close to 100 percent yields. I have a wafer in my cubicle at Intel that is labelled as Intel's first 100 percent wafer. Yields, which were a tremendous problem in the early days, have become fairly tractable now.
