The quad-core processor era was ushered in with the release of quad-core processors for enthusiasts (the Intel Core 2 Extreme QX6700) and for servers and workstations (the Intel Xeon 5355) last week. Most major manufacturers plan to use Intel's chips in their latest products. Dell announced its new quad-core systems this month, including new servers and workstations. IBM has also made noises about its own quad-core offerings, as has HP with its own servers and workstations featuring the new Xeon chips.
Why is Intel making processors with four cores? We've only had dual core for a few months.
Neither Intel nor anyone else can keep making single-core processors go faster, because we have hit physical limits. Quite small increases in clock speed now demand huge increases in power, and thus heat. Keeping the clock speed down lets the processor run much cooler but not much slower, and putting multiple slower cores in the same package multiplies the amount of data you can handle without overheating. Quad core is no technical breakthrough (the dual cores were that) but rather Intel using its new cooler-running technology to get one up on AMD. Eight cores and more will follow, because there's nowhere else to go.
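The arithmetic behind that trade-off can be sketched briefly. Dynamic power in a CPU scales roughly with voltage squared times frequency, and voltage must rise roughly in step with frequency, so power grows far faster than clock speed. The figures below are illustrative assumptions, not measured chip data:

```python
# Illustrative sketch (not real chip figures): dynamic CPU power scales
# roughly as voltage^2 * frequency, and voltage must rise roughly in
# step with frequency, so power grows roughly with frequency cubed.

def relative_power(freq_scale: float) -> float:
    """Relative dynamic power when the clock scales by freq_scale,
    assuming voltage scales linearly with frequency."""
    voltage_scale = freq_scale
    return voltage_scale ** 2 * freq_scale

# One core pushed 20% faster: ~73% more power for 20% more throughput.
one_fast = relative_power(1.2)

# Two cores each clocked 20% slower: only ~2% more power, yet up to
# 60% more aggregate throughput on work that can be split in two.
two_slow = 2 * relative_power(0.8)

print(f"1 core at 1.2x clock:  {one_fast:.2f}x power")
print(f"2 cores at 0.8x clock: {two_slow:.2f}x power")
```

Under these assumptions, two slower cores deliver more work per watt than one faster one, which is exactly the bet the chip makers are placing.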
Are Intel's new quad-core processors a brand new design?
No. The Intel Core 2 Extreme QX6700 is two 2.66GHz Core 2 Duo E6700 dies mounted on a common substrate and wired to share and communicate over the same 1,066MHz front-side bus. This means the package generates twice as much heat as the Core 2 Duo (around 130 watts compared to 60-70 watts), and the two 4MB caches do not co-operate on any of the smart cache features. Much the same is true for the two Xeon 5100 dies in the Quad-Core Intel Xeon 5300, although this is available in a range of speeds and power consumptions, from 2.66GHz at 120 watts down to 1.6GHz at 80 watts. Further variations will be forthcoming.
What performance increase should I expect?
Intel advertises a speed increase of up to 50 percent over dual-core processors, and for some applications and configurations this is achievable. Most individual applications will see no increase, or even a small decrease, compared with similarly priced processors that have fewer cores but a higher clock frequency. Some benchmarks and highly parallel applications may show markedly better results, with particular benefits for graphics processing and scientific data analysis.
What current applications will benefit most?
If the underlying operating system can cope with quad cores (modern Windows, Linux and other Unix-derived OSs can), then you should see performance hold up better than before as you run more applications. At the time of launch, five games and around 14 multimedia applications were either available or in development with explicit quad-core support, including Adobe Premiere Pro 2/3.0, POV-Ray 3.7 Beta and Cubase v4.5.
Wouldn't it be better to have four cores on a single chip?
AMD thinks so, and that's what it's planning for its quad-core K8L architecture due next year. On the plus side, tightly integrating all four cores allows much better thermal management, co-operation on cache usage to maximise speed and minimise power consumption, and much faster core-to-core communication. On the minus side, a manufacturing fault on one core of a four-core die ruins all four cores, but only two cores on a chip made from two two-core dies. The same flexibility favours the two-by-two option when matching cores for speed, and it helps time to market by reusing existing, tested designs, which is why Intel has a quad core on the market and AMD does not. In general, working silicon is more profitable than PowerPoint presentations.
What happens next?
Both Intel and AMD agree that future processors will have more cores. They also agree that to make good use of these cores, more software has to be written specifically to use multiple execution threads, which is happening, but slowly. Another potential advantage lies in virtual systems, where each virtual machine can use its own core, but again the benefits of this approach depend on the speed at which virtualisation is adopted.
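The restructuring that multicore asks of software can be sketched in a few lines: the work must be split into independent chunks that the operating system can schedule on separate cores, then combined. This hypothetical example uses Python's standard `concurrent.futures` module (for CPU-bound Python code a `ProcessPoolExecutor` would be used to sidestep the interpreter lock; threads keep the sketch self-contained):

```python
# Sketch: splitting one computation into independent chunks so the
# OS can schedule them across multiple cores.
from concurrent.futures import ThreadPoolExecutor

def sum_of_squares(start: int, stop: int) -> int:
    """One independent chunk of work."""
    return sum(n * n for n in range(start, stop))

def parallel_sum_of_squares(limit: int, workers: int = 4) -> int:
    """Split [0, limit) into one chunk per worker, then combine."""
    step = limit // workers
    bounds = [(i * step, (i + 1) * step if i < workers - 1 else limit)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(lambda b: sum_of_squares(*b), bounds))

# Same answer as the serial sum(n*n for n in range(1000)), but the
# chunks can run on different cores.
print(parallel_sum_of_squares(1000))
```

The point is that the decomposition has to be designed in: a program written as one long serial loop gains nothing from extra cores, however many are available.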
The chip makers hope that, with so much potential speed available to software able to take advantage of it, commercial pressure will drive developers to make good use of multicore before users decide there is little advantage in upgrading further.