The low-power struggle: Intel 'Centerton' Atom vs ARM Cortex A9
Summary: Centerton is the key technology behind Intel's effort to bring its x86 architecture chips into low-power microservers, in an attempt to nip ARM's server ambitions in the bud before it establishes market share.
If you're Goliath, what do you do about David? This question has likely preyed on Intel's mind over the past few years as it has watched UK chip designer ARM gain ever-greater success in the processor market.

One way for a tech titan to deal with a pesky upstart is to spread FUD (Fear, Uncertainty and Doubt) around the challenger's technology. Another approach is for the incumbent to invest some of its ample resources in dominating the fringe markets into which the smaller company would like to expand, shutting off its sources of future revenue.
Intel (2011 revenues: $54bn) gave details on Tuesday of its latest low-power 32nm Atom processors, the 'Centerton' S1200 series. These chips will go up against ARM (2011 revenues: $785m) in dense, low-power 'microservers' in modern datacentres.
Closing power gap with ARM
The 64-bit S1200 series ranges in power consumption from a thrifty 6.1W up to 8.5W, runs at clock speeds of 1.6GHz to 2GHz, and offers PCIe 2.0, along with support for enterprise features such as ECC memory and virtualised workloads.
On paper, these chips come within a hair's breadth of current ARM Cortex A9 datacentre chips, such as those used by Calxeda in its servers. Calxeda's server nodes have a TDP (thermal design power) of 5W.
Calxeda's figures incorporate the power footprint of memory and the networking interface card (NIC), along with the processor, so the real power consumption of the processor itself is a tad lower. Intel, meanwhile, is reporting TDP for just its CPUs — no NIC and RAM included.
This illustrates how keen the company is to put out a headline TDP figure that looks competitive with ARM.
Intel was not able to provide directly comparable figures showing the TDP of the S1200s with networking and memory factored in.
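To see why the two headline figures are not directly comparable, here is a minimal back-of-the-envelope sketch in Python. The per-component wattages assumed for memory and the NIC are purely illustrative placeholders, not figures published by Intel or Calxeda.

```python
# Rough, illustrative comparison of headline TDP figures when one vendor
# counts the CPU alone and the other counts CPU + RAM + NIC.
# The RAM and NIC allowances below are assumptions for the sake of the
# arithmetic only.

centerton_cpu_only_w = 6.1   # Intel S1200, low end: CPU-only TDP (per Intel)
calxeda_node_w = 5.0         # Calxeda node TDP, including RAM and NIC

assumed_ram_w = 1.5          # hypothetical memory power budget
assumed_nic_w = 1.0          # hypothetical NIC power budget

centerton_node_estimate_w = centerton_cpu_only_w + assumed_ram_w + assumed_nic_w

print(f"Calxeda node (CPU + RAM + NIC): {calxeda_node_w:.1f} W")
print(f"Centerton node, rough estimate: {centerton_node_estimate_w:.1f} W")
```

On these assumptions the gap between the two platforms widens once the whole node is counted, which is exactly why a like-for-like figure from Intel would be useful.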
First low-power 64-bit chips to market
However, Intel will be first to market with 64-bit-capable low-power chips, as ARM's 64-bit designs are not expected to arrive until mid-2013: Applied Micro should deliver the first 64-bit ARM chip then, while Calxeda expects to get 64-bit ARM servers out in early 2014.
Plenty of Intel OEMs announced plans for Centerton servers on Tuesday. A full list was not available at the time of writing, but promotional Intel materials seen by ZDNet included companies such as Quanta, HP, Huawei, Dell, Wiwynn and Supermicro among the 20 'design wins' the new Atom chips have racked up.
In 2011 Intel predicted microservers could make up as much as 10 percent of the server market, and the company is sticking by this number.
The leading proponent of microserver technology to date has been SeaMicro, a start-up server maker that received heavy promotional support from Intel until it was bought by x86 rival AMD. Other companies have dipped their toe in the microserver waters, including HP.
But what does this mean for MY datacentre?
So, if you are contemplating a microserver based around many low-power processors, should you go with ARM or Intel?
Even though Intel's chips consume a bit more power than ARM's, they have the advantage of being based on the well-supported x86 architecture.
This means that enterprise applications will work as expected and no code porting will be required.
On the other hand, if companies have built much of their software infrastructure around the LAMP (Linux, Apache HTTP Server, MySQL, PHP) stack, then little code porting will be required for their applications, which could run well on ARM-based microservers.
One benefit of using ARM chips is that if you have a massively distributed software system, such as one built on the Hadoop Distributed File System and the MapReduce processing framework, then the nature of the platform means it could cost less to run it on many low-power ARM chips than on a few power-hungry Xeon processors.
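As a rough illustration of that argument, the sketch below compares total power draw for an equal amount of parallel work on the two kinds of cluster. Every throughput and wattage number in it is a hypothetical assumption chosen only to show the shape of the calculation, not a benchmark result.

```python
# Back-of-the-envelope comparison of total power draw for an embarrassingly
# parallel workload (e.g. Hadoop/MapReduce) on many low-power nodes versus
# fewer big cores. All figures are illustrative assumptions.
import math

workload_units = 1200            # arbitrary amount of parallel work

arm_node_w = 5.0                 # Calxeda-style node TDP (CPU + RAM + NIC)
arm_node_throughput = 10         # hypothetical work units per ARM node

xeon_node_w = 95.0               # typical Xeon CPU TDP, CPU only
xeon_node_throughput = 120       # hypothetical work units per Xeon node

arm_nodes = math.ceil(workload_units / arm_node_throughput)
xeon_nodes = math.ceil(workload_units / xeon_node_throughput)

print(f"ARM cluster:  {arm_nodes} nodes x {arm_node_w} W = {arm_nodes * arm_node_w:.0f} W")
print(f"Xeon cluster: {xeon_nodes} nodes x {xeon_node_w} W = {xeon_nodes * xeon_node_w:.0f} W")
```

Whether the many-small-nodes column actually wins depends entirely on the per-node throughput you assume, which is why benchmarks on real workloads matter more than TDP figures alone.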
However, Intel has been in the datacentre business for decades, whereas ARM licensees are only just beginning to make a mark. Therefore it's likely that many enterprises will opt for Intel microservers first due to software support and vendor confidence.
Next year Intel will bring out another Atom chip — codename 'Avoton' — on its 22nm 'tri-gate' chip fabrication process, which should help cut power consumption further, and in 2014 it hopes to make a 14nm chip.
ARM licensees, meanwhile, will probably use existing 32nm processes from chip fabrication specialists GlobalFoundries and TSMC next year, until those companies' next-generation processes mature. For this reason, although ARM may have a power edge now, its advantage could fade over the next few years.

Talkback
So essentially Intel has a six-month lead on ARM for servers
Availability in quantity
Even then it isn't clear how the A50 (64-bit) series will stack up against this Atom in performance.
Nor is it clear if the A50 at 2GHz will continue to be 5W.
The Haswell line (Avoton) is suspected to consume roughly half the power of Ivy Bridge with a 10 percent increase in performance. This may mean that a Haswell Atom might come in with a TDP of 4 watts or less, lower than ARM's.
"Edisonville" not Haswell. More on next release
22nm really allows them to pack a lot in.
There would be no advantage to running Hadoop on ARM over these low power
Can you imagine
That would kill the WP8 and WinRT devices, since the Intel
Perhaps the ARM people awakened a sleeping giant, and it ain't going to be pretty for ARM in the coming years.
Killing Win RT devices...
It would be really nice if the upcoming Edisonville/Avoton Atom came in faster, more efficient, more capable, AND cheaper. THAT might kill off RT. Personally, the ugly user interface killed off Windows 8 for me, either way.
You might be right to an extent, but the idea is to put out an x86
And then the x86 processor will still be a lot more powerful than a "comparable" ARM processor, which, as I stated above, would make the ARM processor redundant, unnecessary and uncompetitive.
Except
Expensive is a relative word, and you have to consider the value that
If one Intel chip can perform the work of multiple ARM chips, at prices which aren't that much higher than the ARM chips, then it's a no-brainer that the Intel chips would end up being a lot less expensive to operate and a lot more productive.
Price isn't the whole equation, and neither is power consumption.
Nobody Cares About x86
That's right, don't forget that there are not one, but two architectures handily outshipping x86. To concentrate on just knocking off one will leave you open to being blindsided by the other.
ARM had its few years, but Intel is going for the kill,
Plus, the big trump card for the x86 processor is that it will forever outperform the ARM chip. Whatever advantage ARM has will become moot against a processor which has many times the capability.
Competition is good.
That is only good for us, consumers.
I still remember selling Dell/NEC 486/33MHz servers with 32MB of RAM and 2x80MB SCSI HDDs in 1993; then the Pentium Pro/128MHz came out...
Today, I'm carrying an unbelievably more powerful smartphone (1.4GHz ARM, 512MB RAM and 16GB flash storage) just to receive email, take photos...
It's almost ridiculous...
ARM still has a theoretical advantage
Intel ignored the low-power market, chasing ever-bigger clock speeds and then ever more CPU cores, even when these had become irrelevant to normal users. The early Atoms were (and are) anaemic even compared to ARM. Intel has realised that it has come to the battle late and unprepared, and is desperately playing catch-up. The "numbers" you quote are not real, just engineering estimates, and as you also acknowledge, they do not compare the same components.
For me the jury is still out as to whether Intel can get back in the game.
Intel is already in the game, and threatening to eat ARM's lunch and
Intel chips are more powerful, and getting more energy-efficient with each new processor design, and in a few months' time they will release processors which will be "comparable" in power consumption to ARM's. But the Intel chips will still remain a lot more powerful than ARM chips, which means ARM will be at a disadvantage when it comes to productivity and multiple-application performance. You can have an Intel chip that is several times as productive as one ARM chip, at prices comparable to ARM chips and with roughly the same power consumption. When that kind of performance, power consumption and size finally makes the Intel chips "competitive", then ARM will be on its way out.
But ARM could still go for the advantage by being first with a "128-bit" processor, if they can do it. They will need some kind of gimmick to remain relevant. ;)
Okay
ARM can reduce its power consumption too, but it will still be outperformed
When it takes a few ARM chips to do the work of one "Centerton", then it's game over!
You are right, ARM won't just sit there, but...
Actually performance is not that far off
Another problem that Intel has is that its Atom chips don't interconnect as well, so right now multi-processor and multi-core setups don't scale as well with Atom as with ARM.
Also, performance is not that important in lightweight, highly parallel applications (like web servers), and it is easier to create servers with large numbers of processors with ARM than with Intel. AMD also has an advantage in the interconnect, which is why the Opteron is still very popular in high-performance computing; it still has an interconnect edge there.
I am not counting Intel out, but I recognise that Intel is being threatened, and it will particularly get squeezed as physics and Moore's Law begin to conflict. Intel could lose its manufacturing advantage. Still, I am sure they will adapt.
Nobody can sit still for long.