The low-power struggle: Intel 'Centerton' Atom vs ARM Cortex A9

Summary: Centerton is the key technology behind Intel's effort to bring its x86 architecture chips into low-power microservers, in an attempt to nip ARM's server ambitions in the bud before it establishes market share.

TOPICS: Cloud, Intel, Processors, ARM

If you're Goliath, what do you do about David? This question has likely preyed on Intel's mind over the past few years as it has watched UK chip designer ARM gain ever-greater success in the processor market.

Intel has unveiled its Atom S1200 chip. Image: Intel

One way for a tech titan to deal with a pesky start-up is to spread FUD (Fear, Uncertainty and Doubt) around the challenger's technology. Another approach would be for the incumbent to invest some of its ample resources in dominating fringe markets into which the smaller company would like to expand, shutting off its sources of future revenue.

Intel (2011 revenues: $54bn) gave details on Tuesday of its latest low-power 32nm Atom processors, the 'Centerton' S1200 series. These chips will go up against ARM (2011 revenues: $785m) in dense, low-power 'microservers' in modern datacentres.

Closing power gap with ARM

The 64-bit S1200 series ranges in power consumption from a thrifty 6.1W up to 8.5W, has clock rates of 1.6GHz to 2GHz, and offers PCIe 2.0, along with support for enterprise features such as ECC and virtualised workloads.

On paper, these chips come within a hair's breadth of current ARM Cortex A9 datacentre chips, such as those used by Calxeda in its servers. Calxeda's server nodes have a TDP (thermal design power) of 5W.

Calxeda's figures incorporate the power footprint of memory and the networking interface card (NIC), along with the processor, so the real power consumption of the processor itself is a tad lower. Intel, meanwhile, is reporting TDP for just its CPUs — no NIC and RAM included.

This illustrates how keen Intel is to publish a headline TDP figure that looks competitive with ARM's.

Intel was not able to provide directly comparable figures showing the TDP of the S1200s with networking and memory factored in.
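As a rough illustration of why the reporting basis matters, here is the arithmetic with hypothetical memory and NIC figures added to Intel's CPU-only number (the RAM and NIC estimates are invented for illustration, not vendor data):

```python
# Hypothetical sketch: Intel reports CPU-only TDP, while Calxeda's 5W
# figure covers CPU, memory and NIC together. The RAM and NIC estimates
# below are invented for illustration, not vendor data.

CPU_ONLY_TDP_W = 6.1   # S1200 series, low end, CPU only (from the article)
RAM_ESTIMATE_W = 2.0   # hypothetical per-node DRAM power
NIC_ESTIMATE_W = 1.0   # hypothetical NIC power

# Put Intel's figure on the same basis as Calxeda's node-level number
node_tdp_w = CPU_ONLY_TDP_W + RAM_ESTIMATE_W + NIC_ESTIMATE_W
print(f"Estimated Intel node TDP: {node_tdp_w:.1f}W vs Calxeda node: 5.0W")
```

Under assumptions like these, the like-for-like gap would be wider than the headline CPU figures suggest, which is exactly why the reporting basis matters.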

First low-power 64-bit chips to market

However, Intel will be first to market with 64-bit-capable low-power chips: Applied Micro is expected to deliver the first 64-bit ARM chip in mid-2013, while Calxeda expects to get 64-bit ARM servers out in early 2014.

Plenty of Intel OEMs announced plans for Centerton servers on Tuesday. A full list was not available at the time of writing, but promotional Intel materials seen by ZDNet included companies such as Quanta, HP, Huawei, Dell, Wiwynn and Supermicro among the 20 'design wins' the new Atom chips have racked up.

In 2011 Intel predicted microservers could account for up to 10 percent of the server market, and the company is sticking by that number.

The leading proponent of microserver technology to date has been SeaMicro, a start-up server maker that received heavy promotional support from Intel until it was bought by x86 rival AMD. Other companies have dipped their toe in the microserver waters, including HP.

But what does this mean for MY datacentre?

So, if you are contemplating a microserver based around many low-power processors, should you go with ARM or Intel?

Even though Intel's chips consume a bit more power than ARM's, they have the advantage of being based on the well-supported x86 architecture. This means that enterprise applications will work as expected and no code porting will be required.

On the other hand, if companies have built much of their software infrastructure around the LAMP (Linux, Apache HTTP Server, MySQL, PHP) stack, then little code porting will be required for their applications, which could run well on ARM-based microservers.

One benefit of using ARM chips is that if you have a massively distributed software system, such as one built on the Hadoop distributed file system and the MapReduce framework, the nature of the platform means it could cost less to run it on many low-power ARM chips than on a few power-hungry Xeon processors.
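That scale-out cost argument can be sketched with back-of-the-envelope arithmetic; every figure below (node counts, wattages, electricity price) is an assumption made up for illustration, not a benchmark:

```python
# Hypothetical sketch of the scale-out cost argument. All numbers here
# (node counts, wattages, electricity price) are made up for illustration.

KWH_PRICE = 0.10          # assumed $/kWh
HOURS_PER_YEAR = 24 * 365

def yearly_energy_cost(node_watts, node_count):
    """Electricity cost of running a cluster flat-out for a year."""
    kwh = node_watts * node_count * HOURS_PER_YEAR / 1000
    return kwh * KWH_PRICE

# Suppose the same embarrassingly parallel Hadoop job can run on either
# 40 low-power ARM nodes at 5W each or 4 Xeon servers at 95W each.
arm_cost = yearly_energy_cost(5, 40)    # 200W total draw
xeon_cost = yearly_energy_cost(95, 4)   # 380W total draw
print(f"ARM cluster: ${arm_cost:.2f}/yr, Xeon cluster: ${xeon_cost:.2f}/yr")
```

On these made-up numbers the ARM cluster's electricity bill is roughly half the Xeon cluster's, though the real comparison hinges on how well the workload parallelises across many slow cores.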

However, Intel has been in the datacentre business for decades, whereas ARM licensees are only just beginning to make a mark. Therefore it's likely that many enterprises will opt for Intel microservers first due to software support and vendor confidence.

Next year Intel will bring out another Atom chip — codename 'Avoton' — on its 22nm 'tri-gate' chip fabrication process, which should help cut power consumption further, and in 2014 it hopes to make a 14nm chip.

ARM licensees, meanwhile, will probably rely on existing 32nm processes from chip fabrication specialists GlobalFoundries and TSMC next year, until those companies' 22nm methods mature. For this reason, although ARM may have a power edge now, its advantage could fade over the next few years.

Jack Clark

About Jack Clark

Currently a reporter for ZDNet UK, I previously worked as a technology researcher and reporter for a London-based news agency.

Talkback

19 comments
  • So essentially Intel has a six-month lead on ARM for servers

    That's great for the short term, but not so much for the long term. We'll have to see how Avoton turns out and what ARM comes up with for the long term.
    theoilman
    • Availability in quantity

      Even if they hit their target of shipping a 64bit ARM in 2013, it likely won't be available in quantity until 2014.

      Even then it isn't clear how the A50 (64-bit) series will stack up against this Atom in performance.

      Nor is it clear whether the A50 at 2GHz will still be 5W.

      The Haswell line (Avoton) is suspected to consume roughly half the power of Ivy Bridge with a 10% increase in performance. This may mean that a Haswell Atom could come in with a TDP of 4 watts, less than ARM's.
      DevGuy_z
      • "Edisonville" not Haswell. More on next release

        Apparently Avoton will be a significant upgrade, with up to 8 cores, out-of-order execution and on-die SATA support.

        22nm really lets them stuff a lot in.
        DevGuy_z
  • There would be no advantage to running Hadoop on ARM over these low-power

    Intel chips, though. Don't mix and match what's being compared here. And don't forget that in the real world, where datacentres run real workloads, Intel has a huge performance advantage over ARM. You'll be able to handle the same workload on many fewer of these low-power Intel CPU-based servers than on ARM ones, or support a higher number of users/RPS at a given SLA on the same number of servers. By the time Intel gets to 14nm there will be zero reason to put an ARM chip in anything, from a smartphone to a tablet to an ultrabook to a datacentre. Intel will be way ahead on perf/watt everywhere.
    Johnny Vegas
    • Can you imagine

      The phones and tablets in the next year or two. Win8 and WinP8 just in time.
      calfee20
    • That would kill the WP8 and WinRT devices, since the Intel

      chips would make ARM chips unnecessary for the smartphones or tablets or anything else. With no need for ARM, Windows could become all x86, for all devices, including smartphones and tablets and laptops and desktops and game systems and even the servers. And all of them would be a lot more powerful than any "comparable" ARM devices.

      Perhaps the ARM people awakened a sleeping giant, and it ain't going to be pretty for ARM in the coming years.
      adornoe
      • Killing Win RT devices...

        There will still be a significant price difference between the ARM-based RT units and the Intel-based full Windows units. So, I don't see these chips affecting RT device sales. If you primarily want cheap, ARM is still the way to go.

        It would be really nice if the upcoming Edisonville/Avoton Atom came in faster, more efficient, more capable, AND cheaper. THAT might kill off RT. Personally, the ugly user interface killed off Windows 8 for me, either way.
        BillDem
        • You might be right to an extent, but, the idea is to put out an x86

          chip which will be competitive all around against the ARM processor.

          And then, the x86 processor will still be a lot more powerful than a "comparable" ARM processor. Which, as I stated above, would make the ARM processor redundant and unnecessary and uncompetitive.
          adornoe
          • Except

            The intel chips will likely continue to be more expensive.
            Michael Alan Goff
          • Expensive is a relative word, and you have to consider the value that

            an Intel chip brings to a platform, when it can outperform the ARM chips, thereby allowing work to be done a lot faster, and allowing multiple applications to run at the same time, each completing its work a lot faster than on any ARM processor.

            If one Intel chip can perform the work of multiple ARM chips, at prices which aren't that much higher than the ARM chips, then it's a no-brainer that the Intel chips would end up being a lot less expensive to operate and would be a lot more productive.

            Price isn't the total equation, and neither is the amount of power consumption.
            adornoe
  • Nobody Cares About x86

    Nobody needs the legacy architectural baggage that x86 brings to this market. And Intel cannot "dominate fringe markets into which the smaller company would like to expand", because those markets already offer lower profit margins than Intel is used to. While ARM and MIPS, on the other hand, are already quite at home there.

    That's right, don't forget that there are not one, but two architectures handily outshipping x86. To concentrate on just knocking off one will leave you open to being blindsided by the other.
    ldo17
    • ARM had its few years, but, Intel is going for the kill,

      and it will render the ARM processor unnecessary and redundant and uncompetitive.

      Plus, the big trump card for the x86 processor is that it will forever outperform the ARM chip. Whatever advantage ARM has will become moot against a processor with many times the capability.
      adornoe
  • Competition is good.

    We should all be happy that we have competition among microprocessor manufacturers. Intel, then AMD and now ARM are all forcing each other to come out with better and cheaper solutions.
    That is only good for us, the consumers.
    I remember selling Dell/NEC 486/33MHz servers with 32MB of RAM and 2x80MB SCSI HDDs in 1993; then the Pentium Pro/128MHz came out...
    Today, I'm carrying an unbelievably more powerful smartphone (1.4GHz ARM, 512MB RAM and 16GB flash storage) just to receive email, take photos...
    It's almost ridiculous...
    Andrej.G.
  • Arm still has theoretical advantage

    Supporting the CISC x86 instruction set costs silicon, so no matter how well Intel can design and fab, an ARM chip can be made with lower power consumption for comparable performance.

    Intel ignored the low-power market, chasing ever bigger clock speeds and then ever more CPU cores, even when these had become irrelevant to normal users. The early Atoms were (and are) anaemic even compared to ARM. Intel has realised it has come to the battle late and unprepared and is desperately playing catch-up. The "numbers" you quote are not real, just engineering estimates, and as you also acknowledge, they do not compare the same components.

    For me the jury is still out as to whether Intel can get back in the game.
    dimonic
    • Intel is already in the game, and threatening to eat ARM's lunch and

      breakfast and dinner.

      Intel chips are more powerful, and getting more energy efficient with each new processor design, and in a few months' time Intel will release processors which will be "comparable" in power consumption to ARM's. But the Intel chips will still be a lot more powerful than ARM chips, which means that ARM will be at a disadvantage when it comes to productivity and multiple-application performance. You can have one Intel chip being a few times as productive as one ARM chip, with prices comparable to ARM chips and power consumption roughly the same. When that kind of performance, power consumption and size finally makes the Intel chips "competitive", then ARM will be on its way out.

      But, ARM could still go for the advantage by being the first "128 bit" processor, if they can do it. They will need some kind of gimmick to remain relevant. ;)
      adornoe
      • Okay

        So Intel is shooting for lowering their power consumption, and that's all well and great, but you seem to assume that ARM is just going to sit there.
        Michael Alan Goff
        • Arm can reduce its power consumption too, but it will still be outperformed

          by the Intel chip, which will still run circles around the ARM.

            When it takes a few ARM chips to do the work of one "Centerton", then game over!
          adornoe
          • You are right ARM won't sit there but...

            Other than simple fab-process improvements, they have only one way to go (up, in power) as they add complexity and clock speed.
            DevGuy_z
          • Actually performance is not that far off

            Intel has a clock-speed advantage. At the same clock speed the difference isn't great; in fact, some of the new A15s can beat some of the older Atoms in performance.

            Another problem Intel has is that its Atom chips don't interconnect as well, so right now multi-processor and multi-core setups don't scale as well with Atom as with ARM.

            Also, raw performance is not that important in lightweight, highly parallel applications (like web servers), and it is easier to build large numbers of multi-processor servers with ARM than with Intel. AMD also has an interconnect advantage, which is why the Opteron is still very popular in HPC.

            I am not counting Intel out, but I recognise that Intel is being threatened, and it will particularly get crunched as physics and Moore's law begin to conflict. Intel could lose its manufacturing advantage. Still, I am sure they will adapt.

            Nobody can sit still for long.
            DevGuy_z