Intel, ARM, tablets and desktop: Where is processing going next?

Summary: Internal politics and reaching the limits of Moore's Law and Dennard Scaling are hitting Intel hard, but the bigger questions are about computing form factors, computing styles - and physics.


Continuing our look in more detail at the 2013 predictions from analyst Mark Anderson, the question of processors and Intel vs ARM is an interesting proxy for the bigger question of "what do we want a computer to be?"

Despite the successes of three generations of Core processors, Intel has plenty of problems to tackle.

"Mobile chipmakers (led by Qualcomm and ARM) are the new William and Kate" — Mark Anderson

Anderson put it bluntly and raised questions about both the business and the technology of CPUs ('CarryAlong' is his term for tablets, phablets, netbooks and other devices that are portable enough to carry and use all day):

"Intel: Long Live the King, the King is Dead. The chip royalty ladder is flipped, as Intel becomes increasingly irrelevant in the world of general computing, while CarryAlong and mobile chipmakers (led by Qualcomm and ARM) are the new William and Kate. For most observers, Intel in 2013 is a component supplier for servers. The best way out of this cul de sac: a new CEO with tech chops."

Quantum barrier

Intel is unbeatable in the laptop space today, but the combination of the popularity of tablets and the laws of physics at the 19nm scale and below makes you wonder where it will be in five to 10 years' time. I've wondered if Intel started favouring operations people over chip technologists for leadership when it noticed that it was hitting the wall on Moore's Law and had run through Dennard Scaling so much faster than predicted.

(Moore's Law, of course, relates to the number of transistors on a chip and the cost of the fab to put them there, while Dennard Scaling is the observation that power density stays roughly constant as transistors shrink, so each generation is as power-efficient as the last.)
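To make that concrete, here is a back-of-the-envelope sketch in Python of how classic Dennard Scaling kept power density flat; the scaling factor and starting values are idealised illustrations, not Intel figures. Shrink dimensions and voltage by a factor s, clock up by s, pack in s² more transistors, and the watts per unit area stay the same:

```python
# Idealised illustration of classic Dennard Scaling
# (real post-2005 chips no longer scale this way).
def dennard_step(cap, volt, freq, s=1.4):
    """One process shrink by linear factor s (~0.7x dimensions)."""
    cap /= s       # capacitance shrinks with transistor size
    volt /= s      # supply voltage scales down with dimensions
    freq *= s      # shorter gates switch faster
    return cap, volt, freq

def power_density(cap, volt, freq, density):
    # Dynamic power per transistor ~ C * V^2 * f,
    # multiplied by transistors per unit area.
    return cap * volt**2 * freq * density

c, v, f, d = 1.0, 1.0, 1.0, 1.0
before = power_density(c, v, f, d)

c, v, f = dennard_step(c, v, f)
d *= 1.4**2        # a shrink packs s^2 more transistors into the same area
after = power_density(c, v, f, d)

print(before, round(after, 6))   # power density stays constant
```

The breakdown the article describes is exactly that voltage can no longer drop with each shrink (leakage takes over), so the V² term stops falling and power density climbs instead of holding steady.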

The high-k metal gates Intel brought into production at 45nm and the tri-gate transistors that debuted in Ivy Bridge are making a huge difference in today's Intel chips, but it took 10 years to get that technology into production. Moreover, a new material is needed to help deal with the fundamental issue of not having enough electrons to get a clean signal to switch the transistor between one and zero without quantum-level effects weirding out the transistor - and if there is such a material, it's still in the labs.

Intel needs to optimise its current processes to buy enough time to do the next piece of fundamental science; however, as you hit the quantum level, that fundamental science gets a lot harder.

In the meantime I see Intel deliberately continuing to cannibalise itself with Atom, because that's preferable to being cannibalised by ARM - and it at least keeps busy the fabs that were so expensive to build a few years back, when they were state of the art. I also see it still missing out on LTE, which has to hurt its ability to compete in the smartphone market. And if rumours that Haswell chips won't launch until the middle of 2013 are true, Intel would be about six months behind on its usual tick-tock cadence of shrinking a die one year and rolling out a new architecture 12 months later.

Could Intel knock the internal politics on the head while it's at it? I don't understand why Clover Trail isn't yet available in large numbers, but the battle between the Silicon Valley and Israeli chip development teams over direction could be one reason why we don't yet have a Core chip that can support Connected Standby (CS) in Windows 8, even though the power levels should be low enough to make that work.

Connected Standby, Ultrabooks and GPUs

Look at the 17W TDP (thermal design power) of some Ivy Bridge Core models and the promised 15W TDP of Shark Bay Ultrabook processors, then forward to the 8-10W TDP of Haswell, when we might finally get on-package CMOS voltage regulators and support for S0ix active idle - and that's what you need for Connected Standby.

To be fair, it's possible that the reason we don't have Core chips with CS already is that it requires everything to go to a low power fast wake state, not just the CPU - and that's easiest to do when you control all the support components by building an SoC; System on Chip by definition means integration and less variation to deal with. (Qualcomm's huge success has been about not just building a good ARM chip but building it into an integrated platform that can be used to churn out dozens or hundreds of smartphones in a year.)

The Ultrabook programme also gives Intel a way to kick OEMs into line to make them use low-power components (maybe even screens with enough attached RAM to cache what you're looking at on screen when it's not changing fast enough for the GPU to need to be awake), although that's going to face the usual resistance from OEMs who want the choice to carry on with their race to the bottom on $200 tablets and $300 PCs.

Meanwhile, there's Intel's continuing failure to understand that graphics drivers robust enough to do hardware acceleration reliably are crucial. Notice how few Intel GPUs have been certified for hardware acceleration by Adobe, or compare the hardware acceleration you get in IE 9 and 10 with a 'real' GPU against what Intel integrated graphics delivers. Offloading to the GPU is one way to get around Moore's Law (a GPU has far more cores than even the most multi-core CPU, although those cores are simple and suit only easily parallelised problems), and offloading to special-purpose hardware more generally is one way ARM systems punch above their weight.
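The "easily parallelised" caveat is doing a lot of work there. As a rough illustration - plain Python with a thread pool standing in for a GPU's many simple cores, and a made-up per-pixel function rather than real driver code - this is the shape of workload that offloads well: the same independent operation applied to many elements.

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(pixel):
    # Independent per-element work: no pixel depends on any other,
    # so elements can be processed in any order, on any core.
    return min(pixel + 40, 255)

pixels = list(range(0, 256, 8))

# Sequential version: one element at a time.
seq = [brighten(p) for p in pixels]

# Parallel version: map the same function over all elements at once -
# the pattern that GPUs (and the DSPs and video blocks on ARM SoCs)
# accelerate in dedicated hardware.
with ThreadPoolExecutor(max_workers=8) as pool:
    par = list(pool.map(brighten, pixels))

assert par == seq  # same answer, order-independent computation
```

Anything with dependencies between elements (a running total, say) doesn't decompose this way, which is why offload helps some workloads enormously and others not at all.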

Intel is stuck between the rock of physics, the hard place of mobile computing, the Scylla of expensive fabs and the Charybdis of business models.

Maybe that's why it's trying so hard to become a software company with services for developers. It has been adding NUI concepts such as voice recognition and gestures, as well as location services, under the Compute Continuum umbrella to the language support and development tools it has long offered, and putting its own virtualisation layer under any operating system running on an Intel PC. And, yes, all that does put Intel into competition with an even wider range of industry players...

Topics: Intel, Mobility, Processors, Windows 8

Mary Branscombe

About Mary Branscombe

Mary Branscombe is a freelance tech journalist. Mary has been a technology writer for nearly two decades, covering everything from early versions of Windows and Office to the first smartphones, the arrival of the web and most things inbetween.



  • Laptops will live as long as people need to type

    I think declaring the death of the laptop, or even the desktop, as is implied in this and many other articles, is a bit premature. Sure, they will no longer be the "cool" stuff. And certainly, tablets and smartphones and maybe even wearable computers will cut into the laptop market. But laptops and desktops have one thing tablets and smartphones don't: WORKABLE KEYBOARDS.

    Thumb typing is fine for a text message or a short email, but you aren't going to write a novel, or even a magazine article, in any reasonable time with a touch-screen keyboard that is about six inches wide. Nor are you going to do decent graphics work on a 9-inch diagonal screen. For anything from letters to CAD to writing books or even longer blog posts, you need a keyboard that is a bit larger than a matchbox and that you don't have to look at to use. Yes, you can get an external keyboard and connect by Bluetooth, but for laptop users like myself, where do you sit the tablet when you are in a chair or a car or just about anywhere without a table nearby? And if you do sit it on a table, how are you going to see the screen clearly? Even effective voice recognition won't help, because in a crowded work environment, or even at home with others around, you can't always broadcast your confidential information to the world while trying to get Siri or Dragon to understand.

    Why do we always see these things in competition with each other? It's like saying "Well, now we have hammers, we don't need saws anymore." They have different functions and can do different things. I have a laptop, a desktop and a tablet. Each has a function and a limitation. I'll read books on my tablet, but I'm not going to write one on it.
    • You are looking at this the wrong way

      I think you are looking at this the wrong way; this isn't a fight between laptops/PCs and tablets/smartphones and so on.

      Just look at the MacBook Air-like devices, such as the Chromebook, which has an ARM chip but also a full keyboard.
      • Chromebook - A surprising device

        This comment isn't an analytical review of the article. However, I find the new ARM Chromebook one of the most surprising devices I've ever used, from a number of perspectives. It's an inexpensive appliance (which is what these things should be for a lot of users) that does what it's supposed to do. There's nothing exceptional about it, but it works well. Comments?
    • that's form factors

      Form factors and processing are related, but I use laptops and hybrid tablets for 100% of my computing time (not counting phone computing), and I'm not at all calling the death of the laptop. I'm calling out a range of issues for Intel, which is currently tied to laptops and desktops ;-)
  • Quantum effects will get only worse

    "Moreover, a new material is needed to help deal with the fundamental issue of not having enough electrons to get a clean signal to switch the transistor between one and zero without quantum-level effects weirding out the transistor - and if there is such a material, it's still in the labs."

    The problem for Intel is that weird quantum effects will get ONLY WORSE as the scaling goes on. The barrier between the gate and the substrate is now only a few atoms thick (10 or 15 layers, I think), and at those lengths electrons start behaving like waves and tunnelling across it. It is IMPOSSIBLE to stop that; it can only be delayed. In recent years we have seen leakage losses (those associated with quantum tunnelling) so high that they make up half the total power (not TDP) - so we can say 50 to 100W are constantly wasted by tunnelling electrons.

    All we need is a radical change in architecture. Right now there is a lot of potential parallelism left unexpressed, because modern processors are not truly parallel units; they are just a bunch of cores connected with some buses (physical cores), or a marketing device leveraging poorly coded applications (hyper-threading is useless if applications are truly parallel).

    Of course not everything can be parallelized, but there are very few things that are truly sequential-only; the vast majority of so-called non-parallelizable tasks aren't really sequential at all. The real problem is that today we aren't teaching new programmers how to think parallel - we are just teaching them old-fashioned sequential programming and how to wrap it with parallelism.

    But all those changes are a HUGE threat to Intel's monopoly, so none of them will happen on Intel's behalf...
    Filippo Savi
  • Claims

    Claims of someone taking the Intel crown have been made for decades - sort of like the year of the Linux desktop. Neither is happening...
    • the near future of consumer computing

      What are your thoughts on Ubuntu? (It's gaining social acceptance, and game developers like EA and distributors like Steam are embracing it.)
    • Re: Claims of someone taking the Intel crown

      Intel is already a niche market compared to ARM. A profitable niche, but also one with no more growth prospects.
      • How is there no growth?

        You are assuming that Intel cannot possibly make chips that replace ARM in the mobile market, or that full-featured tablets/phablets could never replace mobile-OS tablets.

        Nothing is set in stone right now and hopefully the competition between the two chip markets drives innovations like we have not seen in a while.
        • Re: How is there no growth?

          Windows PC sales are in a death spiral. And there's no way ARM customers will pay the sorts of prices Intel is accustomed to charging.
          • but most ARM devices cost as much as PCs do right now

            Most just get subsidized through a phone contract.

            PCs are still selling, just not experiencing the growth that mobile devices are. Most people will either have a mobile device and a full power pc or 1 device that can replace both of those.

            Which do you think has a better chance of doing that?
          • Re: but most ARM devices cost as much as PCs do right now

            Really?? What decent Windows-running PC could I get for the price of my Asus Nexus 7?
  • ARM solutions losing money

    I love this discussion of what's going to win.

    Look at the ARM fabs: virtually all are losing money. ("We're losing money per chip, but it's fine - we'll make it up in volume.")

    This is a classic situation where ARM is only cheap while the fabs try to buy market share. At some point several vendors will drop out of the market, the remaining vendors will be under less price pressure, the cost of ARM will go up and Intel will be seen as less expensive. Add to that ARM increasing the complexity of its designs, making them more expensive and power-hungry.

    Really a very poor article, with nothing new over any other article out there, based on few facts and no understanding of economics.
    • so much money!

      Of course - Qualcomm has grown a bigger market capitalisation because they're losing money with their flagship product (Snapdragon), because investors on stock exchanges love companies that waste their money...

      Come on, get real: Intel is in the middle of the deepest crisis it has ever faced (not that it will disappear, just that it is not doing that well), while Qualcomm and the other ARM makers are growing at double-digit rates.

      The problem, as has been said many times, is that Intel's architecture is a RISC architecture with a CISC translation layer on top, while ARM is a pure RISC one, so Intel's can't be better; they only have more money. (Or does the fact that Intel is 100 times bigger than ARM, with a leading advantage in manufacturing technologies, count for nothing?)
      Filippo Savi
      • Intel has a manufacturing advantage

        As long as Intel can maintain its manufacturing lead (which is admittedly debatable), it can reduce power consumption even while its processors are more complex. ARM is also increasing complexity and clock speed, so in one sense they are converging. Also, Intel processors are faster, so the issue really is performance per watt.
        • far from converging

          I don't think that just because ARM is increasing in complexity the two must be converging; there is a major distinction between the two architectures, namely that one is CISC and the other is RISC.
          Saying that the two are converging is like saying that because the fuel consumption of sports cars is dropping and that of fuel-powered lawnmowers is rising, they are converging: while that can technically be true, there is such a HUGE gap between the two that the comparison doesn't make any sense...

          The other thing is that Intel's manufacturing lead will last a lot less time than most people think, not because of technological factors but for economic reasons. The cost of upgrading a fab to the next node is HUGE, and Intel's sales are dropping almost constantly; the PC market is shrinking drastically, and Intel is not even close to being present in the mobile market, so it will not have the volumes to continue as it is doing today (the delay of 14nm is down to those reasons)...
          Filippo Savi
        • Re: Intel has a manufacturing advantage

          But it cannot charge ARM prices and make money. And ARM customers will not pay Intel prices.
  • Excellent, thank you

    Mary, you are among ZDNet's best writers. Thanks for another great article with insightful and fair commentary.
    • what a bunch of crap.

      Everyone in this field is still saying that Moore's Law has at least a decade left on silicon; I don't know where you came up with all this rubbish about Moore's Law screeching to a halt.

      Guess what? Saying that there may be difficulties with maintaining Moore's Law a decade down the road has been the norm for the semiconductor industry for 40 YEARS.

      I don't know how Intel will do in the mobile chip business (and neither do you). My best guess is that it strikes up some foundry contracts with Apple, Qualcomm, etc. and makes a killing because of its 2-3 year process edge over the competition (and the premium it will undoubtedly charge).
      • A naive view.

        There is a difference between the ability to manufacture and the economic viability as well as practical gains of manufacturing.

        Performance-wise, the free lunch of Moore's Law has been over for years.

        In terms of "automatic gains" of cost reduction from transistor shrinks, the gains for the first time has disappeared. Per gate cost is actually increasing:

        The transition to 450mm wafers is only a temporary band-aid for the cost problem.

        Again, there's a difference between able to do something and being able to benefit from doing it.