Intel's Moore's Law may ultimately meet economic limits

Summary: Intel should be given a round of applause for believing in manufacturing when few technology companies do, but there are economics to consider when it comes to Moore's Law.

Intel keeps driving Moore's Law---the observation that transistor density on an integrated circuit doubles roughly every two years---but the company's investments in the effort may become questionable.

CNET News' Stephen Shankland noted the tug of war over Moore's Law and how Intel has repeatedly proven the naysayers wrong.

Moore's Law has long passed being mere prognostication. It's the marching order for a vast, well-funded industry with a record of overcoming naysayers' doubts. Researchers keep finding ways to maintain a tradition that two generations ago would have been science fiction: That computers will continue to get smaller even as they get more powerful.

To date, Intel has sustained Moore's Law, rolled out a steady cadence of chips and, along with Microsoft, carried the PC upgrade cycle on its back. The issue is that we're entering a post-PC era, and it's unclear whether Intel has the mobile chops. Intel should be given a round of applause for believing in manufacturing when few technology companies do, but there are economics to consider when it comes to Moore's Law.

The big question: At what point do Intel's massive research and development spending and capital expenses stop making sense?

According to a few analysts, Intel is paying for the chip industry's innovation and driving Moore's Law, but at some point there will be diminishing returns. Some argue those diminishing returns have already arrived.

Simply put, it's unclear what the return on investment for Moore's Law will be years from now. After all, Intel will spend just under $12.1 billion on capital expenditures in 2012. Intel also invested heavily in chip equipment maker ASML. Analysts noted that Intel had to invest in ASML because other chip makers weren't generating enough demand for the tools needed to keep Moore's Law humming.

Piper Jaffray analyst Auguste "Gus" Richard handicapped Intel's recent developer forum and noted that the upcoming Haswell chip is already a has-been:

In a post-PC era, we believe CPU performance matters less and the ecosystem and differentiation matter more than ever. It appears, however, the company continues to believe things will be fine if it just builds a better CPU. We believe the need for Intel to shift its business model has become obvious.

According to Richard, Intel's best move is to become a manufacturer for companies like Apple and Microsoft and buy intellectual property to supply the likes of Cisco too. Intel has been quietly buying intellectual property outside of its walls to make communications infrastructure.

The upshot is that chasing Moore's Law doesn't make as much sense now that the Wintel monopoly is eroding.

Not all analysts take the bear side of the equation.

Stifel Nicolaus analyst Kevin Cassidy said in a research note Oct. 2:

The bear case for Intel has the world firmly in a 'post-PC' era and that Moore's Law is over. We disagree with both points but importantly, we see that it would be Moore's Law that could drive the world to a 'post-PC' era. As a reminder, Moore's Law suggests the transistor density of an integrated circuit doubles approximately every 24 months. This is driven by reducing transistor sizes by 50% every 24 months. We believe process technology will increasingly become a key differentiator in the race for mobile processor dominance, which is a factor in Intel’s favor.
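The arithmetic in Cassidy's note can be sanity-checked: if transistor density doubles every 24 months, transistor area halves each generation, which means linear feature sizes shrink by roughly 1/√2 (about 0.7x) per step. A minimal Python sketch of that projection (the 22nm starting node and four-generation horizon are illustrative assumptions, not figures from the article):

```python
import math

def project(density_start=1.0, feature_nm=22.0, generations=4):
    """Project relative density and feature size over 24-month generations."""
    rows = []
    for g in range(generations + 1):
        density = density_start * 2 ** g            # density doubles per generation
        feature = feature_nm / (math.sqrt(2) ** g)  # linear size shrinks ~0.7x
        rows.append((g * 2, density, round(feature, 1)))
    return rows

for years, density, feature in project():
    print(f"year +{years}: {density:g}x density, ~{feature} nm features")
```

The key point: halving transistor *area* each generation requires only a ~30% reduction in each linear dimension, which is why foundries can keep hitting the 24-month cadence longer than intuition about "halving transistors" might suggest.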

Topics: Hardware, Intel, Networking, Processors, PCs

Talkback

9 comments
  • Missing the point. Intel couldn't care less if Post-PC comes true or not.

    As long as whatever we use has Intel chips inside, they're good. Intel most certainly has the mobile chops. Their R&D investment is what's going to get them to the same market share in mobile that they have in PCs, so yes, it's a good investment. In 2014 a healthy percentage of tablets will have Intel chips in them. By 2016 they'll have moved into the phone. So far the toy phone OSes and apps haven't leveraged the CPU. That will change, and OEMs will see that their high-end phones need Intel inside to compete. Intel will be best suited to get SoC integration to 10nm. It'd be nice to get back to phones that last more than a day on a charge.
    Johnny Vegas
    • SoC integration to 10nm

      All is good and fine, but Intel also cannot expect to extract the $500 to $1,000 prices for mobile CPUs that it commands for desktop CPUs. Therefore, it will need to find a way to produce cheap, high-performance, low-power CPUs to be considered for mobile applications at all.

      Intel's current business model relies on fast innovation and huge investments to reach the next "level" at any cost. Sometimes it's not enough to go to a 10nm process -- you might just need a better CPU architecture.
      danbi
    • You're right.

      The central problem with the article's premise -- that "in a post-PC era, we believe CPU performance matters less and the ecosystem and differentiation matter more than ever" -- is that it's a bizarre fallback to the thinking prevalent in the days of old, when every other month some pundit was saying that a certain-sized HD or a certain-speed CPU was going to 'do the trick' for the foreseeable future. After a while it became clear that this wasn't exactly the case.

      The reason some thought larger hard drives or faster processors weren't needed was that, for some strange reason, they assumed computer systems, software and capabilities in general would stay fairly static. Little did they know.

      The claim that research into a 5nm chip couldn't pave the way for meaningfully more powerful and useful smartphones seems to me to require some explanation. Sure, if Intel itself simply refuses to ever translate that research into smartphone-ready CPUs, then fine, that would be an issue, and it would be all bad on Intel.

      It seems to me that, aside from the direct necessity of Intel having to build chips for smartphones, building chips at ever-smaller process nodes is an absolute necessity for building more powerful smartphones. Every day someone thinks up a new and useful way smartphones can be used for something. It seems plain foolish to suggest that no new technology will rear its head in the next few years that would be ideal to tuck into a smartphone, or tablet for that matter, and that will really rely on a more powerful CPU.

      So, the bottom line is that while it's obviously critical for Intel to turn its mind toward small mobile form factors when thinking about new CPUs to produce, the company that knows how to build the most powerful such processor available could easily find a very nice market for it in the years to come.
      Cayble
  • At least one analyst (Kevin Cassidy) understands basic math

    Double the computing power in the same space = the same computing power in half the space. If CPU power isn't going to be a driver in the mobile space (which seems unlikely since new features need more processing power), mobility certainly will be. More power in smaller chips is always going to be a money maker and will always drive innovation for new applications. Things much smaller than phones will become "smart" as a result.
    RationalGuy
  • Post-PC, as deemed by customers or sellers?

    Who coined it, and why, and did they take EVERY factor into account? If I have to even write one possibility, then it's clear anything but EVERY factor was taken into account...

    Laptops are PCs.

    Macs are PCs.

    Cell phones are PCs.

    Just a different type of processor, with a different means to install and run programs.

    But they are still PC devices: "Personal Computer".

    Therefore, the term "Post-PC" is just as lamentable as the claim that the PCs we're typing on will go away overnight. PCs still have a time and place, though I'm not going to deny that some web-based functions and portable devices with processors comparable to a Pentium III (which could run Flash, and nobody complained back then, but whatever) do have a greater relevance that could not be achieved before.
    HypnoToad72
    • That depends on what you mean

      PCs are PCs, sure.

      Macs are PCs, too.

      Linux machines? Yup, PCs.

      I'd argue, so are the original smartphones: Symbian, PalmOS, Blackberries and such.

      Today's leader by market-share, Androids, are also PCs -- although some models are more locked down than others.

      iPhones and Windows Phones (what I now use), which probably aren't technically smartphones, aren't PCs. They are more akin to game consoles. This isn't to say they're only good for playing games (in fact, all-touchscreen devices stink for that purpose), but rather that they are based on the console model. They are not PCs. They are antipersonal: No side-loading of software, no adding my own drivers to support novel hardware, etc. Just buying stuff licensed by the maker. Those things that are missing are the things that make a PC a PC, and a smartphone more than just a smart phone (which is what iPhone and Windows phones are).

      The advent of Windows 8 marks a turning point, of sorts. The desktop environment in Windows is certainly still a PC environment. But the "Metro" environment is more like a console environment. Thus, Windows 8 machines are PCs, but Windows RT devices? I'd argue not. This is not to say which one is better or worse for most people, but it is an important distinction.

      The person who buys a tablet made by Apple, or even if it is a Windows RT-based tablet, instead of purchasing a Windows 8 PC (or Mac, or Unix box, or whatever) is now "Post-PC".

      Note that Intel makes CPUs for real (Android) smartphones now, which are PCs. But they may in the future get most of their revenue from devices running Windows, which will be increasingly "Post-PC". Indeed, one could imagine Windows 9 relegating the desktop to some backwards "compatibility mode" with a full focus on the RT environment.

      When that happens, we're all "Post-PC."

      It remains to be seen whether we'll be post Intel.

      But it is true that Apple doesn't even tell buyers how fast their CPU is. And in that kind of world, Moore's law may become economically nonviable.
      x I'm tc
  • Actually, the naysayers have been right.

    Despite your claims about naysayers being wrong - actually, the naysayers have been right.

    We've had to switch to multiple cores due to thermal problems.

    We still haven't broken the atomic limit, and it's still looming on the horizon.

    The new CPUs are only marginally faster than my Core 2 Quad. I'm not convinced this is a Moore's law exponential leap.

    And now you're talking about whether the economy can handle continued exponential progress.

    Frankly, I consider Moore's "law" to be dead right now. I'm not seeing exponential progress anymore.

    "Moore's law" is an observation. It's not a law of physics. There's nothing built into the physical universe that says it must (or even can) continue. In fact, there's pretty overwhelming evidence that it's physically impossible to continue it indefinitely. There are physical limitations we likely can't break, and even exotic technologies like quantum computing will likely only delay the inevitable.
    CobraA1
    • Actually, nothing you wrote has to do with Moore's law

      Moore's law does not state that the density of transistors will double every two years, but only that the number of transistors on a chip will.

      Chips can also simply get larger and technically still satisfy Moore's law. This is a major reason for the drive to more cores. Spread out the processing, keep Moore's law going.

      For tasks that can be easily parallelized, the performance gains really needn't ever stop. For those that can't, we're already hitting limits.
      x I'm tc
  • Could be right

    I hate that everyone calls it a law. It is just a theory that has held up so far, as the author points out, mostly thanks to Intel. I don't know about you, but I could always use more processing speed. It is a pain playing games and not being able to run my favorite media players in the background due to limits on affordable processing speed.
    bcclendinen@...