2020: The visible end of exponential innovation

Summary: Moore's law has had an incredible run of 50 years, but the end is in sight. It won't be physics that stops it...

It won't be physics that sets the limits of Moore's law but economics, as the chip industry's platform of exponential innovation grinds to a halt around 2020, after a 50-year run, in the face of mounting costs.

Then it'll be up to software to take over the job of doubling computing power and halving costs every two or so years. Good luck with that.

Rik Myslewski at The Register reported from the Hot Chips conference at Stanford University:

"When Moore's law ends, it will be economics that stops it, not physics," declared Robert Colwell of DARPA's Microsystems Technology Office. "Follow the money," he advised.

According to Colwell, who was Intel's chief chip architect from 1990 to 2001 and an Intel Fellow, there's absolutely no doubt that Moore's law will eventually be repealed. "Let's at least face the fact that [Moore's law] is an exponential, and there cannot be an exponential that doesn't end," he said. "You can't have it."

Colwell believes that 2020 will be the earliest date for the law's demise. "That's only seven years away," he reminded his audience. "We'll play all the little tricks that we didn't get around to" — but the halcyon days of exponential performance improvements will be at an end. And with them, Moore's law.

He raises an interesting point about the incremental improvements in chip performance following the end of Moore's law. Will people buy a chip that's 50 percent better, 20 percent better, 10 percent?

If customers won't pay for incremental improvements, then chipmakers won't make the chips; they won't invest billions of dollars in new manufacturing lines for marginal performance gains.


Talkback

  • MOORE'S LAW WILL NEVER FAIL

    Aliens will come to the rescue.

    AGAIN
    fm-usa
    • But

      The aliens will be kept a secret by the Illuminati, who will claim all the innovations and roast the aliens with fire. Then all of a sudden there will be an "invention" by some guy working for some random company that turns industry on its head. And the Illuminati will get rich off of it until the 9-11 conspirators sic new aliens on them.

      /end snark.
      Jacob VanWagoner
      • It's happened before

        Conversely, if the Illuminati were to collapse (stay tuned), then all of these suppressed technologies would be integrated into society quite quickly through various methods, such as retooling the Military-Industrial Complex.

        Moore's Law would then seem quite antiquated.

        If all of this seems foreign or ridiculous to you, then please do your best to hold your temper when disclosure comes. While those who have been waiting for this for a very long time will rejoice, those who have been deceived may get very angry.

        The best way going forward is to attempt to quell your anger and accept that a new renaissance is about to occur.
        Astringent
  • We're seeing foreshadowings of that now.

    I recently replaced my 2004-era motherboard, CPU, etc., because Socket 478 CPUs (Pentium 4) don't support PAE, which Win 8 requires. The computer ran Win 7 fine with 2.5GB of DDR. I don't do gaming or video editing, and it handled pretty much everything else with no problem. All the latest software ran fine.

    Something no one is talking about is the FUNDAMENTAL changes that will come about once memristor, carbon nanotube and data crystal technologies become mainstream. We can expect the paradigm to shift to the kind of large-number-multicore processors currently seen in GPUs. 360TB on a crystal using 5-dimensional storage capability? SICK! (Can you say Mxyzptlk?)

    No, they won't continue building faster processors. But they'll make up for it in the number of cores, and the compilers will simply shift much more to parallel processing. And it's especially important to realize that "Big Data" is particularly amenable to parallel processing.
    Rick_R
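
To make the parallel-processing point above concrete, here is a minimal sketch, assuming a generic "embarrassingly parallel" workload (the function name, data, and chunking are invented for illustration, not taken from the comment or the article), of how work split into chunks scales with core count using Python's standard multiprocessing module.

```python
# Minimal, hypothetical sketch of data-parallel processing: the job is split into
# chunks and each chunk is handled by a separate worker process, so throughput
# scales with core count rather than with the clock speed of any single core.
from multiprocessing import Pool, cpu_count

def process_chunk(chunk):
    # Stand-in for a real per-record computation (e.g. a "Big Data" aggregation).
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    workers = cpu_count()                      # one worker per available core
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]

    with Pool(processes=workers) as pool:
        partial_sums = pool.map(process_chunk, chunks)

    print("total:", sum(partial_sums))
```

The design point is that each chunk is independent, so adding cores adds throughput even if no single core gets any faster, which is the trade-off the comment anticipates.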
    • "Can you say Mxyzptlk?"

      No.
      Troy Knox
      • That's not the real question...

        The real question is, can you say it backwards?
        Nierteroth9
    • If we can ever afford them.

      "Something no one is talking about is FUNDAMENTAL changes that will come about once memristor, carbon nanotube and data crystal technologies becomes mainstream. "

      If we can ever afford them. Last I checked, nanotubes are still quite costly - we need to solve some problems with prices first.

      . . . and even then, that may only postpone the death of Moore's law by a couple of years. Nanotubes basically *are* wires that are only a few atoms thick. Not sure how we can keep Moore's law going indefinitely.

      "But they'll make up for it in the number of cores"

      If we can't keep up on the die shrinks, more cores start to take up more space. Things start becoming bigger and bulkier, which most people don't want.
      CobraA1
      • .

        "If we can ever afford them. Last I checked, nanotubes are still quite costly - we need to solve some problems with prices first."

        Actually, making nanotubes is very cheap and easy. Making them into a useful product, not so much, though...
        sdavidson118
    • How will you solve the heat problem?

      Chips will still get very hot with that many cores. So cooling solutions are still required, especially if you extend to large numbers of cores.

      And to your point about shifting complexity into the compiler: the language the software was written in has to support parallelism. Intel already tried complex compilers before and couldn't make them work. Remember the Itanium family of CPUs? Even though the chips themselves were slow (compared to their x86 counterparts), the big performance gain Intel promised was in the compilers. Intel claimed that software engineers would only need to tweak their code and recompile with Intel's new compilers, and you'd see performance increases.

      Well, Intel ended up leaving all of that to HP. And who knows where HP is with it now.
      malchore
      • I was just thinking that

        reading down the list... or each core gets slower to create less heat... where is the line where you lose that "effective speed increase"?

        Guess we could take a chapter from the supercomputer folks at Cray and just dump it in a bath of liquid nitrogen?
        Putertechn
    • "Can you say Mxyzptlk?"

      Sure. Sounds just like it's spelled.
      tiimzim@...
    • Oh, they will be making faster processors

      http://www.nature.com/nature/journal/v472/n7341/full/nature09979.html

      "Cut-off frequencies as high as 155 GHz have been obtained for the 40-nm transistors, and the cut-off frequency was found to scale as 1/(gate length)."

      Silicon cuts off at around 40GHz at any scale without using strain and SiGe tricks.
      Jacob VanWagoner
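
On the quoted 1/(gate length) scaling: it matches the standard first-order, velocity-saturated approximation for a transistor's cut-off frequency. A textbook sketch (not a result lifted from the linked paper):

```latex
% First-order cut-off frequency of a FET in the velocity-saturated limit
% (textbook approximation; g_m = transconductance, C_g = gate capacitance,
%  v_sat = carrier saturation velocity, L_g = gate length)
f_T = \frac{g_m}{2\pi C_g} \approx \frac{v_{\mathrm{sat}}}{2\pi L_g}
\qquad\Longrightarrow\qquad f_T \propto \frac{1}{L_g}
```

Shorter gates mean higher cut-off frequencies, which is the scaling the comment and the paper's abstract describe.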
  • Looking at the wrong platform

    Look at smartphones: 8 cores at the top end now, and they draw less and less energy per operation.
    With falling sales, the PC is bound to attract less investment in increasing performance. The same thing happened with workstations.
    At a given point in time, PCs eclipsed workstations.
    I think the same will happen with smartphones and tablets versus PCs.
    I'm quite amazed by what a four-core 1.7GHz ARM can do.
    stevey_d
    • Platform is irrelevant

      The move from workstations to PCs did not decrease the demand for faster processors, nor did it change the motive for investing in improving processors. Faster computers are faster computers, and there will probably always be a demand for them, just as predictions about "how fast an internet we need" are always proven ludicrous. What this article is about is the idea that as the expense of creating a faster chip increases (due to the assumed limits of human technology in a nonmagical universe), our demand for faster chips might remain, BUT there will also be a limit to how much we are willing to pay for increasingly small increments of improvement. Why invest billions in making improvements to a chip that are so small that people will pass up the opportunity to pay high prices for the improved chip? Even if our smartphones are replaced by tiny little subcutaneous chips that can project images into our visual cortexes, chances are we will still complain about how long we have to wait for them to do anything.

      BTW, tablets and smartphones cannot yet be considered a replacement for the PC so long as the PC continues to have some very real advantages over our tiny little, quite limited mobile devices.
      hmmm,
    • It's the code.

      The reason a four core ARM can present more responsive performance than a much more powerful Intel CPU is purely due to code craftsmanship, including the OS. Programmers for ARM systems like tablets are actually focusing on writing efficient code. It's the way we were forced to code back in the ancient 8088, 6809, and Z80 days. We didn't have endless RAM and CPU power. You had to write lean code or your program ran at a glacial pace. ARM is bringing craftsmanship back to programming. Real tablet OSes are far more efficient than desktop OSes.

      The Intel chips have allowed programmers to be lazy and sloppy for decades now. They figure that if you want something to run faster, you'll just buy faster hardware. As a result, Windows became a bloated pig of an OS that required massive resources just to run. These days, the hardware isn't gaining speed like it once did, so the performance jump isn't as great when you upgrade, and there is less incentive to spend the money.

      The article is right. If consumers aren't spending the money for a slightly faster CPU, then the huge expense of creating those CPUs is a waste. Intel should focus on energy consumption rather than raw performance. In fact, maybe it's time for Intel to license ARM technology and focus on tweaking that. WIntel bloat has drifted into a state of entropy. The future lies in efficiency.
      BillDem
      • Actually, I see the opposite happening, where the Intel chips

        will take over the work previously done by ARM chips, with ARM eventually becoming irrelevant, since the Intel chips are more capable. When it comes to a device's capabilities and features, people have shown a preference for more and more of them. The only advantages that ARM chips used to have were in speed, lower power consumption and cheaper prices. But the newer Atom processors offer comparable size and power consumption, at better speeds, with more capabilities, such as being able to run a full-fledged OS like Windows or OS X, and all at prices comparable to what the ARM processors cost. If efficiency is what you think matters, then the Intel chips will be superior in all respects.

        The ARM processor makers will have to start innovating in order to keep up, just like Intel had to start innovating in order to overcome whatever advantages ARM used to have. The ball is now in the other court. ARM is still useful and has its place, but whatever advantages it used to have are no longer there. People like their mobile devices, but they've also demonstrated that they prefer feature-rich devices over bland ones. The Intel chips have the capability to support a more feature-rich device than an ARM processor can. Checkmate!
        adornoe@...
  • bandwidth is starting

    to matter more than raw processing power. As the internet of things evolves, it will be the ability to manage and control data at both a local and a global level that will be the key metric. Measuring performance in GHz processing speeds will be like measuring an ark in cubits.
    krossbow
  • it's a shift not a demise

    it's all about watts per CPU operation these days, in everything from phones to supercomputers; squeeze the same CPU performance from less power and you will win.

    Typical users simply don't need the raw processing power gains.
    Aussie_Troll
    • Consumers don't need it because...

      the software doesn't exist to take advantage of it. Look at the CPUs, RAM and storage needed to make IBM's Watson work, and it isn't even a full AI. If someone ever develops a true, general AI, it will require exponentially more power than PCs have now, and that should be where we are headed, whether the consumer is aware of it yet or not.
      nfordzdn
  • We're already at the end of everything else.

    What I'm curious about is whether we'll get to a point where our devices are good enough. Right now I have a Samsung Galaxy S4. It's got a 5" screen, and I don't want one any bigger. It's already almost too thin; any thinner or lighter would be moot, if not annoying. The screen is already such high resolution that it is humanly impossible to see pixels, even at the closest distance my eyes can actually focus on the screen. The Super AMOLED screen already has perfect blacks, dynamic contrast, etc. Any screen improvements would be pointless. It sounds hilarious to say, because those are famous last words. But I'm seriously wondering: what if we actually reach the point where there isn't room for improvement?

    My point is that processor speed is about the only thing left to improve, and there's still plenty of room for improvement in mobile. But I wonder how long that can continue.

    In 7 years, I'm not sure they'll still be able to keep up with physics either. Intel is working on tooling up for 14nm processors to be produced in 2014. Haswell is being produced now at 22nm.

    Boy won't Gordon Moore's face be red.
    omnimoeish
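
A rough sanity check of the "can't see the pixels" claim above, as a back-of-the-envelope sketch: it assumes a 5-inch 1080p panel (the Galaxy S4's published spec) and the usual 1-arcminute figure for 20/20 visual acuity at a 25cm near point, none of which the commenter spells out.

```python
# Back-of-the-envelope check: pixel pitch of a 5-inch 1080p phone screen versus
# what a 20/20 eye can resolve (about 1 arcminute) at a ~25cm near point.
import math

diag_in = 5.0
px_w, px_h = 1080, 1920
ppi = math.hypot(px_w, px_h) / diag_in            # roughly 441 pixels per inch
pixel_pitch_mm = 25.4 / ppi                       # roughly 0.058 mm between pixels

view_dist_mm = 250.0                              # comfortable near focus
arcmin_rad = math.radians(1 / 60)                 # 1 arcminute in radians
resolvable_mm = view_dist_mm * arcmin_rad         # roughly 0.073 mm at that distance

print(f"pixel pitch:  {pixel_pitch_mm:.3f} mm")
print(f"eye resolves: {resolvable_mm:.3f} mm")
print("pixels finer than the eye can resolve:", pixel_pitch_mm < resolvable_mm)
```

On those assumptions the pixel pitch comes out finer than what the eye can resolve, which is consistent with the commenter's claim.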