Has Moore's Law finally hit the wall?

Summary: Used to be you could buy a new computer every 3 years and get 2x the performance. Not anymore. Performance has hit a wall. What's this mean for you?

Used to be you could buy a new computer every 3 years and get 2x the performance. Not anymore. Performance has hit a wall - or at least a steep hill. What's this mean for the industry?

Moore's Law  Moore's Law says that the number of transistors on a chip will double every 18 to 24 months. But it has been popularly simplified to mean a doubling of performance every 18 to 24 months.

Not anymore.

Transistors ≠ performance. Yes, clock speeds have improved from the 1 MHz 6502 processor in the original Apple II to over 3 GHz today. But clock speeds have leveled out: a clock cycle at 3 GHz lasts about a third of a nanosecond, and in that time light moves only about 4 inches or 10 cm - and electrical signals are slower than light.
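Here's the back-of-the-envelope arithmetic behind that claim - my own sketch in Python, nothing vendor-specific:

# Rough check of the speed-of-light limit on clock rates.
C = 299_792_458                      # speed of light in a vacuum, m/s

for ghz in (1.0, 3.0, 5.0):
    cycle = 1.0 / (ghz * 1e9)        # one clock cycle, in seconds
    dist_cm = C * cycle * 100        # how far light gets in that cycle
    print(f"{ghz:.0f} GHz: {cycle * 1e9:.2f} ns/cycle, light travels ~{dist_cm:.1f} cm")

# At 3 GHz a cycle is ~0.33 ns and light covers roughly 10 cm (about 4 inches).
# Signals in copper and silicon move slower still, so chip and wire geometry
# becomes a hard limit on how much higher clocks can go.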

Another big piece of improved performance has come from wider data paths. Chips now move data in 64 and 128-bit chunks, rather than the 6502's 8-bit bytes. Not much more growth there, either.

We've thrown transistors at performance issues: more and wider registers; bigger caches; deeper pipelines; intelligent branch prediction; smarter I/O management; and thousands of other enhancements.

More RAM? We've also been adding ever-larger on-chip caches that improve performance. SSDs further improve I/O through lower latency, an area we're still learning about.

Multicore We can't make processors go faster. We can't process more data per clock cycle. So how do we put twice as many transistors to work?

Stuffing more processors on a chip. And right now many of the brightest minds in computer science are struggling with the problem of getting usable work out of 8, 12 or 16 core CPUs.

Dual and quad core processors work pretty well because multitasking runs a lot of background threads. Those threads can use multiple cores and improve performance.
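To see why that helps, here's a minimal sketch - a toy CPU-bound task in Python, not a real application - of the same work run serially and then spread across a process pool, one worker per core:

from multiprocessing import Pool
import time

def busy_work(n: int) -> int:
    # Deliberately CPU-bound busywork: a sum of squares.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 8                 # eight independent chunks of work

    start = time.perf_counter()
    serial = [busy_work(n) for n in chunks]  # one core does everything
    t1 = time.perf_counter() - start

    start = time.perf_counter()
    with Pool() as pool:                     # one worker process per core by default
        parallel = pool.map(busy_work, chunks)
    t2 = time.perf_counter() - start

    assert serial == parallel
    print(f"serial:   {t1:.2f} s")
    print(f"parallel: {t2:.2f} s")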

But outside video, image, voice and scientific apps, most personal apps don't need multicore architectures. Humans aren't good multi-taskers.

The wall We've hit a wall. We can still double the number of transistors. We can still double disk drive capacity. We can build faster interconnects, such as QuickPath, Light Peak and 10 Gb Ethernet. And SSDs also help performance.

But the easy wins are over. Going forward, performance gains will be measured in single-digit percentages each year.
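Over a typical three-year upgrade cycle the difference is stark. Assuming "single digit" means roughly 5% a year - my number, not a measurement:

years = 3
old_curve = 2 ** (years * 12 / 18)   # doubling every 18 months
new_curve = 1.05 ** years            # assumed ~5% per year

print(f"doubling every 18 months: {old_curve:.1f}x after {years} years")
print(f"5% per year:              {new_curve:.2f}x after {years} years")
# Roughly 4x versus about 1.16x - a new PC no longer feels dramatically
# faster than the one it replaces.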

Implications Information technology is driven by consumers, not the enterprise. What happens when a new PC is only 20% faster than your fully paid for three-year-old PC?

If it is a notebook, it can be smaller, lighter, more stylish and more rugged. But the desktop?

Future model differentiation will have to come from somewhere else. Here's where:

  • Power. The server space is making greater power efficiency a differentiator. Mobile's been pushing this for 15 years. You'll see more.
  • Integration. Open up an iPad or a MacBook Air and you see a tiny PC board, a few chips and a huge set of batteries. Battery life makes products convenient.
  • Functionality. Integrating multiple applications, each with their own dedicated core, may enable consumer devices to collapse multistep workflows into a single device. Capture, voice-edit, compress and upload video from a single candy-bar-sized device?
  • Cost. The first low-res digital cameras cost hundreds of dollars and now they're almost free. Huge market among the billions who live on less than $2500 a year.

The Storage Bits take Moore's Law-driven market growth isn't over. We can use our still-growing technical capabilities to refine what we already do.

But the days of newer=faster are over. It's newer=better: less power; smaller; cheaper; and - in cases like SSDs - better overall system performance too.

The good news for storage is that data production will continue to grow. Always on, always available consumer data systems will create ever more demand for storage.

In the enterprise this will affect storage architectures as well. When you can't scale up, you have to scale out. Decomposable storage architectures will come to the fore.
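A toy sketch of what scale-out means - my own illustration, not any particular vendor's design: spread the data across nodes by hashing keys, and add nodes instead of buying a bigger box:

import hashlib

def node_for(key: str, num_nodes: int) -> int:
    """Pick a storage node by hashing the key."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return int(digest, 16) % num_nodes

nodes = 4
placement = {}
for key in ("invoice-17", "photo-2010-12-01.jpg", "mail-index", "vm-disk-03"):
    placement.setdefault(node_for(key, nodes), []).append(key)

for node, keys in sorted(placement.items()):
    print(f"node {node}: {keys}")

# Need more capacity? Raise num_nodes and add hardware. Real systems use
# consistent hashing so that growing the cluster moves only a few keys.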

Comments welcome, of course. The Apple ]['s motherboard style was the same as today's MacBook Air: a few chips on a PC board. Friends were always startled to see how empty my Apple ]['s case was.

Talkback

  • RE: Moore's Wall

    Interesting points, will definitely remember this next time I upgrade, which might be further away now than normal!
    Gavello
  • The wall for electronics in general...

    is the use of electrons. Eventually this wall will be removed when someone finally discovers the key to move us from the age of electronics into the age of photonics.
    jasonp@...
    • I have worked on photonics for over 20 years

      @jasonp@...

      Digital computing requires bi-static devices to make and store ones and zeros. Devices or junctions with two stable states are inherently non-linear like most semi-conductor materials. The big problem with photonics and optical photons is that these give extremely linear physics. Non-linear optics exist, but take tremendous instantaneous power levels to be manifest. Also, you cannot store a photon (zero mass boson particle). It has to move to exist.

      The use of photonics for computing requires a wholly different approach to what computing means. Many specialized optical "computing" architectures have been proposed and built (I was involved with 3 different architectures) but none of these would ever be "programmed" by a general purpose computing language.

      The next step in computing will be a quantum jump (no pun intended) over our current notions regarding what a "computer" is or does.
      jacarter3
      • Technology isn't there yet to store photons...

        but the keyword there is "yet". It was literally last month that scientists were able to figure out and execute the storage of antimatter (antihydrogen atoms, specifically). Current technology still only allows for "storage" of antihydrogen atoms for 1/10th of a second before they are destroyed, but prior to this breakthrough the lifespan of these atoms was measured in single-digit microseconds. I believe that if scientists can figure out how to store particles whose existence a decade ago was purely theoretical, they'll be able to figure out how to store zero mass boson particles. If there's one constant in science, it's that we're always proving false ideas that have been assumed true and proving true ideas that started off as science fiction musings. I certainly don't assume that future "computing" architectures would even be recognizable as such today or that they would utilize general computing languages we're familiar with. I also don't assume that they wouldn't. What I do assume is that huge breakthroughs in technology are in our future. If the theory of singularity bears out, the middle of this century could be a very interesting time indeed for the scientific community.
        jasonp@...
      • RE: Moore's Wall

        @jacarter3 "Digital computing requires bi-static devices to make and store ones and zeros. "

        Actually, it doesn't. There's nothing that says a digital computer can't function with multiple states. There's nothing that strictly requires these states to be 'stable' either...statistically based systems are conceivable.

        Second, we don't know HOW to store photons, but 'storing them' may not be necessary. We store LOTS of photonic information without storing the photons, merely the information about their states. Magnetic storage doesn't store electrons; it stores an energy state induced in another substance.

        And merely because they must move to exist doesn't mean they can't be stored--storage can be dynamic. Even stored electrons don't stop moving!

        "The next step in computing will be a quantum jump (no pun intended) over our current notions regarding what a "computer" is or does."

        Probably true for 'the majority of people', but then, the majority of people have no real concept of what a computer 'does' anyway. Most of the world, to most people, is a set of black boxes which do certain things when you do certain things to them.

        A GP computer is like no other device we've created - its uses are open-ended.

        Basically, it manipulates information. Data storage is another concept, which, while related, is not strictly part of a computer.

        While continuing the acceleration we've experienced may well require a major leap and a change in technology, the end result may be invisible to the users. There are a thousand ways to implement a NAND gate, but the electronic digital computer doesn't care which you use.
        wizoddg
      • Storing photons and "statistically based systems"

        @jacarter3

        Maybe in Star Trek: The Next Generation they can store photons, but quantum physics provides no way to store energy quanta in a photon state. You can always convert that energy, but you lose the photon. Photons can be slowed to very low velocities, but the devices required to do that feat are the size of a small mobile home, use more power than your entire neighborhood, and don't scale to the nanometer dimensions needed by silicon transistors.

        Every system is statistically based unless you can achieve absolute zero thermal energy (0 K). In fact the state of a transistor is statistically based due to random thermally induced electron transitions in the semiconductor lattice energy bands. This appears as noise that must be overcome to reduce errors.

        What is an error? First off, we must have some sort of information that we are trying to store or process. Basically, some sort of decision-making device must decide the state of the system to infer that information. Noise gives rise to random processes that cause errors in that decision-making process. This is true of bi-state or multi-state systems. It's our ability to infer information from the system in spite of these statistics that provides the means to process information in any sort of computation device, including a "GP" computer.

        Back to photons: there are no photon-only physical processes that can make "decisions." A decision is inherently a non-linear process and, as I have said, the nondestructive interactions of photons (with energy quanta greater than thermal state transitions) with matter are very, very linear. Further, photons do not interact with photons. They do not collide, combine or interact in any way. We only observe multi-photon interactions when we destroy photons by colliding or absorbing them with mass-based matter, and generally any collision that affects the photon will destroy it. A new photon may be emitted but the original is lost. Multiple photons do not interact with the same atom either. In all cases, any decision relying on photonic information requires detecting it, which in turn requires destroying it.

        At AT&T (what was once Bell) Labs, they experimented with optical computing and Self Electro-optical Effect Device (SEED) computing. The power required per decision was huge compared to a transistor. Further, the biggest problem was that each SEED stage could accept fan-in and fan-out numbers in the range of 2-4 - far too low to create the interconnection density required by even a silicon CPU. There have been recent announcements from AT&T about "breakthroughs," but my assessment is that these "advances" are the product of a researcher trying to garner excitement and further research funding.

        If we ever do achieve true general-purpose optical computing, I doubt that anyone reading this in 2010 will be alive to see it happen - unless you believe in Futurama.
        jacarter3
    • I couldn't agree moore .. i mean more

      @jasonp@... and I truly believe you are the only person replying on this blog that has grasped what the "Moore's Wall plateau" really means to x86 and x64 system architectures.

      "The wall for electronics in general is the use of electrons. Eventually this wall will be removed when someone finally discovers the key to move us from the age of electronics into the age of photonics."

      I was thinking along the same lines - except that I can't see any major vendors continuing with existing hardware form factors. If, indeed, the 'critical mass' is close to being reached for existing chip conceptualization and design architectures, this can only mean:

      (1) x86- and x64-based system design - and the desktop as we know it - is heading the way of the Dodo (that, or motherboards grow to the size of tennis courts).

      (2) Mobile architectures are indeed the logical progression / evolution for computing (so, as much as I hate to say it, it seems like great foresight at Cupertino on taking a gamble).

      (3) Moore's Law slogan changed to "Less is Moore"?? ;)
      thx-1138_
  • RE: Moore's Wall

    Jasonp is right for the most part. The other way to do this is through quantum mechanics and quantum leaps and differentials. Quantum computing is "in the works" and we'll see if that or something else becomes the new standard.
    hoaxoner
  • Shift in focus.

    While what you write is true, the industry as a whole has shifted its focus to smaller, more efficient and cheaper. There was always a need for faster, faster, faster, especially in the consumer space, but that's not really true anymore. As an example, Microsoft missed the refocus with Vista, and Windows 7 reversed course to be more efficient. Intel and AMD are focused on power and efficiency as the world looks for smaller devices and better battery life. Where it comes to gaming, the high end can already render 3D FPSes in super high resolution. On the supercomputing front, just throw more CPUs into the mix.

    So, I believe, the industry simply steered Moore's Law in another direction, shrinking existing technology into smaller, lighter and more efficient forms. With even the weakest desktop today good for gaming, the most modest laptops able to handle pretty much anything a user wants to do, and even smartphones offering a complete computer experience, the need to be uber fast is not what it once was. Let's face it, hardware finally caught up with the incredibly inefficient (and easy, powerful) programming languages we use today, lol.

    TripleII
  • RE: Moore's Wall

    Hi Robin,
    As you state, manufacturing process improvements will continue to reduce power and provide higher integration (such as graphics integration with the regular processor, which also provides performance and power improvements). Moore's Law and House's Law (for performance) are observations of human ingenuity and I expect them to continue. There are interesting products on the horizon but, yes, it's not just GHz any more. :)
    JamesCAbel
  • Bigger die...

    Just give me a twin CPU or a 1 kW, 5-inch-wide die for a GPU. I don't care as long as the performance is there. Electricity is no object. Anyway, people have been predicting this for the last 20 years...
    Tommy S.
  • RE: Moore's Wall

    Moore's Law hasn't hit a wall, it's just found a different playground - mobile.
    CrazySaint
  • RE: Moore's Wall

    Interesting, but I suspect clever ways will be found to improve hardware beyond things that we currently see as limiting. My main thought on this is that it's software that's lagging behind. I remember back when you had to worry about having a PC that was high-spec enough to run something; now I'm running things like folding@home in the background because there's not much out there that utilizes more than a fraction of my PC's potential.
    Big_Belly_Bob
  • RE: Moore's Wall

    The striving for energy efficiency is definitely slowing down the power curve some at this time. A lot of time and money is being spent on making CPUs, video cards and hard disks more energy efficient, and that is effort not being put into more power.

    Energy efficiency is becoming an increasingly important factor in the buying decisions of businesses and individuals. Server farms are leading the call for energy efficiency but it's affecting decisions across the board. It affected the CPU and hard disk in my latest desktop and greatly affected my video card choice. I also have the quietest computer I've ever had and that is a nice bonus.
    Mythos7
  • Software EFFICIENCY?

    Software is at the stage of development of the early gas guzzlers of the 1920s. Look at the memory and CPU hog Windows is!
    I DON'T NEED OR USE 90% OF WHAT IT CONTINUOUSLY DOES on a daily basis, yet Windows Embedded is a niche market. We are getting to the "1973 gas shortage" of computing power. We have to learn to do more with less. (... and YES, Linux, OS X and Unix are in the same boat. Does anybody program in machine language any more?)
    kd5auq
    • No, and for good reason

      "Does anybody program in machine language any more?"

      With today's complex CPUs it is near impossible for developers to optimize code to run as quickly as an optimizing compiler from the manufacturer for all but the simplest tasks.

      For someone who started out on assembler this is a good thing.

      Maybe you'd like to nominate the features you'd like dropped from a modern OS like Mac OS X?
      Richard Flude
      • RE: Moore's Wall

        @Richard Flude,

        Why don't we just drop Mac OSX? (just kidding...couldn't resist) :)
        bmonsterman
      • Today's automobiles did not become more efficient by dropping features!

        @Richard Flude
        In fact today's automobiles have even more features unheard of in the '70s. The point is that software (especially OSes) needs to become MORE EFFICIENT in using CPU resources. With ever more powerful and faster CPUs, this has not been a priority until now.
        kd5auq
    • Bloatware sucks up all available power

      @kd5auq

      I have a nine-year-old PC that can boot up and open a Word document faster than my slick new multi-core with 8 times the memory. The SOC implementation of that entire system (including disk storage!) would cost less than $50 today in volume.

      Microsoft is the biggest culprit in software inefficiency, but they are not by any means alone in that failing. Windows 7 loads up drivers, libraries, and runs programs for thousands of "features" that I will never use. The wasted time and resources boggle the mind.
      terry flores
      • Win 7, etc

        @terry flores
        Precisely why I made a reference to Windows Embedded, which uses ONLY the resources specified. Unix and Linux can also be configured to delete unneeded stuff. I have no experience with OS X.
        Of course I realize that this is not for the average home PC.
        kd5auq