The 16 TB RAM PC: when?

Summary: The next version of Mac OS X will address 16 TB of RAM. Who will ever have 16 TB - 16,000 GB - of RAM on a home computer?

The next version of Mac OS X will address 16 TB of RAM. Who will ever have 16 TB - 16,000 GB - of RAM on a home computer? If the past is any guide, some of us will be using 16 TB PCs in 2025.

RAM is the purest expression of Moore's Law

The definition of Moore's Law is "roughly double the number of transistors every 24 months."

My 1st computer: 4K or 16K?

My first computer - bought 30 years ago - was the original Apple ][. The big choice was the amount of RAM. As befits a future storage geek, I splurged for 16K of RAM, handy if you wanted to use floating point BASIC.

Fast forward

Earlier this week I ordered another 4 GB of RAM to bring the quad-core Mac Pro up to 8 GB, hoping to speed up video compression and transcoding. Not to mention bragging rights.

From 16 KB to 8 GB: that's a factor of 2^19 - roughly 500,000x - in 30 years. That works out to a doubling about every 19 months, slightly ahead of Moore's Law's 2x every 24 months.

If anything that understates the trend, because I paid far more per byte of RAM then than I do today.
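
As a sanity check, here's that arithmetic as a minimal Python sketch (illustrative only; the 1978 and 2008 endpoints are inferred from "30 years ago"):

    import math

    start_bytes = 16 * 2**10   # 16 KB Apple ][, circa 1978
    end_bytes = 8 * 2**30      # 8 GB Mac Pro, circa 2008
    years = 30

    doublings = math.log2(end_bytes / start_bytes)   # 19 doublings
    months = years * 12 / doublings                  # ~19 months each

    print(f"{end_bytes // start_bytes:,}x growth")            # 524,288x growth
    print(f"doubling every {months:.0f} months (Moore: 24)")  # 19 months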

Address space consumption

One of my all-time favorite storage papers, by the late, great Jim Gray and Prashant Shenoy, Rules of Thumb in Data Engineering, observes that we consume another bit of address space every 18 months.

Let's apply that empirical rule to my Mac.

2^33 bytes = 8 GB
2^44 bytes = 16 TB

That's 11 more bits of address space, which should take roughly 17 years to "consume." That means that in 2025 I'll be sitting down to a 16 TB Mac - or something better if it comes along - and editing my 3D 8k x 8k virtual world movie. Or something.

But what about you? Let's say you have a notebook with 1 GB RAM and you are happy with it. That's 30 bits of address space. So for you to consume another 14 bits of address space will take 21 years. So you won't need a 16 TB notebook until 2029.
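
Here's the Gray-Shenoy rule as a minimal Python sketch (assuming 2008 as the starting year, which is what the 2025 and 2029 dates above imply):

    import math

    def year_needed(current_bytes, target_bytes, start_year=2008):
        # Gray & Shenoy: one more bit of address space every
        # 18 months, i.e. 1.5 years per bit.
        extra_bits = math.log2(target_bytes / current_bytes)
        return start_year + extra_bits * 1.5

    GB, TB = 2**30, 2**40

    print(year_needed(8 * GB, 16 * TB))   # my Mac: 11 bits -> 2024.5 (~2025)
    print(year_needed(1 * GB, 16 * TB))   # 1 GB notebook: 14 bits -> 2029.0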

Start saving now.

The Storage Bits take

Even the top chip technologists can't see beyond 10 years, so we have no idea if Moore's Law will continue to hold until 2025. It may not be physically possible to build 4 TB DIMMs - or whatever they are then - in 2025.

But if it is, you'll have a really nice PC.

Comments welcome, of course. I'm off to the Seattle Conference on Scalability - sponsored by Google - today. Hope to see some good, storage-consuming stuff there.

Topics: Hardware, Processors, Storage

Talkback

29 comments
  • Yes, but some of us suffered so long with the 640K limitation

    Those upper memory tricks, etc., made programming a lot more tricky in those days.
    DonnieBoy
  • We splurged on the 16K RAM for our //e...

    We splurged on the 16K RAM for our //e, as well as the 80
    column card and Microsoft BASIC card. That was one smokin'
    Apple II. We even had a data coupler and that newfangled
    mouse pointing device. :)
    olePigeon
  • Stop perpetuating the ERROR

    The result of going from 32 bit addressing to 64 bit addressing is to increase addressable memory to 16 EXABYTES -- NOT 16 Terabytes. Where did this mistake start here on ZDNet? It's spreading and it's WRONG!!!!

    mega- M 10^6 ~2^20
    giga- G 10^9 ~2^30
    tera- T 10^12 ~2^40
    peta- P 10^15 ~2^50
    exa- E 10^18 ~2^60

    The limit in 32 bit addressing was 2^32, or 4GB.

    The limit in 64 bit addressing will be 2^64, or 16EB.

    Theoretical RAM space will not double, or grow 500x, or 5,000x - it will increase by a factor of about 4 billion.

    Going from 32-bit to 64-bit addressing doesn't double the address space, it SQUARES it. (4GB squared is 16 Exabytes - that's another way to see the calculation.)

    2^64 = (2^32)^2 (two to the 32 squared)
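
    A quick check in a Python shell (illustrative session):

    >>> 2**64                   # bytes addressable with 64 bits
    18446744073709551616
    >>> 2**64 // 2**60          # the same figure in exabytes
    16
    >>> (2**32)**2 == 2**64     # squaring the 32-bit space
    True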

    *sigh* This is a technical web site and this is SIMPLE math. Argh!!

    The Alken Dude
    • Dude. Chill.

      Deep breaths there, guy. No need to have a coronary over "simple math". It's not worth it. ;)

      Now, then. The reason for the 16TB that Robin mentions isn't because he slipped a few digits on his calculator in figuring out 64-bit memory addressing. The reason he brings up 16TB is because in the marketing blurb he referenced in his article -- http://www.apple.com/macosx/snowleopard/?sr=hotnews?sr=hotnews.rss in case you missed it -- they specifically state that the theoretical memory limitation of OS X Snow Leopard is "up to a theoretical 16TB, or 500 times more than what is possible today".

      Now, whether that's a limitation of the memory handling of the next OS X, or the folks at Apple marketing slipping a few digits on that calculator, is certainly up for debate, and you can feel free to try to set them straight on it, though good luck getting through the famous Steve Jobs Reality Distortion Field, which has been known to bend time, space and mathematics before. ;) But as for where ZDNet gets that number they've been tossing around as the memory address space of the next OS X, right or wrong, that'd be Apple's marketing department, not ZDNet's mathematicians.
      Whyaylooh
    • Give it up.

      These are the same folks who think 2^30 = 10^9 and complain
      when their computer manufacturer says 250 GB, but their
      computer reports 232 GB.
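
      One line of Python shows the gap (illustrative):

      >>> 250 * 10**9 / 2**30   # 250 "marketing" GB in binary GB
      232.83064365386963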
      frgough
    • Your math is wrong also

      Actually 4GB squared is 16GB not 16EB
      tommcd64
      • Actually, your math is wrong

        4000000000
        * 4000000000
        --------------------------
        = 16000000000000000000

        i.e., 4GB squared is 16EB.
        cabdriverjim
  • Not likely

    It's not that such memory figures won't be possible... it's just that today I still see too many people with 1 gig machines.

    The reality is the top 3 things people do at home and work don't require such enormous amounts of memory.

    Home

    1) Web browsing
    2) Email
    3) Games

    Work

    1) Web applications
    2) Office productivity software
    3) Email

    Video processing can leverage enormous amounts of memory but that is niche. I've personally never had to do such work.

    I've had 4 gigs in a machine I built almost two years ago, and the office computers still haven't caught up. (yet)

    Never mind Joe and Jane average, they're usually even more behind.

    Server side is another matter. There's no such thing as "too much". Having said that, some of the price quotes I see for Sun hardware as you add the gigs of memory (64 gigs) come close to six figures.

    -M
    betelgeuse68
    • re: home usage

      I would argue that home gamers, especially followers of MMOGs such as the new "Age of Conan", would disagree about the demands being made for RAM on their home PCs... you'd need about 4 gigs these days just to get by [that's 2 gigs for the game, 2 gigs for Vista, naturally]
      2WiReD
  • A Sign of Bad Programming?

    Doesn't this, if it does happen, show how much more lax programming standards will become and how careless we'll be with resources? Even today, does a browser really need over 100 MB of RAM when you are browsing around the internet? Some areas will always increase - OSes providing more services, photo and video editors ever more powerful and resource hungry - but doesn't there have to be a limit before we end up with software that is nothing more than a memory hog, doing no more than what we have today?
    clindhartsen
    • Depends....

      ...on whether you were going to use that 2025-era computer with a 2001 OS - then it most likely would be overkill.

      On the other hand, who knows WHAT software will be like 17 years down the road from now. Will we all be using a dozen concurrent virtual machines, each doing various tasks? Or will we be going down the road of artificial intelligence? Or will we be dealing with holodeck style technology?

      Hard to say. My magic 8-ball keeps coming back with "Check back later."
      Wolfie2K3
    • Bad programming, evolutionary, paradigm shift....

      As a developer I tend to agree that the trend toward simply consuming available performance and space resources is out of control.

      Is programming bad simply because it consumes more resources than equivalent versions from the past? That may be the result of business decisions, not engineering choices, and so the programming might be as good, just the choices are 'fatter' to get products out the door using tools that are easier to use.

      Bad programming generally results in crashes or unexpected behavior. It is the errant execution of a design. Bad design is one part that introduces these huge RAM and CPU demands we don't usually require, and 'lazy' choices in development tools create fat, slow programs. These are in the domain of business choices more than engineering choices.

      Non-functional or esoteric features imposed on us by business choices are part of the problem. I don't use at least 80% of what Excel does. The Windows 3.1 version of Excel would do just fine for me. It wasn't bad programming that turned Excel into its recent gargantuan size - it was a desire to include ever-increasing features and base some of it in .NET that did that.

      Even XP is fat compared to Windows 2000. There's hardly any technical service provided by XP that couldn't have been provided in Windows 2000; they simply chose to make XP a new product for business reasons. It was, after all, simply an incremental upgrade from the same codebase.

      Vista, on the other hand, is a classic example of design leaving the rails. Its system requirements are many times heavier than XP's in some areas (disk space for the OS install is one notable example). Unless it crashes more often, though, that's not the result of bad programming - my complaints about Vista are the result of design errors, many prompted by business decisions which may even turn out to be poor choices. Reworking XP to impose DRM from the interior outward was probably too difficult, so a ground-up rewrite seemed appropriate to them at the time. Competing interests from consumers were ignored, and the results can be unpleasant for some of us (and others may not realize it until some time later).

      Other factors tie us into these themes. The Word document is an important standard, but it's a closed standard, so nothing but Word is entirely adequate for its various versions. MP3s with DRM tie us to particular devices, and limit our ability to move them to upgraded software or hardware, making the device they were originally 'assigned' to as important as the MP3 collection it plays.

      When all of our important 3rd party applications (like Photoshop, 3DS Max, AutoCAD) are only available in Windows versions, the operating system ties us into a platform, and when XP is dropped our future requirement for Vista is sealed, unless we can appeal to the 3rd party developers to release non-Windows versions that give us choices. We can only hope they'll realize they can be more important to our use of the computer than the operating system ever was, and give us options by creating versions independent of the OS, the way the web itself has.

      Sometimes, though, the size increases are the result of improvements in development, not laziness or carelessness. In browsers, the trend toward exception safe development generates fatter programs, but they're more reliable. Guarding against buffer overflow attacks, for example, required the adoption of containers instead of older C style buffers which were prone to that attack, and everywhere that is done a small penalty in size is the cost. This is actually maturity at work.

      Also, in browsers in particular, the many different standards and technologies that must be supported all conspire to demand more resources than ever before. As a generalized rendering engine, the browser has a lot more to do now than it did back in the late 90's. Combined with multi-page browsing, Flash video, Silverlight, 3rd party toolbars - the bloat is understandable. You can find older style browsers that are smaller, and provide access to perhaps 80% of the web, but not the new and explosive trends.

      However, it may be true that optimization opportunities have been skipped, even wholesale, in browsers and other applications, simply because the 'iron has gotten cheap.' Back when I personally paid $500 for a 32Mbyte RAM chip, I was appalled to discover that my C++ compiler required 64Mbytes to keep from thrashing the VM. I had $1,000 worth of RAM in my machine and it was STILL laboring - because the compiler implemented new language features, important ones actually, and these new features came at a cost.

      Now, when a developer considers the notion of optimizing storage, we consider that a 2Gbyte RAM chip is perhaps $25. The 100Mbyte consumption of a typical browsing session is consuming only about $1.50 worth of resources, but it would cost perhaps thousands of dollars to drop that by 20%, saving only about 20 cents worth of RAM.
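
      In round numbers (Python, illustrative, using the prices assumed above):

      >>> 100 * 25 / 2048   # dollars of RAM in a 100 MB session, at $25 per 2 GB
      1.220703125
      >>> 20 * 25 / 2048    # dollars of RAM saved by a 20% optimization
      0.244140625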

      This isn't lazy, it's practical. It may even be a good idea, especially if the optimization introduced some reduced reliability in the application (which it often can).

      I still agree, though. I don't need a 300Mbyte Excel package for my own spreadsheet needs, and the old Word 6.0 would still be fine for my purposes if it could read the 'current' version of a Word document file.

      I'm among the last to say "just go along" - I ran Windows 2000 until 2007, but I've moved to XP 64 (and I'd be in Linux 64 if my applications would run there) - and I moved to 64-bit computing because of RAM requirements for image editing, compilation, and multiple-OS testing - all while browsing, researching, reading, listening and transcoding. My use may exceed that of average consumers, but believe me when I say the idea of a hardware limitation stopping my productive activities (and not so productive ones) is enough to make me visit my favorite hardware vendor to plan an upgrade.
      JasonVene
      • And in summary...

        "I could write this so much better if I didn't have to make it to anyone else's specification."

        Here's the thing, you're a code monkey, they pay you, you write code, you get banana. That simple. Don't like it, go make your own company and see how far you get on 1995 time to market writing everything in C and optimizing it for 15 year old hardware.

        The best written program on earth is useless if it's never released, and your mode of thinking results in just that type of scenario.
        Spiritusindomit
  • Never. You're thinking too small.

    30 years ago, "PCs" with 4K of RAM and cassette tape storage weren't good for much but playing with programming in BASIC and a few simple games.

    20 years ago, we were measuring RAM in hundreds of Kilobytes and disk space in tens of megabytes, and starting to play our games in 320x200x256 colors on $500 14" monitors, when we weren't using them for text mode word processing and spreadsheets.

    Now, we can spend 1/4 as much money and buy machines with 4GB of RAM and a Terabyte of drive space. We run lots of different apps because they have very similar UIs, and spend the majority of our time on the web and consuming multi-media. Our PCs are hardly the same thing they were 20 or 30 years ago.

    20 years from now, there will still be PCs - dirt cheap little devices with flat screens and sometimes keyboards, as ubiquitous as telephones and still used for communication and information access - but over the internet. They won't have 16TB of RAM because they won't need it.

    But we'll also have something else - networked, totally immersive virtual reality generators that let us interact with people ANYWHERE as if we were standing next to them. THESE machines might need 16TB of RAM, or more likely, a few thousand TB of Memristor memory or whatever other technology comes along and makes RAM and spinning media seem quaint. Will we still call these "PCs"? Maybe, but they won't resemble the hardware and software we have today, and we won't use them for the same things.
    Steve Summers
  • An observation...

    Why are we wasting our time conjecturing on the ways that end users will cope? We probably need to face the fact that however powerful we make machines, the environment will find a way to consume the resources. Unless, as clindhartsen says, we force developers to contain themselves and write efficient code, it won't really matter.

    We Americans just like things unnecessarily big and powerful. I'll guarantee that there will be legit uses for that kind of computing power, but it isn't likely that everyone will need it. I for one run Linux on mostly single core P4 machines with excellent results. Adequate video memory goes a long way to aid in rendering the 3D GUI, but other than that... so what?

    The big powerful desktops may give way completely to the smartphone and what we consider UMPCs for normal use by household users. Those devices already have enough power to do most of the things that The Alken Dude was talking about...except for the screen real estate. However, making that sacrifice is one that allows for extreme mobility. A Treo with Docs To Go is a pretty powerful little machine...considering that this conversation started out talking about 4-16k Apples. Even with the service costs those devices are cheaper than computers 25-30 years ago. What do we really need anyway? I'm actually fairly happy with the state of computing and am more curious than excited to see where we go from here.
    nwoodson1
    • ...to see where we go from here.

      There are two major factors pushing computer hardware capability for the average household user. One, of course, is VIDEO GAMES. More realistic, faster, smoother, sharper games require space and specs. Compression helps, but in the end we're trying to make games into interactive movies, and that requires space and spec. Space to fit all those audio files, maps, textures, etc... spec to run it smoothly.

      If someone out there can take Crysis, strip it down and reengineer the engine to run on a Pentium 2 with 8MB of video RAM, using Linux and a cheap 16-bit sound card... and make it look almost as good as it does now... GO FOR IT. But that won't happen. It's not UT99, it's a big needy engine. Not as bad as Fallout 3 or other similar games but... it's needy.

      TWO, the second thing that pushes computers in the direction they're headed is a mixture of two things: file types and file sharing. We want 10-megapixel pictures of our vacation, sound clips, music in studio quality formats... and we want ways to share them with our phone, our friends and the rest of the universe, via computer. Suddenly, massive servers and datacenters spring up to bring the world whatever it is that they want to download in order to have the things they want. People become digital packrats. Sure, a text document still takes up hardly a few bytes... a big one takes up a few KB. But downloading a movie from Amazon so you don't have to get up and rent it... that's motion video with synchronized sound... it takes up a lot. A home theater PC becomes a vault for movies.

      One day, people will want a video game that needs a video card that will render in real time what a modern rendering farm takes a day to do.

      One day, people are going to want to unload their 16GB SDHC camera card because it only held thirty minutes of high definition footage... or worse yet, they'll need a 64GB camera card in order to fit half an hour of Grand Canyon footage, recorded at a resolution of 1920x1080 at 60 frames per second... but why settle for that? SED TV technology will supposedly put out 4 TIMES the resolution of 1080 progressive, so... why not make a camcorder that will take advantage of it?

      In the end, space and spec are pushed because the modern user ENJOYS doing bigger and better things.

      One day, the average camcorder resolution will be so awesome that a glance in a rearview mirror of a moving vehicle 150 yards away can be paused to show the reflection of a man's face, reflected from 50 yards out... then you'll be able to blow up the image and see the reflection of the face without distortion... not quite the ability to see the head of his zit, but close enough to tell he's got a zit without it being grainy.

      And it's not just Americans who want bigger and better technology... it's everyone. The demand is high enough that China and Japan are churning it out, as well as Americans. Koreans have 100 Mbps fiber optic Internet2; we've got... what's FiOS at now? 30 Mbps? 50 in some areas? Maybe?

      And yet many Americans still use dialup because it's "good enough". :)

      KitWriter
  • Bah!! 640K should be enough for anyone!!

    ;) Well, only half winking. We really should not be demanding the RAM that we are... apps, and the OS especially, are far, far too heavy for what they do. Trim the fat already!!!
    Techboy_z
    • Agreed, Nobody will ever need more than 640K

      Thank you Bill Gates.
      jorjitop
      • Bill Gates?

        Surely the IBM PC architecture was the problem..? The original 8088 CPU could only address 1MB and the top portion was reserved for hardware use, leaving 640K. I don't think Bill Gates is to blame for that.
        barrylb
        • That's not important,

          what's important is that we slander Bill Gates in every way possible to make people hate him, what are you, a M$ shill?
          jamesrayg