Why doesn't Intel put DRAM on the CPU?

Summary: They keep cramming more transistors on chips every year while performance gains have slowed to a crawl. Why not put DRAM on-chip to speed up CPUs?

The System-On-a-Chip (SOC) market has been around for years. Isn't it obvious that on-chip DRAM would reduce access times and increase performance? I asked my friend and fellow analyst Jim Handy of Objective Analysis, who's been following semiconductors for decades, why chip vendors don't. 

I summarize his reasoning here. The answer is simple but the reasons are illuminating.

Process

DRAM is commonly built on a process that is customized for its special requirements. For example, DRAM needs a good capacitor, which must be characterized for leakage so designers know how often the DRAM needs refreshing.
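
For a rough sense of scale, here's a back-of-the-envelope sketch of the refresh arithmetic. The 64 ms retention window and 8,192 rows per bank are typical assumptions for DDR-class parts, not figures from Jim:

```python
# Illustrative sketch (assumed numbers): how often a DRAM row must be refreshed.
# A JEDEC-style 64 ms retention window and 8192 rows per bank are typical for
# DDR-class parts, but vary by device.

RETENTION_MS = 64.0      # assumed worst-case time before a cell's capacitor leaks too far
ROWS_PER_BANK = 8192     # assumed number of rows that must each be refreshed in that window

interval_us = (RETENTION_MS * 1000.0) / ROWS_PER_BANK
print(f"Average refresh command interval: {interval_us:.2f} microseconds")  # ~7.81
```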

That characterization is a lengthy and expensive process. It's easier to put Static RAM (SRAM) on the CPU instead of DRAM because SRAM doesn't need finicky capacitors.

Logic processes - those used for CPUs - are also more expensive. A logic wafer might cost $3,500 vs $1,600 for a DRAM wafer. Intel's logic wafers may cost as much as $5,000. That's costly real estate.

Size

Another cost difference is that the cell (bit) size will be larger if you don't use a process customized for DRAM. So putting DRAM on a logic chip is a double-whammy: larger cell sizes on more expensive wafers. Not a winning combination.
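
To put rough numbers on that double-whammy, here's a back-of-the-envelope sketch. The wafer prices come from above; the 2x cell-size penalty and the bits-per-wafer baseline are illustrative assumptions, not measured values:

```python
# Rough cost-per-bit comparison: DRAM on its own process vs. DRAM on a logic process.
# Wafer costs are from the article; the 2x cell-size penalty and the bits-per-wafer
# baseline are illustrative assumptions.

DRAM_WAFER_COST = 1600.0        # $ per wafer on a DRAM process (from the article)
LOGIC_WAFER_COST = 3500.0       # $ per wafer on a generic logic process (from the article)
CELL_AREA_PENALTY = 2.0         # assumed: a DRAM cell on a logic process is ~2x larger

BITS_PER_DRAM_WAFER = 4e12      # assumed capacity of a wafer built on a DRAM process

dram_cost_per_gbit = DRAM_WAFER_COST / (BITS_PER_DRAM_WAFER / 1e9)
logic_cost_per_gbit = LOGIC_WAFER_COST / ((BITS_PER_DRAM_WAFER / CELL_AREA_PENALTY) / 1e9)

print(f"DRAM process:  ${dram_cost_per_gbit:.2f} per Gbit")
print(f"Logic process: ${logic_cost_per_gbit:.2f} per Gbit "
      f"({logic_cost_per_gbit / dram_cost_per_gbit:.1f}x more expensive)")
```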

SRAM actually requires more transistors per bit than DRAM: 4-6 transistors per cell versus one transistor and a capacitor for DRAM. But since you're already putting a couple of billion transistors on the chip, that isn't much of a problem. And SRAM has another huge advantage.
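
For a sense of scale, a quick sketch with assumed, round numbers (an 8MB cache, a classic 6-transistor cell, and a 2-billion-transistor die):

```python
# Illustrative sketch: transistor budget for an on-die SRAM cache.
# The 8 MB cache size and 2-billion-transistor die are assumed, round numbers.

CACHE_BYTES = 8 * 1024 * 1024        # assumed 8 MB last-level cache
TRANSISTORS_PER_SRAM_BIT = 6         # classic 6T SRAM cell
DIE_TRANSISTORS = 2_000_000_000      # assumed total transistor budget for the CPU die

cache_transistors = CACHE_BYTES * 8 * TRANSISTORS_PER_SRAM_BIT
print(f"Cache transistors: {cache_transistors / 1e6:.0f} million "
      f"({100 * cache_transistors / DIE_TRANSISTORS:.0f}% of a "
      f"{DIE_TRANSISTORS / 1e9:.0f}B-transistor die)")
```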

Speed

While SRAM is easier to build on a logic wafer, its other huge advantage is speed: access times are roughly 10 to 20 times faster than DRAM's.
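
To see why that matters, here's a hedged average-memory-access-time sketch. The latencies and hit rate are assumed, round numbers, not figures from the article:

```python
# Illustrative sketch: average memory access time (AMAT) with an on-die SRAM cache
# in front of off-chip DRAM. Latencies and hit rate are assumed, round numbers.

SRAM_CACHE_NS = 4.0    # assumed on-die SRAM cache access time
DRAM_NS = 60.0         # assumed off-chip DRAM access time (~15x slower)
HIT_RATE = 0.95        # assumed cache hit rate

# Standard AMAT formula: hit time plus miss rate times miss penalty.
amat = SRAM_CACHE_NS + (1 - HIT_RATE) * DRAM_NS
print(f"AMAT: {amat:.1f} ns vs {DRAM_NS:.0f} ns going straight to DRAM")
```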

Jim summarized the answer very simply:

There's a reason people don't put main memory onto their chips, and that's because it's always significantly cheaper to use separate memory chips.

The Storage Bits take

Cost pressures are ferocious throughout the storage hierarchy - which is why we have a storage hierarchy. If the fastest and most reliable storage was also the cheapest, the "hierarchy" would be only one layer deep.

Flash memory had no impact on computer storage for decades until it got cheaper than DRAM. Tape - which once dominated computer storage - hangs on because it is cheap.

As long as the speed and cost correlation continues, we'll have a storage hierarchy. And no DRAM on CPUs.

Comments welcome, as always.

Talkback

8 comments
  • Complicates expansion.

    Another reason you don't see it on the CPU is because it would complicate memory expansion. Intel CPUs are used in systems ranging from 4GB to 256GB of RAM space. Accessing all of the RAM the same way allows for a simpler pipeline than having RAM split between on chip and off chip locations. The latency and timing wouldn't be the same, so you would likely need two different sets of management circuitry. RAM access spread across multiple buses would introduce one more layer of complexity. The more complicated the circuitry, the more fragile it becomes. The potential problems and additional cost just don't justify the risk in desktop CPUs which are already fast enough. Now, if we're talking about super computing, that's another story. Speed trumps everything in that arena.
    BillDem
    • Going further

      It's about cost. DRAM and CPU logic have different power requirements, which would be an engineering and logistical nightmare. How do you handle hibernation when the CPU core is powered down but the DRAM still needs power and refreshing? And what if you found the on-die DRAM works at 1 GHz but not 1.2 GHz? It's easy to change the BOM for a different DRAM part, but you'd have to scrap the die and redesign and revalidate it if the DRAM is on the same die as the CPU.
      Richardbz
  • Cost is also why it works for SoC.

    The systems on a chip are slower, thus able to make do with slower memory.

    Even so, the memory is limited in size due to space considerations.
    jessepollard
  • Heat

    The package might run just a wee bit warm.
    Fyrewerx
  • Its not just cost

    "There's a reason people don't put main memory onto their chips, and that's because it's always significantly cheaper to use separate memory chips."

    Intel make powerful processors; even if it is only to feed the insatiable appetite of Microsoft. The market is addicted to Moore's Law.

    On the other hand, how many people look forward to having to learn a new office interface or a new OS desktop?

    If the market said, 'Stop, what we have is all we need', then Intel could use next year's technology to put last year's PC on a chip. An 8088, with 640K of RAM, and all the goodies, on a single chip seems trivial by modern standards. A 486 with 4M of RAM (so cutting edge in its day) would not be a problem.

    What we would probably have, if you wanted it now, would be a quad core ARM, with 2G of RAM, and a GPU, on a chip. It probably would not be made by Intel, probably would not be running Microsoft software, and would probably be in a tablet or phone rather than a desktop.

    The question then becomes what happens to the desktop? Once people become familiar with doing serious stuff on a tablet, it is only a matter of time before they want the same experience on the desktop. Not a problem.

    There would still be a market for machines with ever more powerful processors and larger memories but the ordinary office PC may be reaching the end of the line.

    On a practical level, CPU manufacture and DRAM manufacture have been going separate ways for a long time. FLASH and CPU logic seem to go together on single chip microcontrollers, but these seem to always have a rather limited amount of static RAM on board - not DRAM.

    A simple fudge is to stick the DRAM on top of the CPU chip - like the Broadcom device in the Raspberry Pi.

    What is really needed is a memory technology that is non-volatile, does not wear out, is as fast as static cache RAM, only takes a single transistor per bit, can be made in large sizes, is compatible with CPU technology, and is cheap.

    Of course the whole article is a troll because Intel is already putting DRAM on the CPU chip - but not perhaps for the reason, or in the way, that the article proposes.
    Alan(UK)
    • @Alan(UK)

      OMG. You have to be kidding.

      Your post started off sounding interesting, then it quickly drifted off into the never-never land of "me and my people wish...".

      " would be a quad core ARM, with 2G of RAM, and a GPU, on a chip. It probably would not be made by Intel, probably would not be running Microsoft software, and would probably be in tablet or phone rather than a desktop"

      The problem is that nobody is really wishing for such a system except for people who don't like Windows. And as we know, although they all talk to make it sound different, the people who don't want Windows are in the minority. The vast minority.

      Sure... the ABM'ers sit around and lament about how, if you tell people how god-awful and horrible Windows is for them and how it's the bane of their lives, and ask those people if they would like a better OS, the people would surely say "YES!"

      But then you sit them down in front of a non-Windows OS and say, here it is, and I promise you that they will say "no... that's not it, you said better, not different".

      Unfortunately for the Windows nay-sayers, they are the few who hate Windows, and the rest of the world doesn't mind it at all.

      Quad core ARM with 2 gig of ram and a GPU on a chip? Really? Sure, easily done from what I understand, but that's high end phone stuff. People who want or need at least one computer in their life that performs like a real computer without hesitation and can really work at tough intricate tasks with some ease when called upon do not want that.

      Honestly.

      And hardware of any brand these days is cheap for some pretty decent stuff, so all this system-on-a-chip stuff right now is largely either for mobile, due to the need for miniaturization, or simply not needed and is for show, saying, look! "We can do it!!"

      Ya, a pretty determined clown can fit 50 of his buddies into a Mini Cooper as well, but it's not the way to travel if you don't have to for good reason. Neither is "a quad core ARM, with 2G of RAM, and a GPU, on a chip" the way to go for computing unless you really have to for good reason.

      And you don't.
      Cayble
  • Seriously??

    "The question then becomes what happens to the desktop? Once people become familiar with doing serious stuff on a tablet, it is only a matter of time before they want the same experience on the desktop. Not a problem."

    Another piece of fanciful post-PC gibberish. As much as the romance of wandering around doing your day's work on a tablet might intrigue some people, the reality is that desktops/laptops will be around for generations doing the "serious stuff":

    - Tablet screens are and always will be too small for long-duration use. The human eye won't be evolving any time soon to suit post-PC timetables.
    - Large (and sometimes multiple) screens are used a lot more than you might think in actual, productive office environments.
    - Even if the tablet is placed in a docking station and uses an external screen and keyboard (a potential scenario of the future) then it's still basically a desktop which can be turned into a tablet for mobility. This is the one area where I would consider tablets to have some real desktop replacement potential in certain scenarios.
    - Your docked tablet is also still basically limited to tablet CPU power, which in many cases is just not enough. Ever tried running real, production-grade 3D CAD, video editing or even large batch file conversions in Adobe Lightroom on a tablet? People buy desktops for these applications for good reasons - speed and storage flexibility.
    - ARM might be leading the race on popularity right now, but Intel is a lot like Microsoft: never count them out of a fight. Atom CPUs generally outpace ARM, and once Intel (almost inevitably) squeezes that extra bit of battery life from their designs, the battle will really be on. ARM still has a big lead in phones, but on tablets the story will change much faster.
    joneda
    • Ya, it makes you bone weary some days...

      Listening to the "anything but Microsoft" crowd tell us how it could be inside the "World that only they want".

      Well, I have some information for the ABM crowd. As the years to come arrive, they are going to find fewer and fewer of those arguments against Microsoft becoming available to them.

      The fact is, and please, I dare someone to deny it, Microsoft is gradually pushing the world toward a "complete cloud solution". Once it arrives, and I'm quite sure it won't be soon, but they are not waiting for anyone else to do it first, our hardware will likely be of the mighty-mouse variety with a new pared-down version of Windows that will enable us to get all of our apps, content and entertainment from the cloud, and all the data we generate will be stored to the cloud, all for a monthly or yearly fee.

      It may well be we are not even leasing the OS anymore but the entire hardware package, and of course the OEMs love that thought too, as it works great for the car industry; situations will be set up that see you with new hardware more regularly than the XP effect allows for today.

      I suspect that Microsoft and their operating systems will be thriving in one version or another for a very very long time.
      Cayble