HPE unveils memory-driven computing prototype with 160TB of memory

The prototype is the next step in turning The Machine research project into a product. Here's a look at HPE's memory-driven efforts.
Written by Larry Dignan, Contributor

Hewlett Packard Enterprise outlined a prototype system with 160TB of memory, a step the company hopes will establish a cadence of innovation moving from its research labs into products.

The effort revolves around memory-driven computing, an architecture that's aimed at big data workloads. This prototype was unveiled Tuesday under HPE's The Machine research project. Memory-driven computing puts memory at the center of the architecture instead of the processor. The goal is to eliminate inefficiencies in how memory, storage and processors interact today to solve problems faster.
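
To make the contrast concrete, the sketch below (in C, using only standard POSIX calls) computes directly against a large dataset mapped into memory rather than copying it through buffers with explicit storage I/O. It is only an analogy for the memory-centric programming style, not HPE's actual architecture or toolchain, and the dataset path is a hypothetical placeholder.

    /* Illustrative analogy only: compute against data "where it lives"
     * via standard POSIX mmap. This is NOT HPE's memory-driven API;
     * the dataset path below is hypothetical. */
    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void)
    {
        const char *path = "/mnt/bigpool/dataset.bin";   /* hypothetical large data pool */
        int fd = open(path, O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

        /* Map the whole dataset; no read() loop copying it into local buffers. */
        const uint64_t *data = mmap(NULL, st.st_size, PROT_READ, MAP_SHARED, fd, 0);
        if (data == MAP_FAILED) { perror("mmap"); return 1; }

        size_t n = st.st_size / sizeof(uint64_t);
        uint64_t sum = 0;
        for (size_t i = 0; i < n; i++)
            sum += data[i];                 /* operate on the mapped data in place */

        printf("sum over %zu values: %llu\n", n, (unsigned long long)sum);

        munmap((void *)data, st.st_size);
        close(fd);
        return 0;
    }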

[Image: HPE's The Machine memory-driven computing architecture]


Now this memory-driven prototype won't be a product anytime soon, but it does illustrate how HPE's push to jumpstart its research and development efforts may soon bear fruit. HPE began looking at new architectures to work out what computing needs to become as Moore's Law peters out. The Internet of Things, machine learning and artificial intelligence will require new computing models.

Last year, HPE demonstrated a memory-driven system, and CEO Meg Whitman said The Machine research project is critical to the company's growth prospects.


HPE is hoping its memory-driven computing efforts will boost enterprise applications. For instance, HPE outlined how its memory-driven computing architecture will mesh with SAP's S/4HANA suite. HPE said the architecture will provide scale, modularity and reliability for in-memory computing.

[Image: The 160TB memory-driven computing prototype]


Kirk Bresniker, chief architect at Hewlett Packard Labs and HPE fellow, said the memory-driven system highlights how "our labs can successfully turn on and hand over our rackscale prototype." He added: "From here our applications development team will kick the tires."

The specs include:

  • 160TB of shared memory across 40 physical nodes interconnected via a fabric protocol.
  • A Linux-based operating system running on ThunderX2, Cavium's ARM-based system on a chip.
  • Photonics and optical communication links, including the new X1 photonics module.
  • Software programming tools to utilize persistent memory (see the sketch below).

Systems like HPE's prototype are aimed at analytics and correlating data for engineering, science, economics and Monte Carlo analysis.
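
The persistent-memory tooling noted in the specs can be approximated, for illustration, with standard Linux facilities: map a file that lives on persistent media into the process's address space, update it in place, and flush the change. The sketch below is a minimal example under those assumptions, not HPE's fabric-attached memory stack; the mount point and record layout are hypothetical, and production persistent-memory libraries typically use finer-grained flush primitives than msync.

    /* Minimal sketch: durable updates to a memory-mapped region.
     * Standard POSIX only; NOT HPE's fabric-attached memory API.
     * The path below (a hypothetical pmem/DAX mount) is an assumption. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    struct record {                  /* hypothetical persistent record */
        unsigned long sequence;
        char note[56];
    };

    int main(void)
    {
        const char *path = "/mnt/pmem/journal.dat";      /* hypothetical mount */
        int fd = open(path, O_RDWR | O_CREAT, 0644);
        if (fd < 0) { perror("open"); return 1; }
        if (ftruncate(fd, sizeof(struct record)) < 0) { perror("ftruncate"); return 1; }

        struct record *rec = mmap(NULL, sizeof(struct record),
                                  PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        if (rec == MAP_FAILED) { perror("mmap"); return 1; }

        /* Update the record in place, directly in the mapped memory. */
        rec->sequence += 1;
        snprintf(rec->note, sizeof(rec->note), "updated in place");

        /* Ask the kernel to make the update durable on the backing media. */
        if (msync(rec, sizeof(struct record), MS_SYNC) < 0) { perror("msync"); return 1; }

        printf("persisted sequence %lu\n", rec->sequence);
        munmap(rec, sizeof(struct record));
        close(fd);
        return 0;
    }

The point of the sketch is the programming model: data is modified where it resides rather than serialized through a separate storage layer.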

Bresniker said HPE's goal with memory-driven computing is to make its conventional products better for workloads such as network functions virtualization. Over the longer term, memory-driven computing will also be shared through the Gen-Z Consortium, an alliance of technology companies looking to advance new architectures.
