Currently, a Memory1 DIMM is 128GB, but expect 256GB versions later this year. Even with the current kit, a 1U server can support up to 2TB of physical memory - perfect for Big Data applications.
Big Data apps hammer storage subsystems because their bandwidth demands swamp high-end I/O capacities. Putting mass storage on the even faster memory bus solves that problem.
You could use 128GB DRAM DIMMs, but they are both costly and power/cooling hungry. Memory1 uses NAND flash, which is much cheaper than DRAM, in conjunction with their Linux DMX driver, to provide massive capacity at an affordable price, while using far less power.
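To make the idea concrete, here is a minimal sketch of the general technique of addressing flash-backed capacity as if it were memory, using Linux's standard `mmap` facility. This is only an analogy: Diablo's DMX driver works at a much lower level and is transparent to applications, whereas this sketch maps a hypothetical file on an SSD-backed filesystem into a process's address space.

```python
import mmap
import os
import tempfile

# Hypothetical flash-backed file standing in for NAND capacity.
path = os.path.join(tempfile.gettempdir(), "flashbacked.bin")
size = 1 << 20  # 1 MiB demo region; Memory1 scales this idea to terabytes

# Allocate backing store on the (flash) filesystem.
with open(path, "wb") as f:
    f.truncate(size)

# Map it into the address space: the region is now read and written
# like ordinary memory, with no explicit read()/write() I/O calls.
with open(path, "r+b") as f:
    mem = mmap.mmap(f.fileno(), size)
    mem[0:5] = b"hello"      # plain byte assignment into the mapping
    data = mem[0:5]          # read back through the same mapping
    mem.close()

os.remove(path)
```

The point of the sketch is the programming model: once capacity is mapped into the address space, applications see it as memory, and a placement layer (like DMX) can decide which data lives in fast DRAM and which in cheaper flash.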
Tests run on the Apache Spark analytics platform show what massive memory capacity can do for cloud economics. A financial customer ran a Spark SQL test and found they got 24 percent faster response times with one quarter the number of servers. Obviously, that's a major financial benefit.
In another customer benchmark, Memory1's large capacity increased MySQL performance 13x and cut response-time latency 10x, while reducing capital expense by 86 percent. That last number will put a smile on even the most dour CFO's face.
The Storage Bits take
After decades in the industry, I'm a confirmed skeptic of all shiny new technologies. But I see a straight-line benefit from high-capacity flash-based memory to your bottom line: more performance, fewer servers, and lower costs add up to more bang for the buck.
This is NOT a PC technology today, but Diablo stresses that their DMX driver, which handles data placement to maximize performance, is technology independent. If and when 3D XPoint, or any other DRAM substitute such as Nantero's carbon nanotube memory, reaches the market, DMX can optimize it for server use.
If Big Data is giving you headaches, you owe it to yourself to see what Diablo's Memory1 can do for your servers. In a world where CPU performance is flat, putting massive capacity on the memory bus is the Next Big Thing for performance and cost improvement.
Courteous comments welcome, of course.