Crossbar optimizes NVRAM for machine learning

Disk drives in 1957. Flash in 2005. New storage technologies don't come along very often. But resistance RAM (ReRAM), pushed by Intel and others, is finally hitting the mainstream. With applications ranging from embedded systems to fast SSDs, the latest announcement is great news for anyone who wants better storage.

Crossbar, the Silicon Valley startup pushing a ReRAM design of its own, announced yesterday that $8B Microsemi has licensed its technology. Microsemi manufactures a wide range of components for aerospace, defense, automotive, and industrial applications, and believes it can build Crossbar's ReRAM on the latest 7nm processes.

Machine learning at the edge

You won't see ReRAM in a laptop anytime soon; it's too expensive today. But with volume growth, prices will come down. So where will that volume come from?

Niche applications are one source: markets where the need for non-volatility, endurance, performance, and ruggedness outweighs cost. But the exciting area, poised for hypergrowth, is machine learning at the edge.

Machine learning (ML) is a large subset of artificial intelligence, or AI. The key idea of ML is that the machine "learns" to perform a specific task, such as facial recognition, without painstakingly written instructions. Of course, identifying faces is almost trivial today: nearly every digital camera can "see" faces and even react to blinking.

The bigger problem is doing ML at the edge. The edge might be a smartphone or a self-driving car; the key is that the work must be done locally to avoid latency problems.

ML I/O requirements

Crossbar has also been working on ReRAM optimized for ML. The company says it has filed patents, but none of the applications are public yet. What it is willing to say is that very wide memory buses are needed to optimize for ML.

Instead of fetching one face at a time, a very wide memory bus lets the processor bring in dozens of faces at once, dramatically speeding the process. Using its ReRAM, Crossbar has designed NVRAM modules capable of classifying 100,000 faces in one pass through an ML neural network. That's amazing.
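To see why batching matters, here is a minimal sketch of one-at-a-time versus batched classification. This is an illustration of the general idea, not Crossbar's design; the embedding size, class count, and linear classifier are all assumptions.

```python
import numpy as np

# Hypothetical single-layer classifier over face embeddings.
# All dimensions below are illustrative assumptions.
rng = np.random.default_rng(0)

EMBED_DIM = 128      # length of each face embedding vector
NUM_CLASSES = 10     # identities the classifier distinguishes
BATCH = 64           # faces moved per pass

weights = rng.standard_normal((EMBED_DIM, NUM_CLASSES))
faces = rng.standard_normal((BATCH, EMBED_DIM))

# Narrow-bus analogy: fetch and classify one face per memory trip.
one_at_a_time = np.stack([face @ weights for face in faces])

# Wide-bus analogy: move the whole batch through the network in one
# matrix multiply, amortizing memory traffic across all faces.
all_at_once = faces @ weights

assert np.allclose(one_at_a_time, all_at_once)
```

The results are identical; the difference is that the batched version touches memory far fewer times per face, which is exactly what a wide bus (and a large batch like 100,000 faces) exploits.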

Vendors are well aware of the need for wide memory buses. That's why the Wide I/O 2 (Samsung), Hybrid Memory Cube (Intel/Micron), and High Bandwidth Memory (Nvidia/AMD) proposals are battling for broad industry support. The latter two support bandwidth in excess of 100GB/sec, compared to roughly 25GB/sec for DDR4-3200, the latest and greatest mainstream DRAM.
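The bandwidth gap is simple arithmetic: peak bandwidth is transfers per second times bytes moved per transfer. The sketch below uses nominal spec figures (a 64-bit DDR4 channel, an HBM2-class 1024-bit stack), not measured throughput.

```python
# Back-of-the-envelope peak bandwidth: transfers/sec x bytes per transfer.
# Nominal spec values, not real-world measured throughput.

def peak_bandwidth_gbps(transfers_per_sec: float, bus_bytes: int) -> float:
    """Peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return transfers_per_sec * bus_bytes / 1e9

# DDR4-3200: 3.2 GT/s on a 64-bit (8-byte) channel.
ddr4 = peak_bandwidth_gbps(3.2e9, 8)

# HBM2-class stack: 2 GT/s on a 1024-bit (128-byte) interface.
hbm2 = peak_bandwidth_gbps(2.0e9, 128)

print(f"DDR4-3200 channel: {ddr4:.1f} GB/s")  # 25.6 GB/s
print(f"HBM2-class stack:  {hbm2:.1f} GB/s")  # 256.0 GB/s
```

A single HBM2-class stack delivers roughly ten times the bandwidth of a DDR4-3200 channel, which is where the "in excess of 100GB/sec" figure comes from.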

At those speeds, power consumption becomes critical, especially for mobile devices. That is why low-power NVRAM (non-volatile RAM), such as Crossbar's, is arriving just in time.

The Storage Bits take

Crossbar's deal with Microsemi is just the start of the NVRAM races. Expect to see more announcements around competing technologies from Nantero, Everspin, and others in the near future.

Ultimately, it will be NVRAM's lower power consumption that will drive adoption in the mobile world. And since the world is going mobile, the world will be using NVRAM within a decade.

Courteous comments welcome, of course.