Pure Storage and Nvidia introduce AIRI, AI-Ready Infrastructure

AIRI combines Pure Storage's FlashBlade and the Nvidia DGX-1, promising about a 4x improvement in data scientist productivity.


After working alongside one another to help customers leverage AI, Pure Storage and Nvidia are teaming up on a product called AIRI, which stands for AI-Ready Infrastructure.

"We built AIRI for enterprises wondering how to get into AI," Pure Storage's Matt Burr told ZDNet.

While the product was born in the world of hyperscalers and academia, Burr noted that the needs of the enterprise are different. Enterprises, he said, are "focused on, 'How do I buy things that are relatively generic, such that I can apply them across a large topology of infrastructure space and yield a pretty solid result?'"

AIRI combines Pure FlashBlade and Nvidia DGX-1 to produce what's essentially AI in a box. Companies have already been using FlashBlade with Nvidia technology, which made AIRI a natural solution.

"We reached a point where we said if these really smart companies... were looking for simplified infrastructure to onramp to artificial intelligence and analytics, AIRI is probably the answer," Burr said.

Pure Storage and Nvidia say that AIRI offers companies a roughly fourfold improvement in data scientist productivity. This is accomplished by the Nvidia GPU Cloud Deep Learning Stack -- the container-based software stack that simplifies the deployment of frameworks like TensorFlow -- in combination with the AIRI scaling toolkit from Pure Storage, which allows for the simple implementation of multi-node DGX training.

AIRI also condenses what would otherwise be roughly 50 racks of compute and storage into just over half a rack.

"It's basically giving the performance of an entire data science center... to one data scientist team in something less than a full rack in size and scale," Burr said. "No organization can take the money it takes to do this today. To give every one of their data scientists this kind of power, it's a really meaningful jump."