
AI hardware pioneer Cerebras expands access in partnership with cloud vendor Cirrascale

By renting capacity a week at a time, new users get a lower-priced entry point to Cerebras's AI machine, a system that can replace clusters of GPUs for deep learning training.
Written by Tiernan Ray, Senior Contributing Writer

The battle for artificial intelligence hardware keeps moving through phases. Three years ago, chip startups such as Habana Labs, Graphcore, and Cerebras Systems grabbed the spotlight with special semiconductors designed expressly for deep learning. 

Those vendors then moved on to selling whole systems, with newcomers such as SambaNova Systems starting out with that premise.

Now, the action is proceeding to a new phase, where vendors are partnering with cloud operators to challenge the entrenched place of Nvidia as the vendor of choice in cloud AI. 

Cerebras on Thursday announced a partnership with cloud operator Cirrascale to allow users to rent capacity on Cerebras's CS-2 AI machine running in Cirrascale cloud data centers. 

The move could expand access to the Cerebras machine, which is priced far beyond the budget of some researchers.

"I think, in all fairness, one of the knocks on us is it's a big bite, in that the machine costs several million dollars," said Cerebras CEO Andrew Feldman in a press briefing during the AI Hardware Summit event taking place this week at the Computer History Museum in Silicon Valley. 

Cerebras has already offered various leasing and financing plans for sales of the CS-2, noted Feldman. 

"We have allowed you to subscribe to the machine in short term or long term," he said. "This is an effort to take that to the next level."

The service is available immediately, the two companies said. Use of a dedicated CS-2 has a one-week minimum and costs $60,000 per week, $180,000 per month, or $1.65 million per year. 


The WSE-2 sits inside the new version of the company's dedicated AI computer, the CS-2. The machine features four fans on the bottom and two pumps for the cooling system on the upper right. Power supplies sit to the left, with Ethernet ports above them.

Cerebras Systems

The monthly rate equates to $246.58 per hour, while the annual rate equates to $188.36 an hour, according to Cirrascale. 
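(Those per-hour figures appear to assume a 730-hour month and an 8,760-hour year: $180,000 ÷ 730 ≈ $246.58, and $1.65 million ÷ 8,760 ≈ $188.36.)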

More details are available on the Cirrascale Web site.

The CS-2, the second version of Cerebras's computer system, introduced earlier this year, is a refrigerator-sized machine with a special cooling system, built around the second generation of the company's Wafer Scale Engine, the world's largest chip. The machine can also be coupled to a special routing engine, the SwarmX, and a special memory store, the MemoryX.

The machine is designed expressly for machine learning training, the most compute-intensive part of AI.

For Cirrascale, the virtue is to bring "the most powerful building block that's available out there" for training deep learning forms of AI, said PJ Go, CEO of Cirrascale Cloud Services, in the same press briefing.

Using a single CS-2 machine, which is equivalent to racks of equipment, avoids much of the complexity of operating clusters of many GPUs from Nvidia. Such clusters typically require data centers with special high-capacity, low-latency links between machines.

"This is a single device much simpler to program than tens of thousands of GPUs," said Go.

"One of our customers, they're currently using GPUs and they can't get the accuracy they need," added Go. 

"They're very, very excited to try the CS-2 because they can find out very, very quickly if they can get the accuracy they need."


Cerebras Systems product manager for AI Natalia Vassilieva holds the company's WSE-2, a single chip taking up almost the entire surface of a twelve-inch semiconductor wafer. The chip was first unveiled in April and is the heart of the new CS-2 machine, the company's second version of its dedicated AI computer.

Cerebras Systems

"One of the little known facts about our industry are how few people can actually build big clusters of GPUs, how rare it is," said Feldman.

The skills are "probably resident in a couple dozen organizations in the world," he offered. "To make that performance available with a cloud-based infrastructure […] makes this an obvious choice for extending our reach."

"This is an effort to allow you to work on the machine in a professional cloud in a world-class facility, with a trained set of people and they're building services and offerings to make it easy for users to jump on and solve problems," said Feldman.

Feldman said Cerebras has not modified any of its firmware or software code to make the CS-2 run in Cirrascale's data center. "The Cirrascale cloud software enabled a smooth and seamless integration," Cerebras told ZDNet in an email follow-up.

The partnership for the moment does not include the SwarmX router nor the MemoryX memory store computer, Cerebras told ZDNet.

Asked whether Cerebras would sell the machines outright to Cirrascale, lease them, or share revenue, the company declined to specify financial terms of the partnership. CEO Feldman did note to ZDNet in written communications that Cerebras as a company is "creative." 
