Trailblaizing end-to-end AI application development for the edge: Blaize releases AI Studio

You might not know it from this news, but Blaize is an AI chip company. Blaize is now boldly going where none of its ilk has gone before, releasing a software development product. And that's not the only reason AI Studio is interesting.
Written by George Anadiotis, Contributor

Blaize is among the many contenders in the AI chip space, sprung out of the renaissance brought to hardware research and development by the proliferation of machine learning workloads. Although its architecture is interesting in its own right, the occasion to connect was something different.

Blaize just launched its AI Studio solution, which it touts as the industry's first open and code-free software platform to span the complete edge AI operational workflow, dramatically reducing deployment time, complexity, and cost.

That's a lot of promises for an IDE, coming from an AI chip company, no less. ZDNet connected with Blaize CEO Dinakar Munagala and VP R&D Dmitry Zakharchenko to explore AI Studio, its potential and philosophy, and where it fits in Blaize's strategy.

AI on the edge, end-to-end

Munagala and his co-founders, who were previously building graphics chips for Intel, started Blaize almost nine years ago, "pretty much from my spare bedroom" as per Munagala. They left because they wanted to build a new processor for emerging workloads -- a move ahead of its time, and ahead of Intel itself.

Blaize raised capital initially from angel investors and then from strategic and financial investors. To date, Blaize has raised about $87 million in equity financing, and has a worldwide team of 300+ people. Its focus is on AI on the edge, and on inference. In other words, not on training models in the data center, but rather on deploying models in operation in the real world.


Blaize produces AI chips for edge applications

Munagala mentioned that Blaize's AI chip architecture, called graph streaming architecture, makes its chips well suited to edge inference. This means they run with low latency, low power, and reduced memory and bandwidth requirements. Another key aspect, Munagala went on to add, is programmability:

Programmability is one of our founding principles. The rate at which algorithms are changing is rapid, compared to the time it takes to develop a new chip architecture. If you go specifically build a fixed function architecture for a certain AI workload, before it comes out, things have changed. To enable our customers, we've chosen to invest in software deeply and that's what we've done.

If you look at the multi-trillion dollar impact AI is projected to have on the global economy, a big part of it is not going to be in the data center, said Munagala. Cars, factories, surveillance, and medical research are just some of the use cases. What they have in common is that, where those solutions are relevant, there are no armies of data scientists to develop and deploy them.

This is what Blaize is trying to address with the AI Studio: the AI development lifecycle, end-to-end, including MLOps. Munagala used the Wintel metaphor to convey this idea. The PC revolution came about due to affordable computing plus ease of use and office productivity tools, he said, and Blaize wants to do the same for AI applications.

Software, Software, Software

This is quite ambitious. So how does AI Studio live up to the ambition? Zakharchenko, who described AI Studio as his brainchild, noted that this is work he and the team behind it have been busy with for the last couple of years.

AI Studio is an IDE of sorts, but it's not aimed exclusively, or even primarily, at software developers and data scientists or engineers. This is the aspect Blaize emphasizes the most. AI Studio introduces features such as an intelligent assistant that draws from a knowledge base to help non-technical people get involved without needing to write code.


It also introduces integration with machine learning model marketplaces, which enable model reuse and data sharing to speed up application development. It comes in different flavors, and can run in the cloud or on premises. And, crucially, it supports open standards such as ONNX, which means it can be used beyond Blaize's own chips.


Blaize believes software is the missing link to boost the development of more AI applications for the edge

Zakharchenko said Blaize looked at tooling in general, and what they found was that the industry has been pursuing different types of tools for accelerating AI. Blaize chose to focus on AI for the edge, aiming to facilitate going from the lab to production.

There is almost nobody playing in this space because it's really been all about cloud and this market is underserved, noted Zakharchenko. To the best of our knowledge, that seems to be the case -- we are not aware of something like AI Studio, aimed specifically at the edge market.

The question then is, why did Blaize choose to address it, and what do they stand to win by doing so? Zakharchenko was adamant: the barrier to adoption of Blaize's chip -- or any AI chip for that matter -- is huge because of software: "You can have the best chip in the world, but not having software is a key deterrent".

Not just for developers

Zakharchenko said they studied existing IDEs extensively, learned and borrowed from them, but they chose to build AI Studio from scratch. A key goal they wanted to serve was to bridge the gap between two different personas involved in AI application development, the data scientist or engineer and the domain expert.

The idea is that domain experts should be able to use AI Studio on its own, while people who want or need to be closer to the code can combine the Studio with an SDK Blaize has previously made available. Source code is embedded in the applications AI Studio generates, and Blaize says this integration is open and accommodates lower-level developers.

Some key aspects of the offering are access to off-the-shelf machine learning models, and edge-specific optimizations applied to them. Access to models comes via integration with existing marketplaces. Zakharchenko mentioned that AI Studio does more than search those marketplaces for free-to-use models; it also ranks them using different criteria, acting like a Google for AI models. Optimizations are done leveraging Blaize's expertise and proprietary technology.


Blaize claims a few industry firsts for its AI Studio

A couple of important questions are whether AI Studio can be used with hardware other than Blaize's chips, and whether Blaize intends to market it in its own right. The answers are yes, and not really, respectively.

AI Studio supports the ONNX standard, so generated models can in theory be deployed on top of any processor. In practice, however, Blaize is focused on using AI Studio to promote its own hardware, for which it is optimized in a number of ways, rather than marketing it to a broader audience. As one might expect, Blaize reports its customers are the first to adopt AI Studio, and are quite happy with it.

AI Studio is a rather unique and interesting product. Not just because it's a software product coming from a hardware vendor, but also because of its general approach. This is a point we've made about Nvidia, too -- its success today is largely owed to the software ecosystem it has built around its hardware over the years.

In a way, Blaize is punching above its weight here, taking on an issue that nobody else in the industry seems to have addressed in this way. AI Studio looks like the result of a well-thought-out approach, and lots of work. What remains to be seen is whether it will pay off.
