Microsoft opens its 'Brainwave' AI-on-FPGA service to external testers

Microsoft's project to run fast AI tasks on FPGAs in Azure is starting to come to fruition, with testers now getting access to the first pieces.
Written by Mary Jo Foley, Senior Contributing Editor

Microsoft's Project Brainwave -- its effort to use Field Programmable Gate Array (FPGA) technology to provide fast AI processing in Azure -- is now in preview in two forms.

On May 7, Day 1 of its Build 2018 developer conference, Microsoft execs said they are making available a preview of Azure Machine Learning Hardware Accelerated Models powered by Project Brainwave in the cloud. This is the first step in opening up Microsoft's FPGA processing for AI workloads to external customers -- something execs said last year would arrive in 2018.

Microsoft also is making available a limited preview of Project Brainwave "on the edge," officials said today. The edge devices, in this case, are on-premises servers that can act as Azure IoT Edge devices. Dell and Hewlett Packard Enterprise are the first partners participating in this limited preview.

Brainwave will enable Azure users to run complex deep-learning models at souped-up levels of performance. Over time, Microsoft's goal is to make it possible to deploy more AI-influenced services in the cloud, including computer vision, natural-language processing and speech.
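
To ground what "complex deep-learning models" means in practice, here is a minimal sketch that runs a representative computer-vision model, ResNet-50, in plain TensorFlow/Keras. The model choice and the random test image are illustrative assumptions, not details from Microsoft's announcement, and the code runs on an ordinary CPU or GPU; the FPGA-backed serving path goes through Azure Machine Learning's preview tooling, which this article doesn't detail and which is not shown here.

```python
# Illustrative sketch only (not Microsoft's Brainwave/Azure ML SDK): the kind of
# deep-learning inference workload -- ResNet-50 image classification -- that an
# FPGA-backed serving tier is designed to accelerate. Runs on CPU or GPU.
import numpy as np
import tensorflow as tf

# Load a pretrained ResNet-50 with ImageNet weights (downloads weights on first run).
model = tf.keras.applications.ResNet50(weights="imagenet")

# Stand-in input: a single random 224x224 RGB image. In a real service, this
# would be a decoded customer image sent to a scoring endpoint.
image = np.random.rand(1, 224, 224, 3).astype("float32")
image = tf.keras.applications.resnet50.preprocess_input(image * 255.0)

# Run inference and decode the top predictions.
preds = model.predict(image)
print(tf.keras.applications.resnet50.decode_predictions(preds, top=3)[0])
```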

Microsoft has been using Intel FPGAs to improve the performance and efficiency of Bing and Azure for the last few years. FPGAs are chips that can be custom-configured after they're manufactured.

Microsoft researchers have been working on FPGAs for more than a decade. Microsoft has added FPGAs to all of the Azure servers in its own datacenters and has implemented FPGAs in some of the machines that power Bing's indexing servers as part of its Project Catapult efforts. Microsoft's Azure Accelerated Networking service, which is generally available for Windows and in preview for Linux, also makes use of FPGAs under the covers.

Microsoft is not the only company turning to specialized chips for AI in its cloud datacenters: Amazon offers FPGAs to its cloud customers, while Google has built its own custom silicon for AI tasks.

Amazon already offers an FPGA-equipped EC2 F1 instance for programming Xilinx FPGAs and provides a hardware development kit for them. Google has been doing work around training deep-learning models in TensorFlow, its machine-learning software library, and has built its own underlying Tensor Processing Unit silicon.

In other AI-related news, Microsoft announced a $25 million, five-year program called AI for Accessibility. Microsoft is looking for developers, non-governmental organizations, academics and researchers interested in providing solutions for the more than 1 billion people with disabilities worldwide. Via the program, Microsoft will provide grants, technology, and AI expertise "to accelerate the development of accessible and intelligent AI solutions."
