Intel's AI chief sees opportunity for 'massive' share gains

Naveen Rao, co-founder of AI chip maker Nervana, which Intel bought in 2016, explains how the company will storm the market in 2019 with chips and software.
Written by Tiernan Ray, Senior Contributing Writer

"We want to gain massive market share," is how Naveen Rao, head of artificial intelligence efforts at chip giant Intel, describes the company's ambitions.

"We're going to come to market full-force next year." 
Rao made the comments in an interview with ZDNet ahead of Intel's AI DevCon in Beijing, China, which kicks off Wednesday. It is the third event Intel has hosted this year, following a San Francisco gathering in May and one in Bangalore, India, in August.

Intel began the series this year to reach more of the data scientists who do the fundamental work of designing and training the neural networks.

Also: Google says 'exponential' growth of AI is changing nature of compute

The marketplace for AI silicon today is split between Intel and Nvidia, but things won't stay static. Intel's goal is to take share from Nvidia in the "training" phase of AI while retaining the dominant share it holds in "inference."

Training is when the neural network is first developed. Nvidia's GPUs are the main choice for that work, but Rao thinks Intel can bring competition with "Spring Crest," the code name for the Nervana Neural Network Processor. That chip emerged from the work of Nervana Systems, the company Rao co-founded and ran, which Intel purchased in 2016.


Intel's AI chief, Naveen Rao.

"Somewhere around 80 percent or so is theirs [Nvidia], and we have some fifteen-odd percent of the training market, along with other things out there, and that's really where we're on the attack," said Rao. "So, we want to gain massive market share; we're the minority player there and we have a huge opportunity."

In inference, when the network is put into service to answer questions, the situation is the reverse: Intel's chips in the data center dominate with 80 percent market share or better.
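The training/inference split Rao describes can be sketched in a few lines of numpy. This is a toy linear model, not Intel's or Nvidia's stack; the data and model are purely illustrative. Training iteratively adjusts weights to fit data (the compute-heavy phase done on GPUs or accelerators), while inference runs the frozen weights to answer a query (the phase where CPUs have dominated in the data center).

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))      # 100 samples, 3 features (toy data)
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w                     # targets generated by a known linear rule

# Training phase: repeatedly adjust weights to reduce error.
w = np.zeros(3)
for _ in range(500):
    grad = 2 * X.T @ (X @ w - y) / len(X)  # gradient of mean squared error
    w -= 0.1 * grad                        # gradient-descent step

# Inference phase: weights are now fixed; serving a query is one forward pass.
query = np.array([1.0, 1.0, 1.0])
prediction = query @ w
```

The asymmetry is the point: training loops over the whole dataset many times, while inference is a single cheap pass, which is why the two phases favor different silicon.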

Also: AI Startup Cornami reveals details of neural net chip

The market for training chips is growing 20 percent to 30 percent per year, said Rao. That is not as fast as the 60 percent to 70 percent annual growth he cites for inference, but still a "massive" rate of growth.

Rao is mindful that a raft of startups are targeting inference, including several written about by ZDNet lately, such as Cornami, Flex Logix, and Efinix. The inference market for chips is a moving target. "So we need to make sure we keep pace with that, and keep that 80 to 90 percent, with all the new technology," he says. (All market-share figures are drawn from data compiled by research firms Gartner and IDC, says Intel.)

"That's not a given with a CPU, right? Today it is. It is on a CPU. But in five years there's going to be more solutions and we have to make sure that at least 80 percent of that market that grew at 60 percent, year on year, is still on Intel."

Intel's message in both training and inference is that the variety of chips the company brings to market affords developers of neural network algorithms different tools to get things done. There is the CPU: at the show, Intel will discuss its "Cascade Lake" Xeon processor update. Intel also has chips from its acquisition of Movidius, for "vision processing," and the programmable chips from its acquisition of Altera.

"What's happening is, Moore's Law is chugging along, and we're shoving more transistors on a die, but the demand for compute is actually outpacing Moore's Law by a huge amount," explains Rao. "It's a Super Moore's Law."

Also: AI startup Flex Logix touts vastly higher performance than Nvidia

"I think the only way to really keep pace with that, it's not one chip, your per-chip computations matter, but it's actually how you go across chips that really matters." Rao notes that the diversity of the various startups is "really cool," but the practical matter of shipping silicon matters. "It's one thing to build 10 of these [chips], it's another thing to do 100,000 of them and have someone actually deploy them."

"I think startups are going to really struggle there."

The focus of DevCon is to bring more people into the Intel fold who are not chip programmers per se: the data scientists whose focus is the various frameworks, such as TensorFlow, PyTorch, fastai, and Caffe, used to develop neural networks. The meeting will showcase how data scientists can work with two software tools across Intel's portfolio of chips: OpenVINO, which optimizes convolutional neural networks for vision-based applications; and nGraph, a C++ library that abstracts the underlying hardware details. nGraph came out of work done at Nervana before the company was acquired.

Also: Chip startup Efinix hopes to bootstrap AI efforts in IoT

Both software kits allow data scientists to "spit out an optimized neural network," says Rao, because "all these chips have different constraints and capabilities."
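The idea behind tools like nGraph and OpenVINO can be illustrated with a small sketch: a single portable graph description is "lowered" differently per target, because each chip has different constraints and capabilities. Everything here is hypothetical for illustration; the backend names, rules, and fusion pass are not Intel APIs.

```python
# One portable graph: (op name, output width) pairs. Purely illustrative.
GRAPH = [("matmul", 64), ("relu", None), ("matmul", 10)]

# Per-target constraints/capabilities (hypothetical, not real Intel backends).
BACKEND_RULES = {
    "cpu": {"precision": "fp32", "fuse_relu": False},
    "vpu": {"precision": "fp16", "fuse_relu": True},  # e.g. a Movidius-class part
}

def lower(graph, target):
    """Rewrite the portable graph into a target-specific execution plan."""
    rules = BACKEND_RULES[target]
    ops = []
    for name, width in graph:
        # Operator fusion: fold a relu into the preceding matmul if supported.
        if rules["fuse_relu"] and name == "relu" and ops and ops[-1][0] == "matmul":
            ops[-1] = ("matmul_relu", ops[-1][1])
            continue
        ops.append((name, width))
    return {"target": target, "precision": rules["precision"], "ops": ops}

cpu_plan = lower(GRAPH, "cpu")
vpu_plan = lower(GRAPH, "vpu")
```

The same source graph yields a three-op fp32 plan on one target and a fused, two-op fp16 plan on the other, which is the sense in which such tools "spit out an optimized neural network" per chip.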

Of course, Nvidia's "CUDA" software is widely perceived as a tremendous asset for Nvidia in neural network development. But Rao argues the dominance of CUDA is not what it would appear to be.

CUDA "is actually not something that anyone in the AI community really codes to," observes Rao. "CUDA was built as a way to try to make GPUs more general, and that strategy worked great for them [Nvidia], but it is largely not relevant in terms of a developer mindset right now in AI."

"I would argue in the AI space, there is not actually a mature [software] stack."

Previous and related coverage:

Early AI adopters report big returns

New study shows artificial intelligence technology is paying off, but organizations face challenges.

Oracle introduces new enterprise digital assistant

Going beyond typical chatbots built for a single purpose, the Oracle Digital Assistant can be trained to support domain skills from multiple applications.

AI delivering returns for enterprise early adopters, but not industries created equal

Deloitte's annual AI survey reveals a bit of realism, cybersecurity worries and a 17 percent median return on investment.

Machine learning now the top skill sought by developers

SlashData's latest survey of 20,000 developers identifies machine learning and data science as the skills to know for 2019.

What is deep learning? Everything you need to know

The lowdown on deep learning: from how it relates to the wider field of machine learning through to how to get started with it.

What is digital transformation? Everything you need to know

Digital transformation: what it is, why it matters, and what the big trends are.

AI and Internet of Things will drive digital transformation through 2020

Research study reveals IoT, AI and synchronous ledger tech (blockchain) priorities.
