Intel aims to be inside your artificial intelligence stack

Intel's master plan is to do for AI what it has done for server processors. The goal is to create a general purpose AI stack so enterprises can broadly adopt machine learning.
Written by Larry Dignan, Contributor


Intel, arguably the biggest ingredient brand ever, wants to be known as the processing brains behind artificial intelligence as well as enabling technologies such as the cloud and Internet of Things.

Think Intel Inside for artificial intelligence.

Intel is transitioning from being a company one step removed from the CXOs driving digital transformation to a vendor directly involved with prototyping the future alongside customers. At its Shift event this week in New York, Intel executives hosted customers, data scientists, and partners to outline what's next.

"The world is Intel's customer," said Intel CTO Amir Khosrowshahi, who joined the company via the acquisition of Nervana, which offers a platform to scale AI deployments. Intel acquired Nervana in 2016, integrated it into its roadmap months later, and now, said it will launch its Nervana Neural Network Processor along with a new class of products.

Read also: Why Intel built a neuromorphic chip | AI training needs a new chip architecture: Intel

Intel's third quarter earnings report isn't likely to show immediate returns on the company's pivot. Data centers and Xeon are expected to carry the day for Intel, which is expected to deliver third quarter non-GAAP earnings of 80 cents a share on revenue of $15.73 billion.


Intel's neural network processor (NNP) is designed for broad commercial enterprise use of AI. (Image: Intel)

Stifel analyst Kevin Cassidy said in a research note:

We believe Intel's role in the emerging AI market is understated. Not only are the company's processors still required for a GPU based system, but we believe the AI field is still in its infancy, and multiple solutions will be tested before a clear winner, or winners, emerges. Intel provides multiple AI solutions including its Xeon Phi coprocessors, FPGAs, and solutions developed through acquisitions including Mobileye, Movidius and Nervana.

The goal for Khosrowshahi is simple yet difficult: Create a multi-purpose AI processor. While Google and Apple have developed their own AI processors, Intel is looking to bring AI capabilities to most enterprises.

And, for now, that goal means Intel co-develops with a bevy of players. The Nervana Neural Network Processor (NNP), formerly Lake Crest, was developed along with Facebook. Nervana NNP will be used in health care, social media, automotive, and weather, and it'll incorporate Intel's own intellectual property as well as open source approaches.

Tech Pro Research: How artificial intelligence is taking call centers to the next level

Khosrowshahi explained Intel is in a unique position to take input and collaborate with a wide range of customers. "We operate in a demilitarized zone where we can get feedback and build frameworks," he said.

The Facebook collaboration on Nervana NNP is one example. Intel also works closely with Google, Baidu, and Amazon, and Khosrowshahi noted that he's increasingly talking to C-level executives. The first question is straightforward: What are your problems?

That question is increasingly driving Intel's roadmap, said Lisa Spelman, general manager of Intel's Xeon product line within the data center group. "Every application will have AI, and on the data center side, we have to unlock more performance for it," said Spelman. "The next wave of AI is about democratization."

Khosrowshahi is in a unique position since he has already fallen into a few rabbit holes while creating the Nervana AI platform. To deploy AI properly, Khosrowshahi noted, multiple disciplines have to work together inside an enterprise. "AI has to start at the top and you need depth and breadth," he said. "There needs to be machine learning, AI as well as business knowledge."

Khosrowshahi said Xeon will increasingly have AI tools built in. Advances in everything from interconnects to storage will also contribute to the general purpose AI cause. Coupled with the acquisitions of Mobileye and Altera, Intel has combined multiple technologies for what'll become its AI stack.

On the storage front, Intel has developed Optane, a memory and storage technology designed to reduce latency. After all, AI workloads have to touch compute as well as storage in real time.

Read also: Can Optane SSD DC P4800X allow Intel to keep its lead in the server storage market?

How this AI stack is delivered to the end customer remains to be seen. For many companies, Intel's AI tools will be delivered via cloud service providers. For other firms, such as those in financial services and oil and gas, Intel's AI stack will be used on-premises. Most companies will mix and match AI tools as well as vendors. In other words, Nvidia and Intel will often ride shotgun.
