
Intel says it sold more than $1B in AI chips in 2017

The chipmaker also shared more information about its product roadmap, including next-generation Xeon processors.
Written by Stephanie Condon, Senior Writer

Intel collected more than $1 billion in revenue in 2017 from Xeon processors used for artificial intelligence in the data center, the company said Wednesday.

Navin Shenoy, EVP and GM of the Data Center Group at Intel, announced the milestone during Intel's Data-Centric Innovation Summit, where he also shared more details about the company's product roadmap. All told, Shenoy said, Intel's data-centric businesses represent a market opportunity that should reach $200 billion in 2022, up from the company's previous TAM (total addressable market) estimate of $160 billion.

The technology industry, Shenoy said, needs to view data holistically, "including how we move data faster, store more of it and process everything from the cloud to the edge."

The remarks were in keeping with the approach to AI that Gadi Singer, VP of Intel's Artificial Intelligence Products Group, laid out to ZDNet. Intel, he said, is building a successful AI business by investing in Xeon as "the bedrock of AI," building a diverse chip portfolio, and investing in systems integration.

As far as systems integration goes, Intel announced Wednesday that it's expanding its connectivity portfolio with a new SmartNIC product line, code-named Cascade Glacier. Intel also announced the first shipments of its Optane DC persistent memory modules, starting with a symbolic hand-delivery to Google. On stage at Wednesday's summit, Shenoy gave the first unit to Bart Sano, Google's vice president of Platforms.

Shenoy also laid out the next-generation roadmap for the Intel Xeon platform, which includes:

  • Cascade Lake, a future Xeon Scalable processor based on 14nm technology that will introduce Optane DC persistent memory and DL Boost with Vector Neural Network Instructions (VNNI).
  • Cooper Lake, a future Xeon Scalable processor that is also based on 14nm technology. It will introduce a new-generation platform with significant performance improvements, new I/O features, new DL Boost capabilities (Bfloat16; a brief sketch of the format follows this list) that improve AI/deep-learning training performance, and additional Optane DC persistent memory innovations.
  • Ice Lake, a future Xeon Scalable processor based on 10nm technology that shares a common platform with Cooper Lake.
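
For context on the Bfloat16 capability mentioned above: bfloat16 is a 16-bit floating-point format that keeps float32's 8-bit exponent range but carries only 7 mantissa bits, trading precision for memory bandwidth and throughput in deep-learning training. The short Python sketch below illustrates the format by simple truncation of a float32 value; the function names are illustrative only, not part of any Intel API, and real hardware typically rounds rather than truncates.

    import struct

    def float32_to_bfloat16_bits(x):
        # Pack as IEEE-754 float32, then keep only the top 16 bits:
        # 1 sign bit, 8 exponent bits, 7 mantissa bits -- the bfloat16 layout.
        bits = struct.unpack(">I", struct.pack(">f", x))[0]
        return bits >> 16

    def bfloat16_bits_to_float32(b):
        # Widen back to float32 by zero-filling the discarded low 16 bits.
        return struct.unpack(">f", struct.pack(">I", b << 16))[0]

    x = 3.1415926
    b = float32_to_bfloat16_bits(x)
    print(x, hex(b), bfloat16_bits_to_float32(b))
    # 3.1415926 0x4049 3.140625 -- same exponent range as float32,
    # roughly three decimal digits of precision

The wide exponent range is why bfloat16 is attractive for training: gradients that would underflow in fp16 still fit, while the reduced precision is usually tolerable.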