Intel on Tuesday officially launched "Ice Lake," its new third-generation Xeon Scalable processor. The 10 nanometer-based CPU, delivering up to 40 cores per processor, is the foundation for Intel's data center platform.
The chip is designed for workloads spanning a range of markets, from the cloud to the network and the edge. Intel says every "top tier" cloud service provider will be offering services based on Ice Lake. It's launching the chip with more than 50 OEMs building more than 250 servers based on the platform.
The third-generation Xeon platform "is the most flexible and performant in our history," Navin Shenoy, EVP and GM of Intel's Data Platforms Group, said in a statement Tuesday.
The launch comes at a pivotal time for Intel. The data center market is increasingly competitive, with AMD eyeing Intel's overwhelming market share, while Intel's own customers build their own chips. Meanwhile, after a number of delays in its product roadmap, CEO Pat Gelsinger -- who assumed the leadership role in January -- has committed Intel "to a steady cadence of leadership products" using both its own and external manufacturing.
Shenoy said in his statement that "Intel is uniquely positioned with the architecture, design, and manufacturing to deliver the breadth of intelligent silicon and solutions our customers demand."
Compared to the prior generation, Ice Lake delivers an average 46 percent performance increase on popular data center workloads, according to Intel. The processors also add Intel SGX (Software Guard Extensions) for built-in security, Intel Crypto Acceleration, and Intel DL Boost for AI acceleration.
The platform supports up to six terabytes of system memory per socket, up to eight channels of DDR4-3200 memory per socket, and up to 64 lanes of PCIe Gen4 per socket.
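To put the memory spec in perspective, the peak theoretical bandwidth of eight DDR4-3200 channels per socket can be computed directly. This is a back-of-the-envelope sketch of the arithmetic, not an Intel-published figure; real sustained bandwidth will be lower:

```python
# Back-of-the-envelope peak memory bandwidth for one Ice Lake socket.
# DDR4-3200 performs 3200 mega-transfers/s over a 64-bit (8-byte) channel.
CHANNELS = 8
TRANSFERS_PER_SEC = 3200e6   # 3200 MT/s
BYTES_PER_TRANSFER = 8       # 64-bit channel width

peak_bytes_per_sec = CHANNELS * TRANSFERS_PER_SEC * BYTES_PER_TRANSFER
print(f"Peak per-socket bandwidth: {peak_bytes_per_sec / 1e9:.1f} GB/s")
# → Peak per-socket bandwidth: 204.8 GB/s
```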
More on the built-in features:
AI acceleration: Intel says that with hardware and software optimizations, Ice Lake delivers 74 percent faster AI performance than the prior generation. Against AMD's third-generation Epyc 7763, Intel claims up to 1.5 times higher performance across a broad mix of 20 popular AI workloads, and against the Nvidia A100 GPU, up to 1.3 times higher performance on the same mix.
Built-in security: Intel SGX, and security in general, is a major selling point for Intel's Ice Lake. The technology can protect as much as 1 terabyte of code and data in private memory areas called enclaves.
Meanwhile, Ice Lake chips also feature cryptographic acceleration that promises to let the chip deliver both security and performance. This will be especially important for business customers -- such as online retailers processing millions of customer transactions per day -- that have to protect sensitive information without slowing down performance.
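The per-transaction cryptographic work Intel is targeting can be sketched with Python's standard library: here each transaction record gets an HMAC-SHA256 authentication tag. This is a hypothetical toy example of the kind of primitive (hashing, AES) that hardware crypto acceleration speeds up; a production retailer would use hardware-accelerated AES-GCM or TLS, not this exact code:

```python
import hashlib
import hmac

# Toy illustration of per-transaction crypto overhead: authenticate each
# record with HMAC-SHA256. Primitives like SHA and AES are what Ice Lake's
# built-in crypto acceleration is meant to speed up.
SECRET_KEY = b"demo-key-not-for-production"  # hypothetical key

def tag_transaction(record: bytes) -> str:
    """Return a hex HMAC-SHA256 tag authenticating one transaction record."""
    return hmac.new(SECRET_KEY, record, hashlib.sha256).hexdigest()

def verify_transaction(record: bytes, tag: str) -> bool:
    """Constant-time check that a record's tag is valid."""
    return hmac.compare_digest(tag_transaction(record), tag)

tag = tag_transaction(b"order=1234;amount=59.99")
print(verify_transaction(b"order=1234;amount=59.99", tag))  # True
print(verify_transaction(b"order=1234;amount=0.01", tag))   # False: tampered
```

At millions of transactions per day, this hashing cost is paid on every record, which is why moving it into dedicated silicon matters.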
Up until now, security hasn't been a top priority for customers, according to Rebecca Weekly, Intel's Senior Principal Engineer and Senior Director of Hyperscale Strategy and Execution. Customers, she said, were primarily concerned that security measures would compromise performance.
Ice Lake's built-in security features, she said, change that equation -- and align with growing consumer awareness about the importance of digital security.
"As people are becoming more aware of privacy, the desire to keep their data their own, and not have their experiences potentially manipulated... we're going to see a massive shift" in favor of secure environments, Weekly said to ZDNet. "We're empowering users to take back the security of the cloud and the security of their data, and there's no excuse not to do it, because it's much more performant."
Ice Lake's different markets:
Cloud: More than 800 cloud service providers globally already run on Intel Xeon Scalable processors, and "all of the largest cloud service providers" plan to offer Ice Lake-powered cloud services in 2021, Intel says.
Network: Intel produces network-optimized "N-SKUs" to support diverse network environments. The new processors deliver on average up to 62 percent more performance on a range of broadly deployed network and 5G workloads over the prior generation, Intel says. More than 15 major telecom equipment manufacturers and communications service providers are readying POCs and networking deployments with Ice Lake.
Enterprise, HPC: There are more than 50 OEMs building more than 250 servers based on the Ice Lake platform. More than 20 HPC labs and HPC-as-a-service environments are leveraging the new processors.
The edge: Ice Lake delivers up to 1.56 times more AI inference performance for image classification than the prior generation, Intel says. This makes it suitable for AI, complex image or video analytics, and consolidated workloads at the edge.
All told, Intel has already shipped more than 200,000 Ice Lake units in Q1 of 2021. Here are some of the partners supporting Ice Lake:
HPE is supporting Ice Lake across the following eight products: industry-standard servers (HPE ProLiant DL360 Gen10 Plus, HPE ProLiant DL380 Gen10 Plus, HPE ProLiant DL110 Gen10 Plus for the telco market), ruggedized systems for the edge (HPE Edgeline EL8000 Converged Edge systems, HPE Edgeline EL8000T Converged Edge systems), the HPE Synergy 480 Gen10 Plus for hybrid cloud environments, the HPE Apollo 2000 Gen10 Plus system for HPC and AI workloads, and the HPE Cray EX supercomputer.
Cisco is introducing the following Unified Computing System (UCS) server models: Cisco UCS B200 M6, C220 M6, and C240 M6. The new servers feature native integration with Cisco Intersight, a hybrid cloud operations platform. They support a wide range of workloads, including VDI, databases, AI and machine learning, and big data.
Dell Technologies last month announced plans to support Ice Lake with its latest portfolio of Dell EMC PowerEdge servers. It introduced new rack, modular, edge, and GPU-optimized servers featuring third-generation Xeon and third-generation AMD Epyc processors.
Lenovo launched a series of ThinkSystem server updates and teased new rugged edge computing systems based on Ice Lake processors. The servers are designed for workloads like HPC, artificial intelligence, cloud, analytics, and VDI. The ThinkSystem and ThinkAgile lineups include ThinkShield, which secures firmware, processors, and subsystems.
Supermicro has a lineup of more than 100 application-optimized systems that feature third-generation Xeon processors. The systems include Hyper, SuperBlade, the Twin Product Family (BigTwin, TwinPro, and FatTwin), Ultra, CloudDC, GPU, and Telco/5G and Edge servers. Supermicro's new X12 systems are already deployed in volume at organizations including Osaka University and DISH.