Arm's acquisition by Nvidia has been rumored for a while, and now it has been officially confirmed. It is a significant and well-timed move for both sides, and one that has been a long time coming. We review the steps leading to this outcome, and what it means for the AI chip market.
This is the second high-profile acquisition for Nvidia in 2020, following the acquisition of Mellanox in April. The two are complementary, as they are both fundamental for Nvidia's plan to acquire and maintain a leading role in AI workloads in the data center and beyond.
As we have noted, GPUs are a boon for machine learning workloads. Nvidia took note of this early and acted on it successfully. This has effectively given Nvidia an additional market, and a fast-growing one at that. Machine learning is eating the world, alongside the cloud.
Machine learning workloads are a good match for the cloud. For starters, the training phase for machine learning algorithms is quite demanding in terms of compute. For many organizations, it does not make sense to purchase the kind of infrastructure required for those workloads, and this is where the cloud comes into play.
Besides on-demand utilization and elasticity, there are more reasons why sending machine learning workloads to the cloud makes sense in many cases. AI workloads are better executed by specialized hardware, which is why Nvidia has been expanding its footprint in data centers.
The acquisition of Mellanox was a piece in this puzzle, as Mellanox's technology enables better networking in the data center for Nvidia's chips. This is a substantial benefit. The fact that Mellanox also provided a solid contribution to Nvidia's recently announced Q2 earnings does not hurt.
But it's the bigger picture emerging from those earnings that is really important here: Nvidia beat Q2 estimates with record data center sales. Nvidia's data center revenue came to $1.75 billion, up 167% from a year earlier. This is yet another indication that the data center is a growth engine for Nvidia.
Arm's acquisition also plays to that tune. The data center AI workload pie is growing, and there is growing competition for a piece of it from both Intel and emerging startups. In the face of this competition, Nvidia is after a double bottom line: better performance and better economics.
This was a key theme in the recent unveiling of Nvidia's Ampere AI chip. It was also a key theme in the existing collaboration with Arm. Recently, Nvidia added support for Arm CPUs. Although Arm processor performance may not be on par with Intel's at this point, their frugal power needs make them an attractive option for the data center.
As Scott Fulton III notes in his in-depth coverage of Arm processors, the prospects for Arm in servers are rising. A testament to this fact: Last month, a Fujitsu Arm-powered supercomputer named Fugaku seized the No. 1 spot on the semi-annual Top 500 Supercomputer list.
Fulton III goes on to add that, of all the differences between an x86 CPU and an Arm processor, the only one that likely matters to a data center facility manager is that Arm chips are less likely to require an active cooling system. Hence, they are more economical. The importance of economy of scale for data centers cannot be overstated, but this is not the only reason why the Arm acquisition makes sense for Nvidia.
The data center is the latest expansion for Arm CPUs. Traditionally, Arm's strength has been beyond the data center. The fact that its designs are used by Qualcomm -- whose Snapdragon models power an array of mobile phones -- is a testament to that.
Nvidia has been adamant about its intention to expand beyond the data center, too. In a 2019 Q3 earnings call with analysts, following Nvidia's introduction of its EGX compute platform for edge AI, CEO Jensen Huang noted:
"This quarter, we have laid the foundation for where AI will ultimately make the greatest impact. We extended our reach beyond the cloud, to the edge, where GPU-accelerated 5G, AI, and IoT will revolutionize the world's largest industries. We see strong data center growth ahead, driven by the rise of conversational AI and inference."
"Nvidia doesn't design CPUs, we have no CPU instruction set, Nvidia doesn't license IP to semiconductor companies, so, and in that way, we're not competitors. We have every intention to add more IP tools and also unlike Arm, Nvidia does not participate in the cell phone market," Huang noted in a statement following Arm's acquisition.
The pieces of the puzzle have been in the making for a while now. As we have noted, Nvidia's dominance is based on more than hardware. A software ecosystem around Nvidia's AI chips has been paramount to its success. Nvidia has been making a consistent effort to keep its libraries up to date and to improve their performance.
The fact that Nvidia has been working with Arm for a while now suggests we can expect the software side of Arm processor support to evolve smoothly, too. With Arm's acquisition, Nvidia continues to execute on its plan, while posing an ever greater challenge to those working on new architectures.
Challengers will have to beat Nvidia not only on performance but also on economics and ecosystem, both of which seem to have just gotten an upgrade.