Nvidia started out making graphics chips, and to uninformed observers its push into artificial intelligence might have looked like a strange detour. Now the company is bringing the two back together as graphics and AI merge.
The state of the art has advanced to the point where the company claims it can create photo-realistic images from sketches; apart from a few boundary issues where different elements meet, the results looked realistic enough.
"We are doing a lot of work in AI-inferred image generation. It is unquestionably the future," Nvidia CEO Jensen Huang told ZDNet on Tuesday.
But focusing merely on the pretty consumer side of the business is to ignore its datacentre ambitions, which should leave executives at companies like Intel sweating bullets. Nvidia believes that data science and the use of neural networks need the massively parallel hardware it offers, and that the days of being able to get away with CPU-run neural networks are fading fast.
"My belief is that a lot of the inference, even still today, is still being run on CPUs -- I think there is some offline batch inference on Volta or legacy Pascal, but I think the vast majority of inference is running on CPUs," Nvidia general manager and vice president of accelerated computing Ian Buck told journalists on Wednesday.
"What we are seeing now is that the networks that people are deploying can no longer run on CPUs."
Using the example of a voice search to a search engine, Buck pointed out the several networks handling that request: a denoiser, an acoustic model, and a language model to process the input voice; another network to scan and process the web pages returned to the user; and a final one to return a vocalisation of the result.
"That can't run on a CPU -- it simply isn't possible. If you tried to do that in real-time, it would take seconds, if not unusable. They have to execute all that in milliseconds," Buck said.
"So that's the shift that we are seeing, the networks are not able to run on the CPUs to get the accuracy or latency requirements."
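The latency argument Buck is making can be sketched in rough pseudocode. The stage names and functions below are hypothetical stand-ins for the real networks, not any actual Nvidia or search-engine API; the point is simply that the stages run in sequence, so their latencies add up and the whole chain has to fit inside an interactive budget of a few hundred milliseconds.

```python
import time

# Hypothetical stubs standing in for the neural networks Buck describes.
def denoise(audio):
    return audio                      # noise-suppression network

def acoustic_model(audio):
    return "query text"               # speech -> text

def language_model(text):
    return {"query": text}            # interpret the query

def rank_results(query):
    return ["page1", "page2"]         # scan/process candidate pages

def synthesize_speech(results):
    return b"audio-bytes"             # text-to-speech for the answer

def voice_search(audio, budget_ms=300):
    """Run the stages back to back and check the total latency budget."""
    start = time.perf_counter()
    text = acoustic_model(denoise(audio))
    query = language_model(text)
    results = rank_results(query)
    speech = synthesize_speech(results)
    elapsed_ms = (time.perf_counter() - start) * 1000
    # Every stage's latency is summed: if any one network is slow,
    # the whole interactive request blows its budget.
    return speech, elapsed_ms, elapsed_ms <= budget_ms

speech, elapsed_ms, within_budget = voice_search(b"raw-audio")
```

With stub functions the chain finishes easily, but replace each stub with a real multi-layer network and, on Buck's account, a CPU can no longer keep the sum under the budget.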
"In the end, that's the person we need to help, the data scientist who is obviously under enormous pressure to drink from this firehose of data that they have been collecting, and actually making business improvements," Buck told ZDNet.
"I think distributed data analytics and data science is the big next chapter for the enterprise, and one technical barrier might be turning the corner on the networking, I'm starting to see that with 25G and 100G, so I think as people see what they can do, I think it'll definitely catch on and move quickly."
Nvidia's $6.9 billion purchase of Mellanox should signal how seriously the company is taking its push into the datacentre. It already had its own compute stack, and should the purchase gain approval, will add networking and interconnects to its bag.
When discussing the purchase, Huang pointed to the increase of east-west traffic in the datacentre due to technologies like containers and neural networks, as well as the size of data being analysed.
"Both of these conditions cause the network to be the bottleneck, both conditions, and during that time when Moore's law is slowing down, the software stack, the networking software stack, has to be moved onto the fabric as much as possible," the Nvidia CEO said.
"The CPU is now too rare a resource, so you have to offload any work that you can, and Mellanox is world class at CPU offloading, they take the entire stack of networking and they run it on the smart NIC.
"In the future, more and more and more of that will happen, so the network is going to become intelligent."
According to Huang, the computing fabric will extend beyond the node and into the network.
"The whole thing is going to be one large computer," he said.
At the same time as it is trying to get enterprises at the bottom end of the market onto GPUs, Nvidia is offering 1,280-GPU RTX Server Pods, the sort of machine being pitched at telcos to allow them to offer services like GeForce Now. SoftBank in Japan and Korea's LG Uplus are already signed up for Pods.
From above and below, Nvidia is determined to find a way into the datacentre.
Disclosure: Chris Duckett travelled to GTC in San Jose as a guest of Nvidia