
NVIDIA morphs from graphics and gaming to AI and deep learning

NVIDIA, originally focused on semiconductor products for video graphics rendering, is transitioning into an AI supercomputing company. How does that manifest, and does it make sense?
Written by Andrew Brust, Contributor

Maybe you've heard of the x86 central processing unit (CPU) architecture that powers most PCs and servers today. But once upon a time in PC land, Intel made a bundle of cash selling x87 math co-processor chips to accompany the x86 products. These chips excelled at, and accelerated, floating point math operations and helped make PCs much faster at performing certain tasks that were hot and relevant back then, like recalculating spreadsheets.

387, redux
But spreadsheets are old hat now, and math co-processor functionality was eventually integrated into the CPU itself, causing the x87 line to dry up. Artificial Intelligence (AI) has, in a way, brought math co-processors back into vogue, with graphics processing units (GPUs) playing a similar supporting role. As it turns out, the kind of mathematical capabilities required to render high-resolution, high-frame-rate graphics are also directly applicable to AI.

Specifically, the work required to train predictive machine learning models, especially those based on neural networks and so-called deep learning, involves analyzing large volumes of data, looking for patterns and building statistically based heuristics. The more training data used, the more accurate the predictive models become. GPUs are great for this type of work, even though it's not really about graphics or video.
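To make that concrete, here's a minimal sketch of what such training looks like in code. The article names no software, so PyTorch is an assumption here, and the tiny network and synthetic data are purely illustrative. The point is that the inner loop is dominated by dense matrix math, which is exactly the workload a GPU's thousands of parallel cores accelerate.

```python
# Minimal sketch (assuming PyTorch): the same training code runs on CPU
# or GPU; only the target device changes.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# A tiny feed-forward network; real deep learning models are far larger.
model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Synthetic training data standing in for a real labeled dataset.
inputs = torch.randn(10_000, 256, device=device)
labels = torch.randint(0, 10, (10_000,), device=device)

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)  # forward pass: mostly matrix multiplies
    loss.backward()                        # backward pass: more matrix multiplies
    optimizer.step()
```

Notice that nothing in the loop is graphics-specific; the GPU is just doing the linear algebra in parallel, which is why it slipped so naturally into the co-processor role.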

That's why NVIDIA, a company originally focused on GPUs and chipsets for video adapter cards and game consoles, is rapidly morphing into an AI company.

Health AI
For example, NVIDIA is now working with the Center for Clinical Data Science (CCDS) in Cambridge, Massachusetts, to employ AI in assisting radiologists with reading and interpreting X-rays, MRIs, CAT scans and the like. The company's DGX systems, based on its Volta AI architecture, are being used by CCDS radiologists to speed up the analysis of medical imagery and the detection of abnormalities and patterns in it.

CCDS just took delivery of the world's first NVIDIA DGX-1 supercomputer in December of last year and has already successfully trained machine learning models to do work not only in the sphere of radiology but also in cardiology, ophthalmology, dermatology and psychiatry. CCDS will soon be using a DGX Station -- an AI-specialized desktop workstation -- for medical AI work as well.

Meanwhile, back at the plant

[Image: UAV (drone)-based industrial inspection. Credit: NVIDIA]

NVIDIA's DGX technology is being deployed not just in medicine, but in a variety of industrial contexts. For example, the company has teamed with Avitas Systems, a venture backed by General Electric, in the service of drone-assisted industrial inspection. This work involves the physical inspection of industrial infrastructure, including flare stacks and heated gas plumes.

Drones can perform inspections in conditions that would be lethal to humans; NVIDIA explains that flare stacks must be shut down for days before they become cool enough for a human inspector to approach. Such multi-day shutdowns carry huge production costs, and drone-based inspection avoids them.

Evergreen
But drone-based inspection requires real-time intelligent guidance based on readings picked up by the drones' sensors (including temperatures encountered and what the drone "sees"). That intelligent guidance is only made possible by AI, and that's where the DGX technology comes in. Interestingly, because of all this real-time processing, and given the super-human nature of the work, there's an element of drone inspection that parallels gaming. That's a pretty cool connection of old and new.
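As a rough illustration of that gaming parallel, here's a hedged sketch of what such a real-time guidance loop might look like: sense, infer, act, repeat, much like a game engine's frame loop. Every name in it is hypothetical, the model is a stand-in for whatever Avitas Systems actually trains, and the flight-control hook is just a stub.

```python
# Minimal sketch (all names hypothetical): a real-time inference loop in
# the spirit of a game's render loop -- read sensors, run the model, act.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in for a trained inspection model (a real one would be loaded from disk).
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
                      nn.Flatten(), nn.Linear(8, 2)).to(device).eval()

def read_sensor_frame():
    """Stand-in for the drone's camera/thermal sensor feed."""
    return torch.randn(1, 3, 224, 224, device=device)

def adjust_flight_path(anomaly_score):
    """Stand-in for the flight-control hook."""
    print(f"anomaly score: {anomaly_score:.3f}")

with torch.no_grad():
    for _ in range(10):  # a real guidance loop would run continuously
        frame = read_sensor_frame()
        logits = model(frame)
        anomaly = torch.softmax(logits, dim=1)[0, 1].item()
        adjust_flight_path(anomaly)
```

The per-frame cadence is the gaming connection: instead of rendering pixels sixty times a second, the GPU is scoring sensor frames and feeding the results back into flight control.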

Here's another one: Because Avitas Systems is a GE venture, it uses GE Predix, a predictive analytics platform that integrates with GE Historian. I've written about GE Historian technology before, but I did so more than five years ago, when its applications were mostly limited to preventive maintenance. That Predix can now support downstream drone-based inspection shows how useful AI is in industrial applications...and how much value it adds to the more rote data collection that has been in place for quite some time.

Detour or destination?
So NVIDIA, a graphics- and video-focused company founded nearly 25 years ago, is reinventing itself as an AI company in the present tense. That's a great way to stay relevant, but is the new business orthogonal to the old one? After some pondering, I've decided it's not. Not only is math co-processing common to both disciplines in terms of underlying technology, but both offer future-facing technology that can be aimed at rendering immersive experiences and simulations.

Plus, NVIDIA has corroboration from its competitors in making this pivot. AMD's in the game too with its Radeon Instinct product, and Intel's Xeon Phi processors are relevant to machine learning and AI as well. Data, analytics and AI are providing the momentum for almost everything in the computing world. Why shouldn't the semiconductor companies, which are critical to computing's infrastructure, align with that trend? It's just common sense.
