
Nvidia says cloud computing revenue will reach $1bn in a matter of years

The chip maker says Big Data is driving demand for high-power processors.
Written by Charlie Osborne, Contributing Writer

Nvidia says increased demand for Big Data collection and analysis will prompt a spike in cloud computing revenue within the next two to three years.

As reported by Reuters, the chip maker's CEO Jen-Hsun Huang told reporters ahead of the Computex conference in Taipei that cloud computing is the firm's fastest-growing area, and that cloud computing revenue is expected to reach $1 billion within the next few years.

The growth of Big Data analysis is driving increased demand for processors and graphics chips able to run data-heavy applications. As enterprises weave Big Data more fully into their business operations, they require chips capable of handling the high-powered computing and cloud applications needed to mine that data -- demand which, in turn, should continue to boost Nvidia's bottom line.

In Q1 2015, Nvidia reported net income of $134 million, or 24 cents per share.

In February, Huang said the company's push into the enterprise market, propelled by its partnership with VMware, has given it the chance to reach up to 80 percent of corporations.

Nvidia's graphics processing units (GPUs) are used to power computers and data centers that handle data-heavy tasks, and the VMware partnership is expected to accelerate the firm's reach into the lucrative corporate market. Many enterprises use virtualization software supplied by VMware, and Nvidia's Tesla processors and GRID platform can assist the enterprise with high-powered computing and virtualization tasks.

In related news, at the end of May Nvidia launched the latest product in the GeForce GTX family, the GTX 980 Ti. The new GPU is designed with gamers in mind and features 6GB of GDDR5 video memory, CUDA cores capable of driving games at 4K resolution, and support for Microsoft's DirectX 12 graphics API.

Read on: Nvidia deep dives into deep learning with Pascal, Titan X GPUs
