
Google releases Cloud TPU beta, GPU support for Kubernetes

Google said a limited number of Cloud TPUs are available today.
Written by Natalie Gagliordi, Contributor

The Google Cloud TPU delivers up to 180 teraflops of computing power.

Google Cloud announced Monday that Cloud TPUs are available in beta on Google Cloud Platform.

Short for Tensor Processing Unit, TPUs are custom chips designed for machine learning and tailored for TensorFlow, Google's open-source machine learning framework. Each Cloud TPU provides up to 180 teraflops of compute for training machine learning models, and the chips have been powering Google's data centers since 2015.

"We designed Cloud TPUs to deliver differentiated performance per dollar for targeted TensorFlow workloads and to enable ML engineers and researchers to iterate more quickly," Google wrote in a Cloud Platform blog.

"Over time, we'll open-source additional model implementations. Adventurous ML experts may be able to optimize other TensorFlow models for Cloud TPUs on their own using the documentation and tools we provide."

Google said a limited number of Cloud TPUs are available today, with per-second billing at a rate of $6.50 per Cloud TPU per hour.
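For a rough sense of what per-second billing at that rate means in practice, here is a back-of-the-envelope sketch; the run lengths are hypothetical examples, not figures from Google.

```python
# Beta pricing from the announcement: $6.50 per Cloud TPU per hour,
# billed per second.
RATE_PER_HOUR = 6.50

def tpu_cost(seconds: float) -> float:
    """USD cost of a single Cloud TPU used for the given number of seconds."""
    return RATE_PER_HOUR * seconds / 3600

print(f"30-minute experiment: ${tpu_cost(30 * 60):.2f}")    # $3.25
print(f"24 hours of training: ${tpu_cost(24 * 3600):.2f}")  # $156.00
```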

The company also announced that GPU support in Kubernetes Engine is now in beta, available today with the latest release of the service.

According to Google, GPUs in Kubernetes Engine will help speed up compute-intensive workloads such as machine learning, image processing and financial modeling. Both the NVIDIA Tesla P100 and K80 GPUs are available as part of the beta, and V100s are said to be on the way.
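From a user's perspective, putting work onto those GPUs comes down to requesting the nvidia.com/gpu resource in a pod spec. The sketch below uses the official Kubernetes Python client; the pod name, container image and namespace are hypothetical, and it assumes a Kubernetes Engine cluster that already has a GPU node pool.

```python
from kubernetes import client, config

# Load credentials for the cluster from the local kubeconfig.
config.load_kube_config()

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="gpu-training-job"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="gcr.io/my-project/trainer:latest",  # hypothetical image
                resources=client.V1ResourceRequirements(
                    # The key detail: ask the scheduler for one GPU
                    # (e.g. a K80 or P100 from the beta).
                    limits={"nvidia.com/gpu": "1"}
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```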

PREVIOUS AND RELATED COVERAGE

GPU killer: Google reveals just how powerful its TPU2 chip really is

Google's second-generation Tensor Processing Units Pods can deliver 11.5 petaflops of calculations.

TPU is 15x to 30x faster than GPUs and CPUs, Google says

Google shared details about the performance of the custom-built Tensor Processing Unit (TPU) chip, designed for machine learning.

Google I/O: Custom TPU chip amplifies machine learning performance

The Internet giant revealed it's built its own hardware that's been powering data centers for the past year with significant results.
