
How Microsoft briefly ran Azure from an undersea datacenter

Microsoft thinks its Project Natick submarine datacenters could be the answer to cleaner, low-latency cloud services in the future.
Written by Liam Tung, Contributing Writer

Microsoft has completed a 105-day trial of an undersea datacenter sealed in a steel pod with an 8ft (2.43m) diameter.

Image: Microsoft

Microsoft is exploring undersea datacenter capsules designed to last five years without any human maintenance.

Mobile hardware might not be in Microsoft's future, but the cloud definitely is, and the company has revealed it is thinking of new ways to cut the cost of delivering those services to major cities by locating its datacenters underwater along coastlines.

Microsoft revealed Project Natick at the weekend, marking the completion of a 105-day trial of a datacenter sealed in a steel pod with an 8ft (2.43m) diameter.

The pod was placed 30ft (9.14m) underwater off the Californian coastline and even ran commercial data-processing projects from Microsoft's Azure cloud computing service, according to The New York Times.

Microsoft named its first experimental undersea pod Leona Philpot, after a popular character from its Xbox game franchise, Halo.

As Microsoft explains on the project's website, it's still early days for the concept but it hopes the trial will help it "understand the benefits and difficulties in deploying subsea datacenters worldwide".

Obviously, with the pod deployed underwater, maintenance is likely to be a concern should it run into a server error or power failure. But the undersea capsule is designed to operate without maintenance for as long as five years.

Details about how it can achieve this are scant, but Microsoft envisages that "after each five-year deployment cycle, the datacenter would be retrieved, reloaded with new computers, and redeployed".

As it outlines in an FAQ for the project, Microsoft expects the end of Moore's Law to result in a slowdown in the refresh rate of new server equipment.

"We see this as an opportunity to field long-lived, resilient datacenters that operate lights out -- nobody on site -- with very high reliability for the entire life of the deployment, possibly as long as 10 years," it notes.

With the project, Microsoft is joining others in the race for novel and cheaper ways to provide power and cooling for so-called hyper-scale datacenters.

For example, Google has bankrolled a number of wind farms in Sweden and the US, and relies on the Baltic Sea to cool its datacenter in Hamina, Finland. Meanwhile, Facebook is taking advantage of Sweden's cooler climate and hydro power in Luleå.

According to The New York Times report, "Microsoft is considering pairing the system either with a turbine or a tidal-energy system to generate electricity."

Despite the risks of placing a datacenter in the sea, as the newspaper notes, Microsoft's researchers believe they could cut datacenter deployment times from two years to 90 days, giving the company a cost advantage.

The other rationale for exploring undersea datacenters is that many cities are located on coastlines where real-estate prices are high. Placing the datacenters in the sea, near those populations, would help cut latency.

Microsoft is designing a larger underwater system and is also looking towards a new trial next year, possibly in Florida or Northern Europe, according to the report.
