US government commissions datacentre guidebook

Summary: The US federal government has firmed up its IT waste-cutting strategy by commissioning a guidebook for modular datacentres that will help it specify and plan deployments.

The US government has commissioned a guidebook to help it choose between different types of modular and container-based datacentres, as part of an IT consolidation drive.

The General Services Administration (GSA) and the Department of Energy's Lawrence Berkeley National Laboratory (LBNL) are working with industry consultant Mark Bramfitt on developing the guide, Bramfitt said in a statement on Tuesday.

"I am drafting a guidebook that charts a process of choosing from available container-based and modular datacentre technologies, with a focus on energy efficiency and the provision of supporting infrastructure," Bramfitt wrote in the statement. The primary goal of the guidebook is to describe "a specification and deployment planning process that will be relevant in the future", he said.

Modular and containerised datacentres are growing in popularity because they can offer greater energy efficiency and scalability than traditional builds. Capgemini's recently unveiled Merlin datacentre, for example, used a containerised design to improve power usage effectiveness and to better manage airflow and cooling.
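For context, power usage effectiveness (PUE) is simply the ratio of a facility's total power draw to the power consumed by the IT equipment alone; the closer it is to 1.0, the less energy is lost to cooling and power distribution. The sketch below is purely illustrative and the figures are hypothetical, not Merlin's published numbers.

```python
# Illustrative PUE calculation; the numbers are hypothetical, not measurements
# from any particular facility.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Example: a facility drawing 1,200 kW overall whose IT gear draws 1,000 kW
# has a PUE of 1.2 -- roughly 17% of the power goes to overhead such as cooling.
print(pue(1200.0, 1000.0))  # 1.2
```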

The federal government is on a drive to consolidate its datacentre infrastructure. As of 2009, it operated 1,100 datacentres, up from 432 in 1998. In February 2010, the federal government's chief information officer, Vivek Kundra, issued a memorandum for the Federal Data Center Consolidation Initiative.

In the document, Kundra explained that the administration wants to reduce "the overall energy and real estate footprint of government datacentres". He also said a further goal was to "shift IT investments to more efficient computing platforms and technologies".

The planned guidebook will contain a case study based on a container deployment at the University of California, San Diego, with further case studies to follow, according to Bramfitt.
