So far, energy-efficiency practices have dominated the quest to cut power consumption in the data center. That work will continue to progress, of course, as it should. But a team of researchers led by AMD has just embarked on a research and development project in upstate New York that seeks to explore how renewable energy can be better integrated with data centers.
AMD, along with a team including Hewlett-Packard, the New York State Energy Research and Development Authority (NYSERDA) and Clarkson University, plans to test the use of wind power (the initial focus) as a data center power source. Wind was picked because less data is available for it than for other renewables; it also happens to be one of the most widely available renewable energy resources in the region where the tests will take place.
Specifically, AMD and Clarkson intend to study how management algorithms at the processor level might be used to shift resources across a computing grid depending on where the electricity supply is most favorable. So, for example, if servers in Albany are being powered by wind energy that starts to fade, the computing tasks could be distributed to another location where the energy supply is more reliable. The project is being funded by NYSERDA and a number of private funding sources that aren't specifically named. The host site for the project will be Clarkson.
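The scheduling idea described above can be sketched in a few lines. This is purely illustrative, not the project's actual algorithm: the site names, power figures, and the greedy assignment strategy are all assumptions made for the example.

```python
# Hypothetical sketch: assign each computing task to whichever site
# currently reports the most available renewable power, debiting that
# site's supply as work lands there. Names and numbers are made up.

def pick_site(available):
    """Return the site name with the most available power (kW)."""
    return max(available, key=available.get)

def schedule(tasks, sites):
    """Greedily place each (task, draw_kw) pair on the best-supplied
    site, reducing that site's remaining power by the task's draw."""
    placement = {}
    remaining = dict(sites)
    for task, draw_kw in tasks:
        site = pick_site(remaining)
        placement[task] = site
        remaining[site] -= draw_kw
    return placement

# Example: wind has faded in Albany, so new work lands elsewhere.
sites = {"albany": 5.0, "potsdam": 40.0}
tasks = [("job-1", 10.0), ("job-2", 10.0)]
print(schedule(tasks, sites))  # both jobs placed at "potsdam"
```

A real scheduler would also weigh migration cost, latency, and forecast supply rather than the instantaneous figure alone, which is presumably where the processor-level research comes in.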
"This research starts to break down the notion that you need to be in a certain location to have a data center," said Brian Berry, project manager, green data center research program, NYSERDA.
The first phase of the feasibility study, which will go on for approximately 18 months, will center on simulating how different wind conditions might affect electricity supply across a distributed grid of computing resources. The team also is working with AWS Truepower, which has developed a model for simulating these conditions, according to two representatives from the project team whom I spoke with in late July.
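To make the simulation idea concrete, here is a minimal sketch of how a wind trace can be turned into an electricity-supply trace via a turbine power curve. The cut-in, rated, and cut-out speeds and the rated output are invented for illustration; they do not come from AWS Truepower's model.

```python
# Illustrative power curve: zero output below cut-in and above cut-out
# wind speed, a cubic ramp between cut-in and rated speed, and flat
# rated output in between. All parameters are assumptions.

def turbine_power_kw(wind_ms, cut_in=3.0, rated_ms=12.0,
                     cut_out=25.0, rated_kw=100.0):
    """Map wind speed (m/s) to turbine output (kW)."""
    if wind_ms < cut_in or wind_ms > cut_out:
        return 0.0
    if wind_ms >= rated_ms:
        return rated_kw
    # Power scales roughly with the cube of wind speed below rating.
    frac = (wind_ms**3 - cut_in**3) / (rated_ms**3 - cut_in**3)
    return rated_kw * frac

# An hourly wind trace for one site becomes a supply trace.
trace = [2.0, 6.0, 13.0, 30.0]
supply = [turbine_power_kw(w) for w in trace]
```

Feeding traces like this into a scheduler, per site, is the kind of question the 18-month feasibility phase appears designed to explore.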
After the simulation phase, Clarkson students will apply the algorithms developed during it to a small-scale test using HP Performance Optimized Datacenter (POD) technologies that use the AMD Opteron processor.
Steven Kester, AMD's director of government and regulatory affairs, said AMD's interest in this research relates to the rise of cloud computing and the impact it is likely to have on technology sourcing decisions. "Power becomes the inflection point," he said.
In the press release describing the project, AMD's corporate vice president of research and advanced development offers this explanation:
"The distributed computing model of the cloud parallels the distributed power-generation model of solar and wind energy. Directing power to data centers from these emerging renewable energy resources without relying on a large-scale, traditional electrical grid is a key challenge. One ultimate goal is the co-location of dynamic energy resources with dynamic computing resources to improve the economics, performance and environmental benefits of both infrastructures."
Hmmm, does that mean that in the future, the corporate decision about where to place a data center might rely less on where there is a traditional electricity source and more on where there is an intersection of geographic factors that favor practices such as free-air cooling or the deployment of renewable energy technologies?
Clearly, that's a large leap, but one that may seem less like a fantasy scenario just a few years into the future.