Being green, in terms of IT and datacentres, only very superficially has anything to do with saving the environment. In reality it is about cold, hard cash — and how to spend less of it.
CIOs are facing a challenge. There is a relentless demand for more powerful datacentres and the cost of power is increasing. Combine this with increasingly strict environmental laws and the issue becomes clear — IT needs to do more with less electricity.
Last year, analyst firm Gartner claimed that green IT was the number one priority for CIOs and IT managers. A year earlier, at Gartner's 25th Annual Datacenter Conference, the firm predicted that half the world's datacentres would run out of power by the end of 2008.
That is not to say datacentres will go dark. Rather, they will not be able to draw enough power to meet the energy requirement of the latest high-density hardware, which can draw 30kW or more per rack. In turn, the datacentre housing this rack requires additional air conditioning, which draws even more power.
To address these issues, The Green Grid was formed: a global consortium dedicated to efficient IT, whose members include AMD, APC, Dell, HP, IBM, Intel, Microsoft, Sun and VMware.
"In 2006, datacentres consumed 1.5 percent of the entire electricity of the US, and we projected that that would grow to 2.5 percent by 2011," Winston Bumpus, a representative from The Green Grid, told ZDNet.com.au. "That's several power plants' worth," he said.
He said this is not just an issue for individual companies; it affects whole economies. "If we increase without finding ways to conserve, our economies will be inhibited by our inability to grow."
Solving the datacentre energy crisis
In order to meet these energy efficiency challenges, a series of innovations has sprung up, centred on hardware, software and datacentre design. The champion of these innovations is virtualisation.
Dan Chu, senior director of products at VMware, said in an interview with ZDNet.com.au's sister site ZDNet.co.uk that seven million servers are shipped globally every year. "Across [this environment], the average server utilisation is five to 10 percent," Chu said. This means there is huge potential for reducing the total number of servers shipped through virtualisation.
Chu said this can result in server consolidation ratios of 10 or even 20 to one.
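Chu's figures can be turned into a back-of-the-envelope estimate. The sketch below is illustrative only, not VMware's methodology; the fleet size is an assumed number chosen for the example, and the ratios are the ones Chu cites.

```python
# Back-of-the-envelope server consolidation estimate (illustrative only).
def consolidated_servers(physical_servers: int, ratio: int) -> int:
    """Hosts needed after consolidating at the given ratio (e.g. 10:1)."""
    return -(-physical_servers // ratio)  # ceiling division

servers = 1000          # assumed fleet size for illustration
for ratio in (10, 20):  # consolidation ratios cited by Chu
    hosts = consolidated_servers(servers, ratio)
    print(f"{ratio}:1 consolidation -> {hosts} hosts, "
          f"{servers - hosts} servers retired")
```

Even at the conservative end of Chu's range, nine in ten physical boxes can be retired.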
While datacentre virtualisation is growing in popularity, virtualisation scenarios aren't quite as rosy as Chu paints them. Kris Kumar, MD of Sydney-based datacentre design specialist 3iGroup, describes virtualisation as the "silent enemy".
Kumar describes a scenario in which a virtualised server moved from performing at 10 to 20 percent of its capacity to more than 80 percent capacity.
"In real power terms, a 300-watt server which was running at 20 watts is actually now running at 280 watts. You are reducing the footprint but putting in more processing power, so the power per footprint has gone up," Kumar said.
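Kumar's point is easy to verify with simple arithmetic. This sketch uses his illustrative per-server figures (20 W lightly loaded, 280 W after consolidation); the rack density is an assumption added for the example.

```python
# Power per rack footprint before and after virtualisation,
# using Kumar's illustrative per-server figures.
SERVERS_PER_RACK = 20        # assumed rack density for illustration

idle_w, loaded_w = 20, 280   # per-server draw: lightly loaded vs 80%+ utilised

per_rack_before = SERVERS_PER_RACK * idle_w    # rack of underused servers
per_rack_after = SERVERS_PER_RACK * loaded_w   # rack of consolidated hosts

print(f"Rack draw before: {per_rack_before} W")   # 400 W
print(f"Rack draw after:  {per_rack_after} W")    # 5600 W
```

The fleet shrinks, but each surviving rack draws an order of magnitude more power, and the cooling system has to follow that heat wherever it concentrates.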
Recently, datacentre company Digital Sense teamed up with engineering giant Emerson to build Australia's largest high-density datacentre in Kenmore, Queensland. Michael Tran, director of Digital Sense, estimates that close to half of the energy entering the datacentre is used for cooling, demonstrating the massive heat output of a virtualised, high-density datacentre.
Virtualisation, along with the varying hardware and utilisation levels across a datacentre, can lead to "hotspots" on the datacentre floor. So if the datacentre is full of hot patches, the question becomes: why do companies attempt to cool the whole area evenly?
Dan Azevedo, Symantec's datacentre expert, advised "aligning the racks into cold and hot aisles for easier cooling" to increase efficiency. Azevedo added that heat management needn't stop there, suggesting "re-using the excess heat into the adjacent office areas, or using external cold air, depending on the climate, to cool the datacentres".
When it comes to power consumption, server utilisation is only one part of the problem. Another issue is the utilisation of the power itself in the datacentre.
According to a Green Grid whitepaper, 35 percent of the power that reaches a server is lost in power conversion. In an unvirtualised datacentre, this means most of the power that enters the facility does no useful work. "A study by a major microprocessor manufacturer found that IT datacentres typically burn more power in power conversion and cooling at light loads (zero to 25 percent platform utilisation) than the computer systems themselves are using," The Green Grid said.
One option for mitigating these losses involves using more efficient power supplies. The Green Grid notes that typical power supplies used in datacentres today have an efficiency of 65-70 percent, whereas power supplies with an efficiency of up to 90 percent are available.
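The difference those efficiency figures make is straightforward to quantify. In the sketch below, the server load, running hours and electricity tariff are assumed values chosen for illustration; only the efficiency figures come from The Green Grid.

```python
# Wall-socket draw and annual running cost for a power supply
# at a given conversion efficiency. Load and tariff are assumptions.
def wall_power_w(it_load_w: float, efficiency: float) -> float:
    """Power drawn from the wall to deliver it_load_w to the server."""
    return it_load_w / efficiency

LOAD_W = 300                 # assumed steady IT load per server
HOURS, TARIFF = 8760, 0.12   # hours per year; assumed cost per kWh

for eff in (0.65, 0.90):     # typical vs best-in-class (The Green Grid)
    draw = wall_power_w(LOAD_W, eff)
    cost = draw / 1000 * HOURS * TARIFF
    print(f"{eff:.0%} efficient PSU: {draw:.0f} W from the wall, ${cost:.0f}/yr")
```

Multiplied across thousands of servers, the gap between a 65 percent and a 90 percent supply is where the payback comes from.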
While more efficient power supplies will often pay for themselves over the lifetime of an IT product, The Green Grid notes that IT managers often do not invest in such products. This is because IT managers do not factor in potential power savings when buying IT products, and in many cases are completely unaware of the power cost associated with their datacentre. The Green Grid says what is lacking is a clear set of metrics.
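The Green Grid went on to define exactly such metrics, the best known being power usage effectiveness (PUE) and its reciprocal, datacentre infrastructure efficiency (DCiE). The meter readings in this sketch are illustrative assumptions, not figures from the consortium.

```python
# PUE and DCiE, the datacentre efficiency metrics defined by
# The Green Grid. Meter readings below are illustrative only.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Total facility power divided by IT equipment power (1.0 is ideal)."""
    return total_facility_kw / it_equipment_kw

facility_kw, it_kw = 1000.0, 500.0  # assumed readings: half the power
p = pue(facility_kw, it_kw)         # goes to cooling and conversion
print(f"PUE:  {p:.2f}")             # 2.00
print(f"DCiE: {1 / p:.0%}")         # 50%
```

A PUE of 2.0 corresponds to Tran's estimate above, where close to half the energy entering the facility goes to cooling rather than computation.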
However, an increasing awareness of power costs and supply is leading to another interesting development — the globalisation of the datacentre.
There has been a movement to shift datacentres to locations where power is cheapest and cleanest. Jose Iglesias, a green IT specialist from Symantec, said that in the US, many datacentres had moved from the San Francisco area to the Pacific North West — where power generated by hydroelectric programs is cheaper and cleaner.
It is possible this may lead to a global market for efficient cloud computing, where computational loads are transferred not to where they are processed fastest — but to where they are processed at the least energy cost. However, global competition resulting from data transfer to more efficient datacentres is still limited by bandwidth, Iglesias said.
Risk is opportunity
While discussions of environmental issues often focus on apocalyptic predictions and ever-dwindling resources, there is good news as well.
The sudden rise of green IT has created opportunities for vendors and businesses to profit from this blossoming market. Recent research from analyst firm Forrester predicts that the global market for green IT will peak at US$4.8 billion in 2013 and then decline thereafter, as businesses reach peak efficiency.
Analyst group S2 Intelligence predicts a potential market in "green accounting", where firms pay to accurately measure their environmental impact. Such measurement would likely be software-based, and would help satisfy government regulation, customers and trading partners.
Companies are keen to get a slice of the green IT pie. Microsoft recently announced it would be entering the green IT market by developing software to help build efficient infrastructure.
Hardware manufacturers have also shifted their definitions of what constitutes more advanced hardware.
"Previously, there was a direct relationship between newer hardware and a larger power consumption," said Symantec's Jose Iglesias. However, awareness of power costs and heat issues has seen this trend slow down or even reverse, as with Intel's latest server chips.
As power costs rise and legislation becomes stricter, more and more companies will jump on the green IT bandwagon. However, the principal catalyst behind this movement remains the cost reductions that green IT initiatives deliver.
At the end of the day, efficiency is more than environmental protection — it's good business practice.