Experts love holding up cloud computing infrastructure as an ideal model of green IT because of the architectural efficiencies it entails (at least in theory). Most of us equate cloud infrastructure with massive data centers, but a new paper from Microsoft Research and a computer scientist from the University of Virginia offers a contrarian point of view.
The paper, "The Data Furnace: Heating Up With Cloud Computing," argues that servers deployed as what the researchers call "data furnaces" could offer a lower carbon footprint in certain scenarios, particularly home offices or office buildings. The researchers theorize that the servers could put out enough heat to become a primary heating system for these buildings if connected to a building's existing heat distribution systems and ductwork.
So, instead of worrying so much about how hot data centers run, we would work harder to redirect their heat to where it could actually be useful.
The researchers write:
"Computers can be placed directly into buildings to provide low latency cloud computing for its offices or residents, and the heat that is generated can be used to heat the building. This approach improves quality of service by moving storage and computation closer to the consumer, and simultaneously improves energy efficiency and reduces costs by reusing the electricity and electrical infrastructure that would normally be used for space heating alone."
The paper proposes replacing electric resistive heating elements with silicon heating elements. Essentially, the idea is to use the same electricity source to create heat AND to handle computation -- allowing the IT industry to grow computing capacity without necessarily increasing overall electricity consumption.
Think of it: A homeowner could agree to site servers in his or her utility room, offering them up for computing tasks such as scientific processing or Web crawling at certain periods of the day or during certain seasons. That small server farm could help subsidize the heating bill, especially at night or during the winter months.
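Since essentially all of the electricity a server draws is eventually dissipated as heat, the heating potential of a small home installation is easy to estimate. The figures below (server count, wattage, duty cycle) are illustrative assumptions, not values from the paper:

```python
# Back-of-envelope: how much heat would a small utility-room "data furnace"
# supply? All numbers here are illustrative assumptions.

servers = 4                # small utility-room rack (assumed)
watts_per_server = 300     # mid-range server under load (assumed)
hours_per_day = 16         # compute offered mainly at night/in winter (assumed)

# Nearly all electrical power drawn by a server leaves it as heat.
heat_output_kw = servers * watts_per_server / 1000
daily_heat_kwh = heat_output_kw * hours_per_day

print(f"Continuous heat output: {heat_output_kw:.1f} kW")
print(f"Daily heat delivered:   {daily_heat_kwh:.1f} kWh")
```

At these assumed figures the rack puts out about 1.2 kW, roughly what a portable electric space heater delivers, which is why a handful of servers can plausibly offset a room's heating load.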
Among the technologies that the researchers believe make the "data furnace" approach possible are maturing systems management technologies, sensor networks that improve physical security, and reduced component pricing.
The most likely place for a data furnace? Office buildings or apartment complexes that are capable of housing midsize data centers (in the hundreds of kilowatts range) and that are seeking ways to drive better energy efficiency. The paper explores cost and management scenarios for all manner of data furnaces (aka micro data centers) comprising between 40 and 400 CPUs.
One potential downside of the whole idea is that residential electricity apparently is usually priced 10 percent to 50 percent higher than power distributed to industrial areas; in addition, the cost of appropriate broadband service might be a sticking point that makes certain options cost-prohibitive, according to the researchers.
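The heating credit is what could offset that residential premium. A quick sketch of the trade-off, using assumed illustrative prices (only the 10 to 50 percent premium range comes from the article; the rates and capture fraction are hypothetical):

```python
# Sketch of the cost trade-off: residential power costs more, but during the
# heating season the server's waste heat displaces electric resistive heating
# the host would have paid for anyway at the same residential rate.
# All prices and fractions are illustrative assumptions.

industrial_price = 0.08        # $/kWh at a conventional data center (assumed)
residential_premium = 0.30     # mid-range of the 10-50% premium cited
residential_price = industrial_price * (1 + residential_premium)

heat_capture = 0.8             # fraction of waste heat usefully delivered (assumed)

# Every usefully captured kWh is credited back at the residential rate.
net_winter_cost = residential_price * (1 - heat_capture)

print(f"Residential rate:          ${residential_price:.3f}/kWh")
print(f"Net winter cost of power:  ${net_winter_cost:.4f}/kWh "
      f"(vs ${industrial_price:.2f}/kWh industrial)")
```

Under these assumptions, even capturing only part of the waste heat makes the net winter electricity cost far lower than the industrial rate a conventional data center pays; the catch, of course, is that the credit evaporates outside the heating season.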
In any event, the paper envisions three major types of data furnace configurations that could potentially arise:
Seasonal ones that use low-cost servers to perform computations mainly at night or during the winter, offering some heat subsidy to the host building
Neighborhood ones that could help improve computing services because of their geographic proximity to the users
Urban data furnaces that would operate year-round (this is the apartment building example); like the seasonal ones, these configurations would make the most sense in colder climates
It is pretty clear that, geographically speaking, the idea of a data furnace really makes sense only in places where you would find a traditional furnace. Still, I get the sense that the idea is more than hot air.