Is using computers to heat houses such a daft idea?

Summary: OK, it's cold. Winter has finally come - or come back, given the cold snap of early December.

TOPICS: After Hours

OK, it's cold. Winter has finally come - or come back, given the cold snap of early December. And there's nothing like sitting in a freezing room, worrying about the cost of oil, to focus the mind on alternative methods to heat a house.

Meanwhile, we are told, computers are excessively power-hungry - particularly desktops, with their 200-watt-plus power supplies. All that energy has to go somewhere, according to the law of conservation of energy: indeed, it is largely dissipated as heat through the various processing, memory and other chips, then sucked away and blown out of vents.

To all intents and purposes, then, a computer is already running as a heater, albeit perhaps a rather inefficient one (though by the same token, the only other forms in which that energy can escape are noise and light, so it is quite efficient, in its own way). The daft idea is this: why don't we make more of that capability? That is, actually use computers as a source of heat?
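As a rough sketch of the heater comparison, here are some assumed figures (a 200 W average draw - the "200-watt-plus" supply rarely runs flat out - and round-the-clock operation); the numbers are illustrative, not measurements:

```python
# Rough comparison of a desktop PC and a small electric heater.
# Assumptions: 200 W average draw, running 24 hours a day.
power_watts = 200
hours_per_day = 24

# Conservation of energy: electricity in is (almost all) heat out.
heat_kwh_per_day = power_watts * hours_per_day / 1000
print(f"Heat delivered: {heat_kwh_per_day:.1f} kWh/day")  # 4.8 kWh/day

# A typical small panel heater is rated around 1 kW, so one PC is
# roughly a fifth of such a heater running continuously.
panel_heater_kw = 1.0
fraction = (power_watts / 1000) / panel_heater_kw
print(f"Equivalent to {fraction:.0%} of a 1 kW heater")  # 20%
```

Not enough to warm a whole house on its own, then, but a non-trivial contribution to one room.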

A precedent has already been set by data centre designers. Server rooms are notoriously un-green, in that they gobble whatever power is made available to them and spew it out as heat which then requires even more power to carry away. News stories about data centres warming nearby buildings or even greenhouses appear with reasonable frequency - the cynic in me says that it's an attempt to deflect attention, but equally, it's good that the heat isn't being completely wasted.

So, why not homes? Mainframes house chips on ceramic substrates, and ceramics are also used in heating elements - I'm showing my ignorance to an alarming extent here, but could a technology used to insulate also be used to distribute heat? Perhaps not enough to warm a whole house, one might argue, but… let's think about this a bit. Rather than a radiator, would it be possible to create a wall-mounted device, architected to reach high temperatures purely by performing calculations? Of course, these could be random floating-point operations, but perhaps, more usefully, they could be non-random - programmed to achieve a goal, such as helping to find a cure for AIDS, or supporting the search for little green men.

To take this one stage further (and I know I'm stretching things to the limit here), perhaps such processor time could even be rented - for money, to an organisation that could make use of it. Given an appropriate, pre-tested network connection, maybe Amazon could use my radiators as burst capacity for its elastic cloud service, at a cost that could offset, or even pay for, my heating bills. I might even turn a profit.
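The rental idea can be tested with some back-of-envelope arithmetic. Every figure below is an assumption for illustration only - a 200 W node, a UK-ish electricity price, and a purely hypothetical per-node-hour rental rate - not anything Amazon actually offers:

```python
# Could renting out "radiator" compute cover its own electricity cost?
# All figures are assumptions: 200 W node, 0.15 GBP/kWh electricity,
# and a hypothetical rental rate of 0.05 GBP per node-hour.
power_kw = 0.2
electricity_price = 0.15   # GBP per kWh (assumed)
rental_rate = 0.05         # GBP per node-hour (hypothetical)

hours = 24 * 30            # one month of continuous running
electricity_cost = power_kw * hours * electricity_price
rental_income = rental_rate * hours

print(f"Monthly electricity: {electricity_cost:.2f} GBP")   # 21.60
print(f"Monthly rental income: {rental_income:.2f} GBP")    # 36.00
print(f"Net: {rental_income - electricity_cost:+.2f} GBP")  # +14.40
```

On those made-up numbers the heat is free and there is change left over; the real question is whether any buyer would pay that rate for capacity scattered across domestic broadband connections.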

But is this really stretching things? After all, micro-generation from solar panels might once have been seen as mere fantasy, but a flurry of neighbours have bought into such schemes recently - to the extent that passing conversations have moved from being about the weather to the return on investment a sunny day can bring.

Home heat through processing might seem similarly out there, but the major vendors are at least thinking about it. Microsoft Research calls the concept the 'Data Furnace', suggesting that data center (sic) owners could save hundreds of dollars per server per year in cooling costs, if servers were distributed around willing (and appropriately equipped) households.

While the concept does have associated challenges, such as data security (which could likely be handled with encrypted virtual machines), it does have a lot going for it. Not least that less heat would be generated by centralised data centres, and more heat would be available for households, saving on personal fuel bills.

Indeed, it's hard to think of one element of technology that doesn't already exist to make this happen today. Perhaps, in fact, the idea of using computers to heat houses is not that daft after all.

Jon Collins

About Jon Collins

Jon Collins is principal adviser at consultancy Inter Orbis. With over 20 years in the technology industry, he has worked in the roles of IT manager and software consultant, project manager, training manager, IT security expert and industry analyst.

  • I have looked into this issue quite seriously - I am a bricks-and-mortar architect. The issue is that the heat collected in a data centre is low-grade heat - very hard to transport/pump, or to recycle (the hard physics constraints imposed by Carnot's law) - the best straightforward use I could find was to co-locate data centres and swimming pools - the output/requirements match rather well.
    To heat homes, you would have to pump low-grade heat over a wide area, and this is lossy and expensive - just like electricity, which we ramp up to high voltages for long-range transmission - it is also hard to meter.
    The other issue is the rate of technological change that goes with data centres - they are very different now than they were 5 years ago, and the combination of Moore's Law and the enormous pressure of huge energy demand at a time of rising energy prices will ensure that the rate of change accelerates. Homes, and what we need from them, change much more slowly - so the risk of investing in infrastructure to serve them from data-centres is too great.
    For what it's worth, the best use I came up with was co-location with high-value fish farming - raising sturgeon etc. This is an energy intensive business that uses low grade heat.
    Anyone interested, email me:
  • @ 1000300053
    Thanks for this - very interesting ideas here. I've talked with a couple of datacentre operators about this and there seem to be two issues
    1) You can run the heat at high pressure through a lattice of thin exhaust pipes. This can provide underfloor heating in offices and save on overall heating costs
    2) The rise of low-powered servers from companies such as ARM could put a brake on the rate of heat inflation in datacentres, so over time this could grow to be a non-issue

    When it comes to fish farming, could you go into some more detail? I once visited a datacentre that was running its waste heat beneath an airport runway in Northern Europe to prevent tarmac cracking, does this seem like a good idea to you?
    Jack Clark
  • I think you are missing the point. The aim is not to pump heat from data centres to places requiring heating, but to actually site the computing power at the very places that need heating.

    it is easier to move bytes around than BTUs
  • A bit late but...
    This idea has been developed since 2010 by this company!
    François Nguyen
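The thermodynamic constraint raised in the first comment - that data-centre exhaust is low-grade heat - can be put in numbers with Carnot's limit. Assuming a ~40 °C exhaust stream and ~20 °C ambient air (both figures illustrative), the maximum fraction of that heat convertible back into useful work is tiny, which is why direct reuse (pools, fish farms, underfloor heating) beats any attempt at reconversion:

```python
# Carnot bound on converting data-centre exhaust heat back into work.
# Assumed temperatures: ~40 C exhaust air, ~20 C ambient.
t_hot = 40 + 273.15   # exhaust temperature, kelvin
t_cold = 20 + 273.15  # ambient temperature, kelvin

# Carnot efficiency: the best any heat engine can do between these
# two reservoirs, regardless of its design.
carnot_efficiency = 1 - t_cold / t_hot
print(f"Maximum conversion efficiency: {carnot_efficiency:.1%}")  # ~6.4%
```

At barely six per cent, converting the heat back to electricity is a non-starter; using it directly as heat, as the article proposes, sidesteps the limit entirely.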