
How much power should a server at rest consume?

I dunno either. But even my feeble English literature addled brain knows that humans burn fewer calories on a couch than on a treadmill. So why shouldn't we design our computers to use power in the same way?

Apparently some smart souls are doing just that. It's a concept called energy-proportional computing.

For those of you who, like me, haven’t heard this term either, it’s a philosophy proposing that a server (or any other piece of equipment) should consume energy in proportion to the workload it is handling. So the power the equipment draws would be minimal when the system is idle and would increase as the load increases. An object at rest will remain at rest until called upon to do more. It's the concept that helps make your notebook last three and a half hours on the plane tray table instead of just three hours. (Yes, my PowerBook eats a lot of batteries. Sigh.)
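If you like your concepts in code form, here's a toy sketch of the idea. The numbers are made up for illustration, not drawn from the article:

```python
def power_draw(utilization, idle_watts=10.0, peak_watts=200.0):
    """Power in watts at a given utilization (0.0 = idle, 1.0 = full load).

    An ideal energy-proportional server would idle near zero watts;
    real servers often idle at a large fraction of their peak draw.
    """
    # Clamp utilization to the valid range.
    utilization = max(0.0, min(1.0, utilization))
    # Power scales linearly from idle draw up to peak draw.
    return idle_watts + (peak_watts - idle_watts) * utilization

# At rest the machine sips power; under full load it draws its peak.
print(power_draw(0.0))  # 10.0 watts, idle
print(power_draw(1.0))  # 200.0 watts, flat out
```

The couch-versus-treadmill analogy lives in that one line: the baseline (`idle_watts`) stays small, and consumption ramps up only as work arrives.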

There’s a great discussion of this idea in this article from Innovative Technology for Computing Professionals. Apparently this is a philosophy that Google embraces within its super-secret data center operation. (Makes sense, because I believe their systems are constructed specifically for them, so they can dictate design.) Since I’m not technical and these writers are, I’ll let them have at it. Enjoy.