One of the biggest costs for any datacenter is energy usage. Earth Day may fall on April 22, but you can reduce your carbon footprint and save money every day of the year by choosing energy-efficient servers.
Buying a server is a long-term investment. It's not just the cost of hardware and software that matters, but also the power required to keep that server consistently available to users. Multiply that across a fleet of servers in a datacenter running 24 hours a day, and power costs add up quickly.
For instance, a single server can draw between 500 and 1,200 watts, according to Ehow.com. At an average draw of 850 watts, running 24 hours a day consumes 20,400 watt-hours, or 20.4 kilowatt-hours (kWh), per day. Multiply that by 365 days for 7,446 kWh per year. According to the US Energy Information Administration (PDF), the average commercial rate from January 2012 through January 2013 was 9.83 cents per kWh, which means it would cost $731.94 to power that server for one year.
Add in the fact that energy costs vary around the country, with some larger metropolitan areas and remote spots such as Hawaii costing upward of three times the national average, and you can easily see why server energy usage is so crucial to a company's bottom line.
You can use the TechRepublic toolkit to calculate server power usage. It includes a spreadsheet that provides a baseline of the energy costs you can expect for old or existing servers versus new ones, covering many common models from vendors such as IBM, HP, Dell, Oracle, Fujitsu, and Cisco.