For the second year in a row, AOL has finished at the top of the Uptime Institute's Server Roundup, a competition based on the number of servers decommissioned and the resources saved as businesses upgrade and repurpose their datacenter infrastructures. The Uptime Institute's research has shown that decommissioning a single rack of servers can yield overall savings of close to $2,500 per year once the costs of maintenance, software, and energy are all factored in.
In 2012, AOL decommissioned 8,253 servers, resulting in savings of nearly $3,000,000. This came after it had already decommissioned 10,000 servers in 2011. Second-place finisher Barclays, the financial institution, decommissioned only 5,515 servers but saved over $4,000,000 in power and maintenance costs. Both were clearly major restructuring efforts; the other three finalists who submitted numbers to the competition reported far smaller totals, with fewer than 600 servers decommissioned and savings ranging from roughly $100,000 to $800,000.
But even the smallest of the finalists, which decommissioned only 387 servers in 2012, still realized savings of over $100,000 on power costs alone. Considering how tight IT and datacenter budgets are, those are significant savings, freeing up funds that can be better applied to improving business performance.
Now, while it is unlikely that you have thousands of servers sitting idle on your network, there is a fairly good chance that more than 10 percent of your servers are severely underutilized, even in a small datacenter. Industry reports I've seen recently peg the average share of underutilized servers at over 50 percent of the machines in any datacenter. Given the potential cost savings from even a few decommissioned servers, isn't it worth the effort to get a good grip on how the servers in your datacenter are being used?
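To get a feel for the stakes, you can run a quick back-of-envelope calculation. The sketch below is purely illustrative: it derives a rough per-server annual savings figure from AOL's reported numbers above (about $3,000,000 across 8,253 servers) and applies the conservative 10 percent underutilization estimate; your actual power, maintenance, and licensing costs will differ.

```python
# Back-of-envelope estimate of annual savings from retiring
# underutilized servers. The per-server figure is derived from
# the AOL Roundup numbers quoted above and is illustrative only.

AOL_SERVERS_2012 = 8253        # servers AOL decommissioned in 2012
AOL_SAVINGS_2012 = 3_000_000   # approximate reported savings (USD)

def estimated_annual_savings(total_servers,
                             underutilized_fraction=0.10,
                             savings_per_server=AOL_SAVINGS_2012 / AOL_SERVERS_2012):
    """Estimate yearly savings if all underutilized servers were retired."""
    candidates = int(total_servers * underutilized_fraction)
    return candidates * savings_per_server

# Example: a modest 200-server datacenter with 10% underutilization
print(round(estimated_annual_savings(200)))  # → 7270
```

Even under these conservative assumptions, a 200-server shop is leaving several thousand dollars a year on the table; at the 50 percent underutilization rates the industry reports suggest, the figure grows fivefold.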