
When should you tackle that data center rationalization project?

Written by Heather Clancy, Contributor

So here's a question: When, exactly, should a data center be overhauled? What is the best rationale for a data center manager to use with upper management to undertake such a project? And just how big of a deal should said data center manager make out of the green element as an argument for tackling this stuff in the first place?

Bill Peldzus, vice president of data centers for GlassHouse Technologies, an infrastructure consulting firm that just formalized its data center efficiency practice, said only a few of his clients have been forced into doing something because they were literally going to run out of power or capacity.

But green metrics have been an element of six of the last seven projects his team has been asked to consult on. The biggest challenge today is that there are few tangible benchmarks for what a truly energy-efficient data center looks and acts like. We have only ideals, which is why data collection projects such as the new one backed by the Environmental Protection Agency are so important. "People are only just now getting around to measuring power utilization," Peldzus said.
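
For anyone just starting to measure, one widely used yardstick is Power Usage Effectiveness (PUE): total facility power divided by the power drawn by the IT equipment alone, with 1.0 as the unreachable ideal. Here is a minimal sketch in Python; the kilowatt figures are hypothetical placeholders for your own metering data:

```python
# Power Usage Effectiveness (PUE): total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt reaches the IT gear; real facilities run higher.
# The kilowatt readings below are hypothetical placeholders for real metering data.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Compute PUE from two power readings in kilowatts."""
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Example: a facility drawing 1,200 kW overall while its IT gear draws 600 kW
print(f"PUE: {pue(1200.0, 600.0):.2f}")  # prints "PUE: 2.00"
```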

One thing that is easier to measure is capacity utilization, although even that is hard for the average company to get its arms around. According to Peldzus, about 10 percent of the servers in a typical environment are sitting around doing nothing. What's more, as his teams have discovered, backup sessions are still being run against those idle servers, along with full maintenance services. Even if you can't make the power case, it's pretty easy to demonstrate other inefficiencies in an environment like that.
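
Putting a number on that kind of waste is simple arithmetic once you have utilization data. A minimal sketch in Python, assuming you can export average CPU utilization per server; the server names, figures, and 5 percent threshold below are hypothetical:

```python
# Flag servers whose average CPU utilization sits below a threshold --
# the "sitting around doing nothing" machines that still get backed up
# and maintained. Names, figures, and the 5 percent cutoff are hypothetical.

IDLE_THRESHOLD = 0.05  # 5 percent average CPU utilization

# Average CPU utilization over the sampling window, per server.
utilization = {
    "web-01": 0.42,
    "db-02": 0.61,
    "legacy-app-03": 0.01,
    "batch-04": 0.03,
}

idle = sorted(name for name, load in utilization.items() if load < IDLE_THRESHOLD)
print(f"{len(idle)} of {len(utilization)} servers look idle: {', '.join(idle)}")
```

Even a report this crude puts a number on the inventory problem before anyone starts arguing about kilowatts.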

Again, though, this takes management discipline, which is not something a lot of companies have invested in historically.

Novell conducted a survey of 411 data center managers last year in conjunction with Lighthouse Research that found approximately 61 percent of companies either use a manual process or no process at all to track server resources. What's more, almost 80 percent still use manual means to reallocate server workloads.

The Novell-sponsored survey found that 67 percent of data center managers are evaluating management technologies in order to save space in their data centers, while 65 percent are considering power savings. More often than not, you won't be surprised to hear, virtualization was the mechanism by which they hoped to achieve this. Slightly less than half of the respondents already use virtualization, while more than half of the remaining respondents are evaluating server virtualization for the future. You can download the survey here for the full results.

Clearly, the survey results, as well as GlassHouse's own experiences in the field, demonstrate that the best investment data center administrators and IT managers can make right now is simply in better management techniques for their existing servers and storage arrays. Until you develop a better working knowledge of what's already there, you can't possibly know what should stay and what should go. The EPA, in its data collection project, hopes to gather at least a year's worth of data to create its own benchmarks. While that may seem a long time, we might do well to remember that it took years to get us into this fix, and there will be no instant way out.
