How do you overlook 3,000 datacenters?

Summary: It takes a major bureaucracy to lose track of where its data processing facilities reside


While the attention has been focused on the new NSA datacenter in Utah, a re-evaluation, three years into the Federal datacenter closure program, has identified an additional 3,000 facilities that fall under the closure and consolidation mandate. Granted, under the loose definition of datacenter the government is using, that could mean 3,000 racks hidden in 3,000 utility closets throughout the country, but how can any organization not know where its data processing and storage facilities actually reside?

As reported in the Federal Times, David Powner, Director of Information Technology Management at the Government Accountability Office, told Congress that a recent estimate of the number of datacenters subject to closure had reached 6,000, which included 3,000 that had not previously been counted. He followed that up by saying that after three years there were still no good, hard numbers on the total number of datacenters in use.

This kind of information really highlights the difference between IT operations in government and in private business. While there are stories of the occasional server being misplaced, or even walled up, in the business world, losing track of entire datacenters, however small, is something that simply wouldn’t happen in a business environment. With multiple agencies running their own IT, and no explicit oversight assigning authority or responsibility to anyone further up the government chain, there is rarely anyone who can be held accountable for this poorly organized government IT effort.

With the original goal of shutting down 800 of the 2,100 datacenters the government thought it was running three years ago looking increasingly out of reach, the Office of Management and Budget (OMB) has now decided that the target is 40% of the identified datacenters. The OMB is also now specifying that only “non-core” datacenters be shut down, which could be a useful metric if it had bothered to define “non-core” so the term could be applied equally across government departments.

For IT personnel who have long dealt with mandates from on high that rarely took the reality of the situation into consideration, these issues are unlikely to come as much of a surprise. With a mandate handed down from the OMB without details or significant investigation into what implementation would actually require, the odds of this program hitting its 2015 completion date seem very small.

Topics: Government US, Data Centers

  • Decent question.

    "How do you overlook 3000 datacenters?"

    It is VERY easy.

    1. Remember that the DoD is very goal oriented.
    2. The directive is "get the job done, no matter what".

    Given both of these, the "get the job done" will create a data center on the fly. If the job gets done, and works, that center will get expanded.

    Reporting up the chain of command isn't about "data center", it is about "we got the job done".

    Thus, every base will have several dozen unreported "data centers", even when that data center is only a couple of workstations put together in a server configuration.

    When the NMCI directive came out, trying to force everyone to use Windows (as that was the only supported workstation), thousands of such "data centers" were found - and justifications to bypass the NMCI came up, and were accepted.

    One simple case was that the Windows workstation didn't allow for applications to process mail messages automatically, and identify data, extract that data, and process it as required. Previously this would have been done by a simple UNIX workstation using the normal simple mail handling.

    Doesn't work with Windows - first you had to manually extract the message, then convert it to text, then scan for the data, and process it. Having to do it manually adds one entire person to the process, AND delays the processing by several minutes to several hours (depending on the message rate). Then it turns out that the only utility that could scan the messages... had been written in Romania.... and cost $10 each.... Not gonna happen.

    So justifications were thrown together to keep the old system, as a "required processing center".

    And thus, yet another "data center" appeared.
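    The automated mail handling that comment describes - receive a message, pull the data out of the body, and process it with no human in the loop - could be sketched along these lines in Python (the message format, addresses, and field names here are hypothetical illustrations, not taken from the comment):

    ```python
    import email
    import re
    from email import policy

    # A hypothetical incoming report message, the kind a UNIX box might have
    # handled automatically via its normal mail delivery pipeline.
    RAW_MESSAGE = """\
    From: station-a7@example.mil
    Subject: Hourly status report
    Content-Type: text/plain

    STATION=A7 TEMP=21.4 STATUS=OK
    """

    def extract_fields(raw_message):
        """Parse a message and pull KEY=VALUE pairs out of its plain-text body."""
        msg = email.message_from_string(raw_message, policy=policy.default)
        body = msg.get_content()
        return dict(re.findall(r"(\w+)=(\S+)", body))

    fields = extract_fields(RAW_MESSAGE)
    print(fields["STATION"], fields["STATUS"])  # prints: A7 OK
    ```

    Any mail delivery agent that can pipe each incoming message to a script, as the old UNIX setups did, could call a function like this on arrival and hand the extracted fields to the next processing step.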
  • By the way - the same thing happens when directives...

    are about "no unclassified connections to the secret network are allowed"...

    When that happened, several thousand connections were suddenly identified. All were "temporary" to transfer 10-20 GB of data from an unclassified system to a classified one.

    The only authorized transfer was to put the data on tape (several of them at a time), take the data to the classified system, read it, then take the tapes to an authorized degausser... which had a tendency to destroy the tapes used (the tapes were several hundred dollars each for the size required).

    That made it uneconomical to use.. so the "get it done" grabbed an ethernet cable and connected the unclass system to the classified network... When the transfer was done, it was disconnected and the workstation reconnected to the unclassified net.

    Of course, that left the formerly unclassified system as a classified system...

    And the procedures didn't always get fully followed - cables got left in place.

    "Get the job done" supersedes security when security gets too complicated, expensive, or slow to perform.
  • Oh Of Course It Does

    Please. It doesn't happen in private businesses? It happens all the time. Who is responsible for all the machines in a corporation? What about when some new side project gets started in some department that may or may not be on site? Maybe that project dies out, and maybe it grows legs and becomes a big deal.

    Add consultants, folks trying to build something amazing at their house with their own gear, and it really does happen everywhere, all the time. Skunk works, baby.
    • I'll Second That!

      I once worked for a large company that was brought in to do projects large and small for major petrochemical companies. We were constantly finding unexpected "datacenters" that the central IT folks knew nothing about. Sometimes PCs acting as servers, sometimes actual servers bought by a department or plant that couldn’t manage to work something out with central IT, sometimes old servers that were thought to have been shut down, and weren’t.

      The only reason the Feds’ number is so high is that they are just so dramatically larger than any one company.

      The shocking thing is that the number is as low as it is.
      Timothy Poplaski
  • I know everything.....

    about the woman I live with and I can't do a damn thing about it.
  • How many really know HOW many they have?

    I worked for a Fortune 500 company (with 80 plants worldwide) on their Y2K project. Prior to this, HQ really didn't know how many PCs they had at the remote locations. When we did an inventory for Y2K, they were SHOCKED to learn they had over 4,000 PCs in the entire organization.
    And this company is noted for its organization. They had been relying on the individual plants sending inventory information without checking to see how accurate it was. We were finding PCs stuffed everywhere that had not been counted.
    If they were that far off on the first count, how do they know THIS count is any better??
  • How do you overlook defining an acronym?

    Specifically, what the hell is the OMB?