Ten things you won't find in your datacentre in 2010

Despite vendor hype, some much-vaunted developments will be conspicuously absent from the immediate future, says Manek Dubash
Written by Manek Dubash, Contributor


A few years ago, the datacentre was a relatively static place. You ran your company's business processes there, and the servers and applications, one application per server, stayed where they were put for years.

No longer. The datacentre is now a dynamic environment: in a modern facility, the datacentre manager may not know which physical server an application is running on at any given moment. But while the datacentre of tomorrow will look different from today's rack-filled halls, a number of things will not change. Despite technology apostles extolling the joys of new kit, here are 10 things you are unlikely to see over the next 12 months.

1. A green infrastructure
Truly green means zero or negative net carbon dioxide emissions. That state of affairs is unlikely to occur. However, green issues have reached the top of IT and facilities managers' agendas, driven by the potential cost savings as well as by cultural shifts and government legislation. There is also a growing awareness of how IT helps other industries, such as logistics, to save energy.

So you can expect the green marketing hype to subside as the issue matures: corporate executives will remain focused on environmental goals, while IT and datacentre managers prioritise efficiency. But truly green? No.

2. Unified architectures
Vendors love unified architectures. You buy all your kit from one vendor: servers, storage, networking, all in one box. There is no finger-pointing, it should be more easily manageable and could make procurement easier. HP, and separately Cisco in partnership with VMware and EMC, launched such systems in 2009.

The downside is you reduce your discounting power, and your technology choices are similarly limited to those the vendor will sell you. It is called lock-in. This scenario replays the mainframe's glory days of the 1960s and 1970s, when lock-in eventually drove companies away from IBM towards the likes of Amdahl and DEC.

And in practice, there are other hurdles.

The first is internal politics, as it threatens IT departments' specialist teams. The second is that heterogeneous environments grow up as a result of company activity over time, from changing business processes and priorities to mergers and acquisitions. Converting that environment into a single vendor-owned box would appear to be a retrograde step, and could limit flexibility. Unified architectures may take off in the future, but not in cash-strapped 2010.

3. Enough computing power
This issue links into green computing. One way of being greener is to reduce the demand for computing power, yet demand continues to rise.

Instead, pressure mounts to do more with existing equipment and software. This trend means ensuring you are not paying for more software licences than you need, extracting maximum value from corporately held information using techniques such as business analytics, and maximising resource utilisation, whether it be computing power, network throughput, or storage capacity.

More capacity can also be gained from smarter silicon: newer CPUs reduce virtualisation overheads and energy draw, while adding security features and raw horsepower.

Fill a datacentre with such CPUs, and they could pay for themselves, says Intel. But cramming in more servers is no longer an option, as datacentre managers struggle to get enough power into the facility. More computing power, please.
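The power ceiling is easy to illustrate with some back-of-the-envelope arithmetic. A minimal sketch, in which every figure is an illustrative assumption rather than vendor data:

```python
# Illustrative sketch: why cramming in more servers hits the power
# wall before it hits the space wall. All figures are assumptions.

facility_power_kw = 200.0   # assumed total power budget for the room
pue = 2.0                   # assumed power usage effectiveness (cooling etc.)
watts_per_server = 400.0    # assumed average draw per commodity server

# PUE = total facility power / IT power, so the power actually
# available to IT equipment is the budget divided by the PUE.
it_power_kw = facility_power_kw / pue

max_servers = int(it_power_kw * 1000 / watts_per_server)
print(max_servers)  # 250 servers, regardless of how many racks stand empty
```

Halving the per-server draw, or improving the PUE, raises that ceiling far more effectively than finding more floor space does.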

4. Single pane of glass
When it comes to management, vendors promise a single pane of glass — being able to view the whole network and infrastructure from one console. But what they offer usually extends only to their own systems, or possibly to other systems like theirs.

In practice, the existence of separate IT department teams focused on storage, networking, datacentre and servers, not to mention specialists in databases, development and analytics and business intelligence, means a single management tool is unlikely to emerge. There is not the political will to make it happen.

Suppose it did exist. If you are the storage guy, do you really want to see all the networking and business-intelligence information? If you are in charge of database development, would you be keen to be interrupted by alerts about a virtual machine that needs more RAM? Hardly.

In practice, it is unlikely the storage team can even view all storage, unless you have just bought all the kit from the same vendor.

5. The standard cloud
Cloud computing is at the Wild West stage now. Everyone does their own thing, hoping to beat the next guy, and there are few standards.

Yet if the financial director and the chief information officer decide the way to go is to outsource IT to a cloud provider, then helping to specify that provider will be your job. The problem is comparison. Without standards, providers use different terms for the same thing, and have a habit of burying bad news in the small print.

For example, more than one cloud provider offers what seems like a huge discount should its systems go down — except their liability is very limited, once you read beyond the headlines.
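The small print matters because the sums are so lopsided. A hedged sketch, using invented figures rather than any real provider's terms, shows how little a headline service credit may be worth against the real cost of an outage:

```python
# Illustrative only: invented figures, not any real provider's SLA.

monthly_fee = 5_000.0            # assumed monthly cloud bill
credit_rate = 0.10               # headline "10% credit" if the SLA is breached
outage_hours = 8.0               # assumed length of the outage
cost_per_hour = 2_000.0          # assumed cost of downtime to the business

sla_credit = monthly_fee * credit_rate     # the provider's total liability
business_cost = outage_hours * cost_per_hour

print(sla_credit)     # 500.0: the most the provider pays out
print(business_cost)  # 16000.0: what the outage actually cost you
```

The "huge discount" turns out to cover a thirtieth of the damage, which is exactly the kind of detail the headline figures omit.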

And this means that, as with early computers, once you are locked into a cloud provider's systems, getting your data out again and into another provider's datacentre will not be easy. And if you cannot threaten to leave, it is going to cost more. So the lack of standards affects the bottom line, something the chief information officer needs to know, but not something you are likely to see remedied in 2010.

6. Desktop virtualisation
This shift will not happen in 2010 — not en masse anyway. Desktop virtualisation, unlike the server equivalent, is complex and expensive. Not only are there many more technology elements to the equation, and so many more ways of achieving similar ends, but desktops are also personal, and people hate change.

So when someone proposes savings by removing the desktops and parking virtual equivalents in the datacentre, ask two questions. How much will it cost the helpdesk to service the large numbers of users flooding the IT department because their desktop is not the same as before? And who will pay for the fall in productivity while they adjust? 2010 is not going to be a year characterised by such risky behaviour.

7. Standalone servers
Say goodbye to the last of those tower cases, probably owned by test and development, sitting in a couple of random racks at the back of the datacentre. They are going away.

Not because they are energy-inefficient (they probably are, though they are likely to be low-powered and few in number), but because virtual machines (VMs) are now so easy that keeping physical towers no longer makes sense.

Industry analysts are ahead of the game and are predicting the advent of unified integrated systems, complete with virtualised architectures. With those systems likely to represent a big upheaval in many datacentres, the odds of getting past the pilot stage in 2010 are not high.

But server virtualisation projects are now steaming ahead, the technology having proved itself, so it is child's play to replace that tower with a VM.
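The energy side of that argument is simple arithmetic. A minimal sketch, assuming illustrative power figures for a handful of towers versus one virtualisation host carrying them all as VMs:

```python
# Illustrative assumptions: six low-powered tower servers versus
# one virtualisation host running the same workloads as six VMs.

towers = 6
watts_per_tower = 150.0      # assumed idle-heavy draw per tower
watts_virt_host = 400.0      # assumed draw of one host carrying all six VMs
hours_per_year = 24 * 365

kwh_towers = towers * watts_per_tower * hours_per_year / 1000
kwh_host = watts_virt_host * hours_per_year / 1000

saving_kwh = kwh_towers - kwh_host
print(round(saving_kwh))  # 4380 kWh a year saved, on these assumptions
```

Even with generous assumptions in the towers' favour, consolidation wins, and that is before counting the rack space and management effort reclaimed.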

Towers? Just say no. Even the test and development team will agree.

8. No more IT specialists
You know how the IT team is generally split up into technology sub-teams? The barriers will start to crumble in 2010, as broader, cross-discipline skillsets are required across the board.

So expect the traditional specialisms of networking, software, storage and servers to merge and then diversify into different skill packages. Perhaps there will be an environmental specialism, one for overall systems management, and another one for security — you already have one of those? Good.

Become one of those specialists and watch your value increase. Either that, or freshly minted graduates, with all those skills bubbling away and ready to go, will come and take your job.

9. An end to data silos
Data silos were abolished when the company bought a SAN and poured all the data into it. Correct? If you are just looking at the datacentre, perhaps that is the case.

But look at it through the eyes of the business analytics specialist, the person who has to answer questions from the top execs who want to know why this or that business unit's profitability has fallen — or increased.

To answer they need access to a vast array of information, much of which is not on the SAN. Instead, it sits on a departmental NAS box, in spreadsheets or Word documents on end users' hard disks, or even on a USB stick somewhere.

You probably cannot centralise enterprise data as fast as end users can spray it around their myriad mobile devices. Data silos will live on.

10. Fat IT budgets
Expect the thin days to continue. Budgets will be available only for projects where payback is near-guaranteed, and fast.

History teaches us that, as the economy slowly recovers, companies rely on freelancers and contractors rather than replenishing their employment pool to previous levels.

The same applies to infrastructure spending. Until the risk of spending on medium to long-term projects is reduced to manageable levels, IT will continue, in the well-worn phrase, to be expected to do more with less. The good news is that new technologies are providing more for less.
