The IT industry has a lot of time for all things virtualised at the moment. But, despite the bold claims made by some vendors, questions remain over the realities of deploying the technology. One key question is whether server virtualisation goes far enough to improve application performance and availability. For many the answer is "no", which is why newer methods, such as application virtualisation, are attracting interest.
The conundrum is that, while IT managers are under pressure to reduce data-centre costs, the number of users and applications keeps growing. The result is that IT is being asked to do more with less. Meeting the service levels associated with more applications and users has always meant more servers. Statically over-provisioning hardware may ensure that business demands are met, but the trade-off is excess capacity that sits idle. Enter virtualisation.
Why settle for one server running at 10 percent utilisation when the business can have four virtual servers and an aggregate utilisation of 40 percent? Why not six virtual servers and a utilisation of 60 percent? Why stop there? Why not go all the way to 10 virtual servers and 100 percent utilisation?
But the irony here is that, in the effort to reduce costs by driving utilisation and reducing footprint (and power consumption, and cooling costs), companies are often overlooking the reasons they spent all that money to over-provision resources in the first place: the applications. Are all of your efforts to virtualise servers and drive utilisation resulting in improved application performance and availability?
Let's be honest, the reason for all of this hardware in the data centre is to support applications. Applications are where it's at. The data centre is here to support the business, which means supporting the applications.
Keeping applications available and performing at their best is central to the IT department's day-to-day work, and applications are the reason the data centre is over-provisioned and under-utilised.
With this in mind, perhaps we should ask: how do organisations reduce the costs associated with over-provisioning and at the same time continue to ensure business agility and application performance?
Server virtualisation is about driving utilisation. It will let us push utilisation to 100 percent if we want to, but does it really improve the reliability and availability of the applications that run your business?
Take a closer look at server virtualisation and you see that these technologies do not:
- simplify the installation and configuration of complex application architectures;
- provision, activate, and monitor individual applications within the operating system; or
- automatically expand scalable architectures to meet increased business demands.
How then does an organisation not only achieve increased utilisation but also increased business agility, improved application performance, and resilience? The answer to this question lies in the emerging area of application virtualisation.
Application virtualisation in the data centre is the process of decoupling enterprise applications from the resources on which they execute. These resources are then aggregated into a shared resource pool and allocated to the applications that need them. All of this activity needs to be based on policies that reflect the current priorities of the business. It is real-time matching of supply with demand.
All resources are available to be allocated based on the minute-to-minute needs of the applications deployed in this environment. The applications most important to the business receive the resources they require, when they require them. When the resources are no longer needed, they are returned to the pool to be utilised by any other application.
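To make the allocate-and-return cycle concrete, here is a toy sketch in Python. It is not any vendor's product, and the class, application names, and slot counts are all invented for illustration; it simply models a shared pool that grants capacity on request and reclaims it on release.

```python
class ResourcePool:
    """Toy model of a shared resource pool: server 'slots' are handed
    out to applications on demand and returned when no longer needed."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.allocated = {}  # application name -> slots currently held

    def free(self):
        # Slots not currently held by any application.
        return self.capacity - sum(self.allocated.values())

    def request(self, app, slots):
        """Grant up to `slots` from the pool; return the number granted."""
        granted = min(slots, self.free())
        self.allocated[app] = self.allocated.get(app, 0) + granted
        return granted

    def release(self, app):
        """Return an application's slots to the pool for others to use."""
        return self.allocated.pop(app, 0)


pool = ResourcePool(capacity=10)
pool.request("billing", 6)    # the priority application takes what it needs
pool.request("reporting", 6)  # only 4 slots remain, so reporting gets 4
pool.release("billing")       # billing finishes; its slots rejoin the pool
```

In a real deployment the `request` step would be governed by business-priority policies rather than first-come-first-served, but the shape of the cycle is the same: demand draws from the pool, and released capacity is immediately available to any other application.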
The results are improved application availability, increased performance, and an operational environment that is more responsive to the needs of the business. And overall utilisation is driven upwards because the work is directed to where the idle servers are. Put it all together and you have a holistic approach to delivering IT resources as a service.
Peter Lee is founding chief executive of DataSynapse.