I recently read some striking information about the changing role of IT in the enterprise: the average annual IT budget, adjusted for inflation, is lower now than it was 10 years ago. Yet 10 years ago we didn’t have smartphones or tablets. We also didn’t have enterprise-wide business mobility, big data, or cloud computing. It’s no wonder datacenter managers are constantly getting hammered on costs and efficiency. I believe this situation will change; IT spending will have to loosen up if companies plan on remaining competitive in a digital world. For now, though, maximizing efficiency and controlling costs are still the only game in town for many IT managers.
Lately, I’ve talked a lot about how virtualization enables you to get more out of a finite set of datacenter resources. But virtualization is only one cloud computing strategy for controlling costs and improving performance. Another is task automation.
Automation is important because some of the greatest costs in enterprise IT come from operational expenditures. A good virtualization strategy lays the foundation for greater task automation, which enables you to do more with the computing resources you have available. How does virtualization make this possible? Through hardware abstraction, virtualization makes it possible to standardize on platforms. This homogenization of the computational environment simplifies task automation. It also helps in the continuous struggle for budget allocations because it’s much easier to make a business case for developing tools that apply to the full portfolio of applications than it is to fund automation of each service individually.
What becomes possible with automation? Here are some of the advantages gained through automation in a virtualized environment:
- Automating common tasks and processes lowers costs, improves service quality, and increases user acceptance by reducing deployment errors.
- Automation greatly reduces the cost of compliance auditing by ensuring that the system adheres to all applicable regulations, standards, and policies.
- Automation enables companies to recoup the cost of technology investments more quickly. It does this by accelerating the speed of new deployments, which not only lowers costs during the deployment process, but also reduces time to realized business value.
- Automation makes it possible for a given architecture to serve more customers at lower cost. This happens through an orchestration engine, which automatically reallocates resources to accommodate changes in demand from users or applications.
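The last point is worth making concrete. At its core, the reallocation an orchestration engine performs is a control loop comparing observed demand against capacity. Here is a minimal sketch of that decision logic; the thresholds, step size, and function name are illustrative assumptions, not any particular product’s API:

```python
def desired_instances(current: int, load_per_instance: float,
                      high: float = 0.8, low: float = 0.3,
                      minimum: int = 1, maximum: int = 10) -> int:
    """Return the instance count an orchestrator would target next.

    `load_per_instance` is a normalized utilization figure (0.0-1.0).
    The thresholds here are placeholders; real engines tune them per service.
    """
    if load_per_instance > high and current < maximum:
        return current + 1   # demand is outgrowing capacity: scale out
    if load_per_instance < low and current > minimum:
        return current - 1   # capacity is idle: scale in and free resources
    return current           # within the comfort band: leave allocation alone
```

An engine evaluates a rule like this on a timer for every service in the portfolio, which is how one architecture can serve more customers without a human reassigning hardware.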
The most basic way to automate is to create scripts that customize and configure individual virtual machine instances as part of the provisioning process. This may be sufficient for autonomous systems that do not rely on the configuration of other systems. Once multiple machines connect as part of an n-tier application, you need to coordinate their configuration, which usually involves building a workflow.
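For the single-machine case, the provisioning-time customization described above amounts to stamping per-instance values onto a shared template. This sketch shows the idea; the field names and the `provision` helper are hypothetical, standing in for whatever your image or configuration tooling actually uses:

```python
def provision(hostname: str, role: str, template: dict) -> dict:
    """Build one VM instance's configuration from a standardized template.

    The template holds the settings common to the whole platform; only the
    per-instance identity and role are layered on top.
    """
    config = dict(template)        # copy, so the shared template stays pristine
    config["hostname"] = hostname  # per-instance identity
    config["role"] = role          # e.g. "web", "app", "db"
    return config

# Illustrative values only:
base = {"ntp": "time.example.com", "dns": ["10.0.0.2"]}
web1 = provision("web1", "web", base)
```

Because every instance starts from the same template, the script stays trivial; the complexity only appears when instances must be configured in a coordinated order.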
A workflow is a sequence of order-dependent steps that must be synchronized across multiple systems. Ideally, it is resilient to transient problems in the infrastructure, such as individual component failures or brief periods of disconnection. In complex scenarios the program logic can become sophisticated enough to strain the limits of simple scripts, at which point a dedicated workflow engine is worth considering. In fact, if you have many complex workflows, you should look at a complete orchestration system.
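The two properties that matter, ordered execution and resilience to transient faults, can be shown in a few lines. This is only a sketch of the pattern a workflow engine generalizes (real engines add persistence, timeouts, and compensation); `TransientError` and the step functions are illustrative:

```python
import time

class TransientError(Exception):
    """Raised by a step when the failure is expected to be temporary."""

def run_workflow(steps, retries=3, delay=0.0):
    """Run steps strictly in order; retry each on TransientError.

    A step that keeps failing after `retries` attempts aborts the workflow,
    surfacing the error rather than silently skipping a dependent step.
    """
    results = []
    for step in steps:
        for attempt in range(retries):
            try:
                results.append(step())
                break                    # step succeeded: move to the next one
            except TransientError:
                if attempt == retries - 1:
                    raise                # retries exhausted: fail the workflow
                time.sleep(delay)        # back off before trying again
    return results
```

Even this toy version shows why scripts run out of road: once you add per-step retry policy, ordering guarantees, and failure handling, you are effectively writing a workflow engine.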
Virtualization is a key component of an on-premises environment not because it is the objective, but because it is an enabler of an architecture that lends itself to further optimization. Automation is the logical next step in the effort to drive down cost while also creating a more flexible and responsive infrastructure.