Virtualization is a way to abstract applications and their underlying components away from the hardware supporting them. The goal is usually one of the following: higher performance, scalability, or reliability, or the creation of a unified security or management domain. How does this relate to access to applications?
In the past, applications were written to support a specific type of user access device - usually a terminal of some sort and later on a PC running terminal emulation software.
As PCs became more powerful, portions of applications were implemented on the PC and client/server computing was born. At first, only the user interface moved. Later on, quite a bit of the application migrated to the desktop or portable system.
While this improved the responsiveness of applications and offloaded expensive servers, it created other problems in the areas of security, system management and compatibility. Individuals often wanted to adopt the newest technology before the organization's IT staff was really ready. Critical data resided on self-managed systems and might not have been backed up. Users, unaware of or unwilling to follow security precautions, could introduce viruses, worms or other risks into the corporate network.
To address these problems, some organizations re-centralized these applications or application components onto systems located in corporate or regional data centers. Users could then reach these applications using virtual access software.
Virtual access software offered a number of important features to organizations, including the following:
- The systems were backed up as part of the organization's normal system administration functions. Important data would not disappear or be misused if an individual misplaced or damaged a local system.
- Users were unable to introduce viruses or worms into the corporate network because only the user interface was running locally. This reduced the risks the organization faced and lowered IT costs. After all, if the IT staff doesn't have to spend time chasing worms and viruses, the organization can invest its budget in other things.
- The desktop, laptop or handheld system that supported the user interface didn't have to run the same operating system as the one supporting the applications. This allowed the organization to provide the same applications to everyone, regardless of whether an individual's chosen device was a PC, a Mac, a Linux system, or a handheld such as a Treo or a BlackBerry.
Some organizations have installed special-purpose software such as Citrix MetaFrame or Windows Remote Desktop; others have chosen a web browser as the remote access mechanism and have moved toward a Web-based environment.
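The Web-based approach can be sketched in a few lines. The following is a minimal, hypothetical illustration (using only Python's standard library, not any real product's API) of the thin-client principle described above: all application logic and data stay on the server, and the browser receives nothing but rendered HTML. Names such as `render_report` are illustrative assumptions, not part of any actual system.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_report(username: str) -> str:
    # Server-side logic: the data and the computation never leave the
    # data center; only the finished page is sent to the user.
    return f"<html><body><h1>Report for {username}</h1></body></html>"

class ThinClientHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The client receives rendered HTML only -- no application code
        # and no corporate data are stored on the local device.
        body = render_report("demo_user").encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Any browser on any operating system can act as the access device.
    HTTPServer(("127.0.0.1", 8080), ThinClientHandler).serve_forever()
```

Because the browser is only a display surface here, the same benefits listed above apply: backups, patching and virus control all happen centrally, and the client operating system is irrelevant.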
Which approach does your organization use? Have you personally been happy with this approach? If not, what would you do to improve it?