What ever happened to desktop virtualization?

Summary: Virtualizing desktop environments and applications for safe, secure, flexible delivery to many types of access point devices has been on the edge of the IT market for years. Why hasn't it taken over?

TOPICS: Virtualization

Back in 2008, I published a column that asked the question, "Why bother with desktop virtualization?" For some reason, I've been hearing questions about the same technology here in 2014.

What is desktop virtualization?

Virtual desktops, the combination of virtual access, application virtualization and virtual machine software, have been heavily marketed for years. One would think that the promise of lower operational and administrative costs, combined with the potential for higher levels of both availability and security, would have propelled everyone into working that way. The truth, however, is that rather than adopting virtual desktops, individuals have instead adopted other technologies, such as apps on their smartphones or tablets, or have accessed applications using Web-based methods.

Is this the best approach to deliver productivity or business applications?

It is clear that managing desktop systems, dealing with both operating system and application updates, helping folks deal with things that went bump in the night, and handling issues such as security for mobile systems are costing organizations more time and money than they'd really like to invest. Much of this is being discussed in the industry under the headings of "mobility" or "BYOD." Desktop virtualization offers the hope of getting all of this under control.

What gets in the way?

As mentioned in other commentaries and posts, here are a few of the issues that get in the way, in no particular order.

  • Today's approach to desktop applications (i.e., applications and operating systems running directly on physical systems) is workable, albeit complex and expensive. The first golden rule of IT (see Reprise of the Golden Rules of IT for more detail on the golden rules) says "if it is not broken, don't fix it." So, companies continue to use this approach because it is workable.
  • Most desktop virtualization approaches require some changes to the operation of desktop systems, applications and the like. The second golden rule of IT is "don't touch it, you'll break it." Since companies' back-end systems are so complex, many have chosen to leave well enough alone on the front end.
  • The performance characteristics of virtualized systems differ from those of physical systems, and while those differences get smaller with each generation of hardware and virtualization technology, they often lead to support issues. While these can be minor on a person-by-person basis, larger organizations can see them add up into a significant investment in support time.
  • People have come to see the machine issued by the company as their tool. They've often customized their systems to better fit their working style and their personality. Some of these changes would be problematic in a virtual world. The problem, by the way, isn't the customization itself; it is the approaches used to move people's work environments from the physical to the virtual. It is far easier to move standardized work environments than a whole herd of one-off environments, so IT administrators often try to use the migration as a way to lasso users and drag them toward a single standard environment. People don't like being lassoed and dragged.

Citrix, Microsoft, Parallels and VMware have been offering tools to make migrations from physical to virtual much easier. From time to time, I've tried these solutions, only to find that I couldn't make them work within the time I could spare to play.

The human element comes into play

Many of the issues, though, revolve around human factors rather than technology. It is quite possible that the trend toward cloud-based personal productivity applications, such as email, document management and calendar management, might make it easier for organizations to move to virtual environments.

What's really happening today?

What appears to be happening instead is that individuals have chosen a different approach: they've asked their companies to encapsulate workloads and offer them as iOS or Android apps, or to make those applications accessible through internal Web sites.

What is your company doing along these lines? Is it pushing desktop virtualization or turning towards offering mobile or web-based access?


Daniel Kusnetzky, a reformed software engineer and product manager, founded Kusnetzky Group LLC in 2006. He's literally written the book on virtualization and often comments on cloud computing, mobility and systems software. In his spare time, he's also the managing partner of Lux Sonus LLC, an investment firm.


  • Because it's slow

    The UI of a virtual desktop is sluggish, poorly drawn, and unresponsive compared to a native desktop. I don't know of any client technology - not Citrix, not VNC, not RDP - that truly solves this.

    As much as there are some IT cost savings here, it all adds up to a big drain on productivity.
  • Slow and totally dependent on connection

    I lived through the days when you had a printing or CRT display terminal on your desk (or on a table in a mass time-sharing room) that was connected to a host computer -- most of the time. When the connection was broken, accidentally or by management order, you lost an unspecified part of the job you were running, you lost the file you were editing (at least back to the last interim SAVE), and you could do no work -- except that, if you were working on a printing terminal, you could take the printout to your desk and write on it. All you had was the current screen displayed, and even that would be cleared when the system came back up, at the latest. (In some of those environments, the backup CPU for the critical online system was used as the batch and programming support system, so you never heard the PA system call IBM to the ONLINE 195 -- it was always to the OFFLINE 195, the one that HAD been online until about 10 seconds ago!)

    Applications running on your physical machine can do some of the work, which does not absolutely require exchanging data with others, while waiting for the network (or the server containing the shared files) to come back up.
    • RDP works quite nice

      at anything better than 256kbps. The more the merrier. But at 256kbps you can get all your serious work done.
  • Yes, what happened to virtualization

    Back in 2008 I thought Azure was desktop virtualization. I looked and looked and discovered it was just enhanced web hosting.

    The most useful business app on a tablet or large phone is an RDP client. Once you get a Windows desktop, you get everything - Office, Outlook, and what have you. There is no need for Microsoft or other Windows application developers to webify their apps. And RDP enjoys the advantages of central management and true mobility. Blast your tablet into pieces with a sub-machine gun? Find another tablet and continue at the exact mouse position where you left off.

    Every tablet is potentially another Windows CAL.

    Note, however, that the limitations of a tablet, e.g. poor typing, are still there.
  • Actually, the number 1 complaint I hear is...

    the fact that most environments aren't touch friendly at this point. Users want to access these environments from their tablets, but when they do, they grow frustrated because they're using some app that was built with a mouse and physical keyboard in mind rather than a touch interface.
    • You can VDI W8

      I have a VDI of W8 spinning and access it with my RT device. I sometimes have to stop and remember which platform I am on, because on a good connection the touch UI on the VDI is very good.
      Rann Xeroxx
  • Cost, cost, cost, cost

    The problem is the cost of implementation and maintenance. First of all, currently every new PC receives a "free" Windows license as part of the purchase. Any IT shop can service these using their existing infrastructure of servers - heck, even "old" servers running 2005/2008 can easily accommodate hundreds of users. Now walks in the "virtualization" expert, and the first step is to buy some pretty high-end server hardware (we're talking 128 GB of RAM or more, not to mention some pretty high-end NAS for all those desktop settings). Next, all those new PCs you just bought - guess what, every single one of them needs a VMware license, plus you need VMware licenses for every processor (real or virtual, who knows anymore?) on the server. Then, just when you think you're done shopping for software, Microsoft mentions that you still need to purchase full versions of their OS - for every client system. You might not need to buy one for every physical system (how many users log in at the same time again?), but you're sure not paying the Lenovo/Dell/Acer/HP OEM price per copy either. By the time you're done, you're mentally exhausted, psychologically drained, not to mention you just blew way past your budget two-fold. And all that... JUST TO HAVE WINDOWS RUNNING ON YOUR USER DESKTOPS. Don't get me started on installing Office and all the licensing issues that come with that.

    I will never understand how VMware goes around telling everyone that their VM desktops are a cost savings. My little spreadsheet told me that the initial implementation is easily 3x higher in cost than upgrading regular (non-VM) servers and desktops, and that I'll never recover that cost if I ever want to upgrade my users again. In the end, it will increase your REAL DOLLAR COST, not deliver "imaginary savings." It might save you labor, but you sure pay, pay, pay all that savings right back in license fees. The first one with their hands out will be VMware, at $2,500 per server core just to run their software, never mind the user license fees, etc. Gotta love the VMware "antivirus/protection" - ~$100 per virtual system?!?! Sheesh, at that rate I can buy four (non-VM) antivirus licenses for three years for each desktop!
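
    For what it's worth, here is roughly what that little spreadsheet does. Every figure below is an illustrative placeholder, not a vendor quote - only the $2,500-per-core and ~$100-per-VM numbers are taken from this comment; the rest are assumptions:

```python
# Illustrative back-of-envelope comparison: traditional PCs vs. VDI.
# Every figure here is a placeholder assumption, not a vendor quote.

SEATS = 200

def physical_cost(seats):
    pc = 600        # assumed PC price, OEM Windows license bundled in
    av = 25         # assumed per-desktop antivirus license, per year
    years = 3
    return seats * (pc + av * years)

def vdi_cost(seats):
    host = 15_000       # assumed high-memory server (128 GB RAM class)
    hosts = 2           # assumed number of hosts for this seat count
    storage = 20_000    # assumed shared storage for desktop images
    hyp_core = 2_500    # per-core hypervisor license (figure cited above)
    cores = 16          # licensed cores per host
    vdi_av = 100        # per-VM protection suite (figure cited above)
    client = 400        # assumed thin client / client-access device
    vda = 100           # assumed per-seat desktop-access license, per year
    years = 3
    return (hosts * (host + hyp_core * cores) + storage
            + seats * (client + vdi_av + vda * years))

p, v = physical_cost(SEATS), vdi_cost(SEATS)
print(f"physical: ${p:,}   VDI: ${v:,}   ratio: {v / p:.1f}x")
```

    With these made-up numbers the VDI build-out lands at roughly double the cost of conventional desktops before any labor savings are counted; swap in real quotes and the ratio moves, but the per-seat license line items are what dominate.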

    At those prices, Virtual Desktops are STILLBORN.
    • Perfectly Captures the issue!

      Thanks for perfectly elucidating the issue!
    • A way around it...

      You can run Hyper-V on a cluster and pay only for the server licenses. Then, with the desktop VDIs, just connect to them with Surface 2/RT devices. You get a free VDI connection license with RT. As far as at the desk, place inexpensive 22" or 27" monitors to connect those RT devices to, along with a mouse and keyboard.

      Not the most elegant of solutions, but it works. Frankly, the on-prem VM farm will soon be a thing of the past, as AWS is now selling Windows Terminal Services sessions and Mary Jo F. notes a rumored project to start spinning up workstations on Azure for a full Windows desktop. Tie your Azure environment directly to your LAN to create a WAN and you can spin up and kill VDIs and manage them without overhead and capital expense. This can also help with BYOD laptops, Chromebooks, or whatever.
      Rann Xeroxx
  • Why it's broke...speed.

    I use RDP every day to connect via Gateway Services to my work desktop (a physical box). It is FINALLY fast enough, using off-the-shelf affordable Internet access at home, to be really productive. But that is RDP->Gateway->Physical Desktop. Even then, sometimes (especially when manipulating menus with the mouse) it can have a lag that gets to be a pain. But here is where virtualization makes it fail: I have to get to a virtualized server from that desktop for some tasks. Taking the hit of a second RDP into the virtual machine, from the desktop I already RDP-ed into, is almost unusable. It can be done, but to do it for long periods only patient souls need apply - especially when you know what you had. And that's the hit - maybe if that is how you started, it would be passable. But if not, then you will be busting down the doors of the systems guys (who are doing all this because they lust for the days when they had one big machine to rule and full control) and pitching them and their VMs out the nearest high window.

    Now if they could get the speed right, there may be something. But when we are in a world where even phones aren't fast enough year over year - fat chance!
  • Did you miss Amazon WorkSpaces and AppStream (hosted VDI)?

    Running the hosting center for VDI is not simple. You didn't mention the Amazon WorkSpaces and AppStream announcements last fall? I'd like to see how these fare in the market - a hosted solution seems to make a lot more sense...
    • Seeing for ourselves...

      I invited an AWS rep to visit us next week and walk our enterprise through their off-prem solutions. IBM also has about the same solution. If I am not mistaken, these are Windows Terminal Services sessions, as Server is currently the only Windows product licensed to be served this way by a 3rd party.
      Rann Xeroxx
  • I use desktop virtualisation ...

    I run Lubuntu natively on my laptop and have VirtualBox installed, on which I have my Windows 7 corporate OS, a Windows XP OS with some emulation apps, and a Windows XP OS with iTunes, plus a few VMs I create from time to time to test things like ChromiumOS. Works pretty well for me - I know I'm taking desktop virtualisation, as per this conversation, on a tangent - but as a Windows admin who's Linux-curious and wants an OS he can boot quickly for web browsing (Lubuntu), I find it invaluable.
    • Well what happens if....

      One of your virtual Windows machines gets a virus and trashes Windows? You're dead in the water! Unless you have another copy of that Windows VM. Hmmm. Interesting. If you do indeed keep a backup VM on the ready, it sounds like you're a pretty good engineer.
      • Mac

        Well, on a Mac, I use Parallels to get access to Windows 7, 8, a couple of Linuxes, etc. And yes, I have backups of every VM because it's easy to do this on a Mac with Time Machine.

        The performance of every one of the VM's, even for processor-intense tasks, is good. This is definitely better than multi-booting a machine with each of the OS's installed, even though letting each run native is faster (been there). It's just not a whole lot faster, and the ability to run multiple OS's simultaneously is certainly faster (with one box) than rebooting it.
      • If you have Windows virtualized...

        Just set it to snapshot every night or so and keep a scrolling window of snapshots. That way, even if the session gets so infected that it affects Restore Points, you just roll back to a previous snapshot.
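
        The "scrolling window" is just a retention policy. Here is a minimal sketch of the pruning half, assuming nightly snapshots named by date (the naming scheme is hypothetical, and actually deleting a snapshot would be a call out to your hypervisor's own tooling):

```python
# Keep the N most recent nightly snapshots; report the rest for pruning.
# Snapshot names are hypothetical; deletion itself is left to the hypervisor.
from datetime import date, timedelta

KEEP = 7  # scrolling window: the last 7 nightly snapshots

def prune(snapshots, keep=KEEP):
    """Split snapshot names like 'nightly-2014-01-31' into (kept, pruned)."""
    ordered = sorted(snapshots, reverse=True)  # ISO dates sort lexically
    return ordered[:keep], ordered[keep:]

# Two weeks of nightly snapshots as an example
names = [f"nightly-{date(2014, 1, 1) + timedelta(days=i)}" for i in range(14)]
kept, pruned = prune(names)
print("keeping", len(kept), "snapshots, newest:", kept[0])
print("pruning", len(pruned), "old snapshots")
```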

        I expect that with Windows 9 this might become mainstream on the PC. This can already be done with the Windows Enterprise edition, and we see the Xbox One with a Hyper-V layer that the OSes ride on.
        Rann Xeroxx
      • @SteveAbaby


        Since I run the host OS as Lubuntu and all my VMs are isolated from each other, I don't foresee that problem. I have to admit I have managed to lock the laptop up once or twice, meaning I've had to hard-boot the Lubuntu host and then boot my Windows OS back up, but it's been fairly infrequent (twice a year at most?) - certainly less often than I used to cause Windows to collapse in on itself.

        However, I do maintain a copy of my Windows OS on backed-up remote storage and also as someone suggested below me snapshot and roll back too. Part of my role entails me finding solutions to issues such that I need to install all sorts of software, replicate failures and then resolve the issue. I was tired of having to constantly rebuild systems from images or constantly run third party clean-up utilities.

        My setup enables me to have a 'golden' installation, mess around with it, then snapshot back afterwards to its ideal state. I take a new snapshot after intentional changes such as OS patches, and then of course make a new off-disk copy for fallback.

        I also set up read-only mounted shared folders in VirtualBox so that each virtualised OS can share data if need be. I used to run a single Windows OS on a C:\ partition and have business data on D:\, music on E:\, video on F:\ and so on; now I have an OS partition and an EXT4 data partition with root folders called 'Data', 'Music', etc., and I can just share these with any VM I build. One of my VMs, as mentioned, runs nothing but iTunes (and MediaMonkey) and is very stripped down. If I want to listen to music, I boot Lubuntu in a few seconds, resume my 'Music' VM and away I go - unencumbered by all the fluff and boot delays of powering up my work OS, which has about 15 system-tray applications that autostart (you know the kind of thing - the printer system-tray app, the Java quick-start app, some other business tools, corporate AV, the corporate remote-desktop tool, etc.).

        The bonus is also that, if my laptop ever dies, I'm only a short time away from getting up and running again; I just need to copy my image to a new system. Effectively, as long as a remote location can run VirtualBox on a system and has an Internet connection, I can grab a copy of my corporate OS (on which I've installed my VPN software) and gain connectivity to work, with all my business apps, within 20 minutes.

        I also have the benefit of being able to go through the proxy Servers at work (as we're supposed to) for Internet access inside my VM's, but at the same time can point my native Lubuntu Host to a guest WiFi spot and have access to the unfiltered Internet outside of my corporate data - simultaneously.

        The purposes for use are endless.

        Incidentally, I chose VirtualBox as it seems to be able to run under Linux and Windows, and have a small footprint. It therefore doesn't tie me into any particular ecosystem and I can run my OS's hardware agnostic.

        Hope the above helps :o)
    • Lubuntu

      That's a new distro to me. How does it compare to Xubuntu?
      Rann Xeroxx
      • Lubuntu vs Ubuntu


        I'm not too familiar with either distribution by way of comparison, as I'm from a primarily Windows background, but I've found Lubuntu to be very lightweight, with enough capability to browse the web and run VirtualBox competently - the two purposes of it being my boot-up OS.
  • One word: Bandwidth + Two Words: Web Applications

    Here is another important word: Economics

    As someone who is intimately familiar with this topic, I can tell you that desktop virtualization does not have legs.

    If you are putting together a call-center or looking to upgrade your company infrastructure to be more flexible and secure, then the answer is simple: Web Enable Your Applications

    It will be way, way cheaper than "virtualization"

    Desktop Virtualization is like a treadmill: it is seductive to see a demo and think, "wow, this is easy."

    But it isn't. Demos cannot capture the problems you will encounter - bandwidth is just one of the most important - and it is also important to remember that "Legacy Apps" based on Windows are THE PAST, and keeping that past alive will cost you in every way.

    Meanwhile, Web Apps are cheaper to implement and have none of the "scalability" issues that virtualization does.

    Desktop Virtualization was and is hyped by hucksters looking to make money off YOU, the mark. But think about how this idiot technology works for a second! You will run a "virtual Windows session" on an EXPENSIVE server in some back room of your office, and to access this "Desktop" you will need a special Windows client that is just a watered-down version of Windows and costs around $400! Yipes! The costs keep mounting too!

    Meanwhile, a meager investment in "porting" your internal apps to Web Apps could have saved you big $$$.