Why all the recent focus on "desktop virtualization"

Summary: It seems that just about every day I hear from a newcomer to the virtualization software market that has focused its time and creativity on "desktop virtualization." I've got one or two posts to write on companies of this nature and still more are appearing.

It seems that just about every day I hear from a newcomer to the virtualization software market that has focused its time and creativity on "desktop virtualization." I've got one or two posts to write on companies of this nature and still more are appearing. Some are focused on accessing applications and data remotely. Some are focused on encapsulating and delivering applications. Others are focused on encapsulating an entire desktop environment and making it available remotely. The key question on my mind is "why?"

It's clear that long-time members of the IT community remember the days before PCs became the standard way of accessing applications and data. People had access to block mode terminals or character cell terminals. Once these devices were installed on a person's desk, they often worked for a decade or more before needing to be replaced. They didn't create an avenue for worms, viruses or other mischievous software to get into the corporate network. They didn't require extensive staff training, nor did they require frequent updates that proved incompatible with current programs, processes or procedures.

The suppliers of PCs and PC software convinced organizations that all of the graphical content, animated screens and local computing power would be a benefit, would reduce overall costs and make life simpler for the IT department.

Now, all these years later, it's becoming increasingly clear that while people need access to desktop computing, they may not need a general purpose desktop computer as the mechanism to get to that computing.

If blame for this is to be laid at anyone's feet, it's clear that the suppliers of desktop operating systems, desktop productivity applications and incompatible hardware platforms all deserve an equal share.

Now that the pain is well known throughout the industry, virtualization technology has matured to the point where workable solutions can be constructed, and networking has reached levels of price and performance that make it reasonable to move desktop computing off of the desktop and back into the datacenter, clever people are creating solutions and bringing them to market.

Who would you blame for the need to go this route?

Topics: Hardware, Virtualization

About

Daniel Kusnetzky, a reformed software engineer and product manager, founded Kusnetzky Group LLC in 2006. He's literally written the book on virtualization and often comments on cloud computing, mobility and systems software. In his spare time, he's also the managing partner of Lux Sonus LLC, an investment firm.

Talkback

9 comments
  • Local computing

    "The suppliers of PCs and PC software convinced organizations that all of the graphical content, animated screens and local computing power would be a benefit, would reduce overall costs and make life simpler for the IT department."

    It's not just about pretty screens - it's about speed. Locality is a primary concept of computing, meaning that the closer the information is to the CPU's core, the faster you're going to be. Getting data over the network is much slower than getting the data from the cache of a CPU or from the memory of the local machine.
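
    As a rough, self-contained illustration of that gap, here is a minimal Python sketch: it times touching data that already sits in this process's memory against fetching the same 4 KB over a socket on the loopback interface, where no real network is even involved. Exact figures vary by machine; the order-of-magnitude difference is the point.

        import socket
        import threading
        import time

        PAYLOAD = b"x" * 4096          # 4 KB of data the "client" wants
        ROUNDS = 10_000

        def serve(listener: socket.socket) -> None:
            # Answer each request with the full payload.
            conn, _ = listener.accept()
            with conn:
                for _ in range(ROUNDS):
                    conn.recv(16)
                    conn.sendall(PAYLOAD)

        # Tiny server on the loopback interface, running in a background thread.
        listener = socket.socket()
        listener.bind(("127.0.0.1", 0))
        listener.listen(1)
        port = listener.getsockname()[1]
        threading.Thread(target=serve, args=(listener,), daemon=True).start()
        client = socket.create_connection(("127.0.0.1", port))

        # Local access: the data is already in this process's memory.
        local_copy = bytes(PAYLOAD)
        start = time.perf_counter()
        for _ in range(ROUNDS):
            _ = local_copy[0] + local_copy[2048] + local_copy[-1]
        local_us = (time.perf_counter() - start) / ROUNDS * 1e6

        # "Remote" access: request the same 4 KB over the loopback socket each time.
        start = time.perf_counter()
        for _ in range(ROUNDS):
            client.sendall(b"get")
            received = b""
            while len(received) < len(PAYLOAD):
                received += client.recv(65536)
        remote_us = (time.perf_counter() - start) / ROUNDS * 1e6

        print(f"already in memory : {local_us:8.2f} microseconds per access")
        print(f"over loopback     : {remote_us:8.2f} microseconds per access")

    On a real LAN or WAN, transmission and propagation delays widen that gap further.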

    "Now, all these years later, its becoming increasingly clear that while people need access to desktop computing, they may not need a general purpose desktop computer as the mechanism to get to that computing."

    How so? We do so many things with local computing that it's difficult to imagine any of it being possible without a general purpose computer.

    Personally, I just use desktop virtualization to test web pages on different versions of IE that normally cannot exist on a single PC. I don't use it to access stuff remotely, and frankly I don't see the need to have my computing anywhere else other than my local machine.
    CobraA1
    • Locality of [u]what[/u]?

      [i]It's not just about pretty screens - it's about speed. Locality is a primary concept of computing, meaning that the closer the information is to the CPU's core, the faster you're going to be. Getting data over the network is much slower than getting the data from the cache of a CPU or from the memory of the local machine.[/i]

      The presumption in desktop computing is that the bottleneck is between the processor and the user. In an enterprise situation, where data integrity requires that the authoritative data be on NAS, this isn't a sure thing.

      It may well be much more important to have the processing reside next to the NAS on a short but blazing fast fiber net, with the (relatively) low-bandwidth user interface operating over a less demanding gigabit network link.

      The PC's local user I/O architecture predates even 10 Mb/s local network bandwidth. For comparison, the PCI bus has less bandwidth than a current 1 Gb/s network -- so the display bottleneck isn't what it used to be.
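
      As a back-of-the-envelope check on that comparison (assuming the classic 32-bit, 33 MHz PCI bus, since the bus generation isn't specified above), the theoretical peak figures land in the same ballpark:

          # Theoretical peak bandwidth; real-world throughput is lower for both.
          pci_bits_per_s = 32 * 33_000_000            # 32-bit bus at 33 MHz: ~1.06 Gb/s
          gige_bits_per_s = 1_000_000_000             # gigabit Ethernet: 1 Gb/s

          print(f"PCI (32-bit/33 MHz): {pci_bits_per_s / 1e9:.2f} Gb/s, "
                f"~{pci_bits_per_s / 8 / 1e6:.0f} MB/s")
          print(f"Gigabit Ethernet   : {gige_bits_per_s / 1e9:.2f} Gb/s, "
                f"~{gige_bits_per_s / 8 / 1e6:.0f} MB/s")

      Either way, a classic PCI bus and a gigabit link move data at comparable rates, which is the point: the path to the user's screen is no longer the obvious weak spot.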

      As solutions to problems such as user interface latency reach the "good enough" point, design objectives shift to addressing other issues -- such as system maintenance costs.
      Yagotta B. Kidding
      • Not just UI

        "It may well be much more important to have the processing reside next to the NAS on a short but blazing fast fiber net, with the (relatively) low-bandwidth user interface operating over a less demanding gigabit network link."

        Did I say anything about the UI?

        Besides - what happens if everybody tries to access the server simultaneously? The situation you describe obviously involves multiple, even many, clients. If your UI is over a network, then you may be sending constant UI updates to hundreds of people.

        One of the beauties of having local computing is that network bandwidth needn't be wasted on UI at all. Just send the small portion of data that the user needs at the moment, and let the user's machine take care of the UI.

        The concept can be applied to general purpose computing as well: If you can do some computation on the local machine without adding a burden to the network, why try to do it over the network?

        "The PC's local user I/O architecture predates even 10 Mb/s local network bandwidth. For comparison, the PCI bus has less bandwidth than a current 1 Gb/s network -- so the display bottleneck isn't what it used to be."

        The PCI architecture is considered outdated and has generally been replaced by PCIe, which is capable of 8 GB/s - more than enough for gigabit Ethernet.

        "As solutions to problems such as user interface latency reach the "good enough" point, design objectives shift to addressing other issues -- such as system maintenance costs."

        I imagine maintaining a large corporate network complete with NAS, fiber networks, and centralized computing systems is pretty costly. A few PCs on a basic network, on the other hand, are very cheap.
        CobraA1
  • RE: Why all the recent focus on

    Mr K,

    Give us a good post on Endeavors Technology. A company offering the whole virtualization package, with decent patents and a growing army of supporters, Blackhawk etc etc... What do you make of them, where do you think they are going, and do you think they will be 'acquired'? They are still a sole entity for now and a bargain to boot when compared with the other players. Roughly a £10m cap at the moment, but surely an offer could be made which might please the majority of the massive group of shareholders?

    MM
    MegMog
    • When they do something interesting...

      I'm actively following over 100 companies in the virtualization software space. I do my best to post something interesting on all of them from time to time. If you check, I've posted a number of articles on Endeavors Technologies.

      When they do something interesting or I have a chance to chat with one of their customers, you'll see more posts. Until those things happen, I'm engaged in speaking to all of the others in the industry.

      Dan K
      dkusnetzky
  • Sure sounds like

    X11 to me. You know, that hopelessly obsolete network-transparent keyboard/mouse/display/sound thing that the Unix people were doing twenty-some years ago?
    Yagotta B. Kidding
    • I sure do remember X11

      I was with Digital Equipment at the time X11 was released. It was, as old timers know, one of the outcomes of Project Athena at MIT. If I remember correctly, Digital, IBM and Intel were primary sponsors of that project.

      Quite a few groundbreaking technologies came from that project. X11 was just the beginning. The Kerberos security service and the Hesiod naming and directory service were also important byproducts of that project.

      Thanks for bringing X11 up. It clearly is one of the forerunners of desktop virtualization and shows that virtualization technology is nothing really new.

      Dan K
      dkusnetzky
  • RE: Why all the recent focus on

    Well, I find virtual desktops very useful - unlike some readers here, I like to be able to work from home or indeed anywhere I happen to be. This is not possible through normal web access, as some of the systems I need to use are only available inside the firewall at work and do not have open URLs. The advent of web-based services that can get past the firewall has changed my life for the better.
    cymru999
  • Someone Got Fixated on Re-usable Code?

    But then everyone else discovered that the 're-usable' bits did not do things the way they wanted them done... probably because it's too much like hard work to work out what it was doing in the first place because the..... As a result they 'adapted' the original re-usable bits and wrote over the originals and.... whoops.

    So virtualization is some sort of stop gap to avoid or overcome the 'legacy' issues by dragging in the stuff you know works with your bits and.... localizing the legacy to your own implementation... assuming you are going to be able to control that?

    Has someone put the Reverse Thrusters on?

    'I say Tank Driver, you are about to reverse over CommonDialog.DLL!!'

    'SIR YES SIR!!!!'


    Keith
    Keith Mallen