During a recent long conversation, a Kusnetzky Group client presented me with the notion that Web 2.0 applications are a form of desktop virtualization. I found myself "not in alignment" with this notion. What do you think?
According to Wikipedia, Web 2.0 sites typically use one or more of the technologies shown below:

[Image: Wikipedia's list of technologies used to create Web 2.0 applications.]
Since these applications rely on a network connection to reach the worldwide web, and on sufficient bandwidth to download all of the necessary application code and data, this approach is not workable for folks who are "off the grid," that is, in places where network access is not available.
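To make that dependency concrete, here is a minimal sketch in Python (the URLs are hypothetical stand-ins for a real application's bundle and data): a Web 2.0-style client has to pull both its code and its data across the network before anything can run, so losing the connection means losing the application entirely.

```python
import urllib.request
import urllib.error

APP_URL = "https://example.com/app/bundle.js"   # hypothetical application code bundle
DATA_URL = "https://example.com/app/data.json"  # hypothetical application data

def load_web20_app() -> bool:
    """Fetch the application code and its data; both must succeed before anything runs."""
    try:
        code = urllib.request.urlopen(APP_URL, timeout=5).read()
        data = urllib.request.urlopen(DATA_URL, timeout=5).read()
    except urllib.error.URLError:
        # "Off the grid": no network connection means no application at all.
        return False
    print(f"Downloaded {len(code)} bytes of code and {len(data)} bytes of data")
    return True

if __name__ == "__main__":
    if not load_web20_app():
        print("No network connection -- the application cannot run.")
```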
This is not all that much different from using access virtualization in the form of a presentation manager, such as Citrix's XenApp or Microsoft's Windows Terminal Services.
Access virtualization technology is inserted into the environment, where it "grabs" the user interface component of the operating system and encapsulates the screen data and the whole interaction with the user. This makes it possible for the user to have the complete experience of running an application, or perhaps the whole desktop, offered by that remote system as if it were running on their own machine. The applications are not changed to make this magic work; they use the same operating system interfaces that local applications use.
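A toy sketch of that interaction follows, assuming nothing about any particular product's wire protocol. The "server" side stands in for the component that encapsulates captured screen data and ships it across the connection, while the user's input travels back and is replayed against the unmodified application. The frame contents and event format are invented for illustration.

```python
import socket
import struct
import threading

def server(conn: socket.socket) -> None:
    # Stand-in for the captured screen data of the remote session.
    frame = b"<pixels of the remote desktop>"
    # Send the encapsulated screen update: a length prefix, then the pixels.
    conn.sendall(struct.pack("!I", len(frame)) + frame)
    # Receive one input event from the user and replay it through the same
    # interfaces a local keyboard would use; the application is unchanged.
    event = conn.recv(1024)
    print("server: replaying input event:", event.decode())
    conn.close()

def client(conn: socket.socket) -> None:
    # Read the length-prefixed frame and "display" it on the local device.
    (length,) = struct.unpack("!I", conn.recv(4))
    frame = conn.recv(length)
    print("client: drawing frame:", frame.decode())
    # Send a keystroke back to the remote session.
    conn.sendall(b"KEYPRESS a")
    conn.close()

if __name__ == "__main__":
    a, b = socket.socketpair()  # stands in for a real network connection
    t = threading.Thread(target=server, args=(a,))
    t.start()
    client(b)
    t.join()
```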
Application virtualization technology encapsulates an application and allows it to run in an enhanced environment that may provide greater performance, scalability, or isolation from the rest of the environment. Once an application has been encapsulated in this fashion, it may be delivered to the target device in several ways: through an installation, by copying an executable image to the remote device and running it, or by streaming the encapsulated application down to the remote device.
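Here is a minimal sketch, again in Python, of the streaming variant; the packaging format, chunk size, and file names are invented for illustration. The application is encapsulated into a single package, streamed down to the device in pieces, unpacked into an isolated directory, and only then run on the target device.

```python
import io
import subprocess
import sys
import tempfile
import zipfile
from pathlib import Path

def build_package() -> bytes:
    """Stand-in for the publisher: encapsulate a tiny app into one package."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as pkg:
        pkg.writestr("app.py", 'print("hello from the virtualized app")\n')
    return buf.getvalue()

def stream_chunks(package: bytes, chunk_size: int = 64):
    """Stand-in for the network: deliver the package a chunk at a time."""
    for i in range(0, len(package), chunk_size):
        yield package[i : i + chunk_size]

def run_streamed_app() -> None:
    # Reassemble the streamed chunks on the target device.
    received = b"".join(stream_chunks(build_package()))
    # Unpack into a temporary directory, providing isolation from the
    # rest of the environment, then execute the delivered application.
    with tempfile.TemporaryDirectory() as sandbox:
        with zipfile.ZipFile(io.BytesIO(received)) as pkg:
            pkg.extractall(sandbox)
        subprocess.run([sys.executable, str(Path(sandbox) / "app.py")], check=True)

if __name__ == "__main__":
    run_streamed_app()
```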
Do you agree? What's your view on this topic?