I got a note earlier this week from a former frequent contributor to these forums pointing me at an interesting site discussing a movie about the people and ideas that made APL great.
One of the things linked from that site is a web reproduction of Kenneth Iverson's 1979 paper "Notation as a Tool of Thought"; and my response to the person who sent me the reference suggested that he consider this bit from that paper:
The utility of a language as a tool of thought increases with the range of topics it can treat, but decreases with the amount of vocabulary and the complexity of grammatical rules which the user must keep in mind.
in a Java context.
Since then it's occurred to me that the core idea applies equally well to evaluating today's virtualization hype. In particular, it occurs to me that a user using a virtual PC (running on a real PC) across a virtualized network (running on a real network) to access a virtualized server (running on a real server) against virtualized storage (running on real storage) is using an incredibly complex "vocabulary" strung together with an astonishingly convoluted syntax to treat a very narrow range of topics.
In contrast, a dead-simple Solaris Sun Ray setup can treat a much broader range of topics (i.e., run more software across a wider range of uses) with greater reliability, better security, and at a fraction of the cost.
So that's really the question: if the statement is true for programming languages, why isn't it also true for systems? And if it's true for systems, isn't today's mania for virtualization just a repeat of what happened with Java - which has since "growed" beyond all recognition and utility?
Or, if you prefer a different formulation: how much more divergence from reality can your employer afford?