... is risky business, at best. After all, back in 1981, how many of us would have guessed that a twenty-something from Seattle, with a second-hand OS, would end up getting the best of IBM, let alone running the largest software company in the world? Nevertheless, I am beginning to wonder if Vista, with all of its promise, might just be an indication of 'the beginning of the end' for Microsoft. Let me explain ...
I've been reading plenty of mixed reviews about Vista here at ZDNet, but it was a piece at eWEEK that got my attention. (See Vista's Moment of Truth.) The author was quoting Brad Wardell, the president and CEO of Stardock, a software vendor in Plymouth, Michigan, who has been looking at Vista betas for a while now. He anticipates further delays if the upcoming Beta 2 does not address certain 'problems'. Among them:
Memory ceilings and handles
Wardell said he has two primary issues with Vista: its memory use and the way it deals with "handles," a type of computing resource that various programs, such as e-mail and desktop search, use.
As for the memory issue, Wardell said it's becoming increasingly difficult to add memory to boost performance. "We are now bumping up against the 2GB limit," said Wardell, adding that if Vista needs more than that to operate at a high level, there will be problems. As for the 2GB reference, Wardell noted that while 32-bit processors can access 4GB of memory per process in theory, the upper 2GB are reserved.
"Windows Vista uses considerably more memory than Windows XP—about twice as much—and there is not much reason to think this amount will significantly change by release. Realistically, until 64-bit machines become the norm, the 2GB limit is going to be a problem," Wardell said.
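The arithmetic behind that 2GB figure is worth spelling out. A rough sketch (the even user/kernel split is the Windows default; the /3GB boot option can shift it to 3GB/1GB, but with compatibility caveats of its own):

```python
# Back-of-the-envelope arithmetic for the 32-bit limit Wardell describes.
# A 32-bit pointer can address 2**32 bytes of virtual memory; by default,
# Windows reserves the upper half of each process's address space for the
# kernel, leaving roughly 2 GiB for user-mode code and data.
ADDRESS_BITS = 32
GIB = 2 ** 30

total_va = 2 ** ADDRESS_BITS   # 4 GiB of addressable virtual memory per process
user_va = total_va // 2        # default user/kernel split leaves 2 GiB for the app

print(f"Total virtual address space: {total_va // GIB} GiB")   # 4 GiB
print(f"User-mode portion (default): {user_va // GIB} GiB")    # 2 GiB
```

Note that this is a per-process virtual address space ceiling, not a limit on how much physical RAM the machine can hold; adding RAM beyond it helps the system overall but cannot give any single 32-bit process more room.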
On the face of it, you might say 'So what?' but the 'programmer' in me tells me something different. From as far back as Windows NT 4 (the first truly stable and fully compatible version of Windows), adding more memory always boosted performance -- and always gave a sluggish processor enough headroom to keep the user productive.
Mr. Wardell is suggesting that Vista is now bumping up against the limits of the underlying 32-bit architecture -- first introduced with the Intel 80386 family of processors. Well, it was going to happen sooner or later, and now that mid-range workstations are available with 64-bit processors, it doesn't really matter what Vista's requirements are. Right? Well, maybe ... but I'm not so sure.
From past experience, I know that Windows XP is only barely functional at its recommended minimum of 300MHz and 128MB of RAM, so I'd expect the same of Vista at 800MHz and 512MB of RAM. (Strictly speaking, the recommended minimum for Vista is 1GHz and 1GB of RAM.) I now run Windows XP with 512MB to get the best performance -- four times Microsoft's recommended minimum. By that same ratio, I will need at least 2GB just to get the kind of performance from Vista that I get now from Windows XP at 512MB. If that holds, even my one-year-old 3.4GHz workstation could be sluggish with Vista.
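That extrapolation is simple proportional reasoning, sketched below. The 4x multiplier is my own rule of thumb from living with XP, not a Microsoft figure or a benchmark:

```python
# Rule-of-thumb extrapolation: XP only performs well at roughly 4x its
# recommended-minimum RAM, so assume the same multiplier applies to Vista.
# The multiplier is an estimate from experience, not a measured figure.
MIB = 2 ** 20

xp_recommended_min = 128 * MIB   # Microsoft's stated minimum for XP
xp_comfortable = 512 * MIB       # what it actually takes, in my experience
multiplier = xp_comfortable // xp_recommended_min   # 4x in practice

vista_recommended_min = 512 * MIB   # the lower of the figures quoted above
vista_comfortable = vista_recommended_min * multiplier

print(f"Estimated comfortable Vista RAM: {vista_comfortable // MIB} MiB")  # 2048 MiB
```

Which lands exactly on the 2GB user-mode ceiling of a 32-bit process -- the crux of Wardell's complaint.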
Microsoft already knows that enterprise customers are not going to flock to Vista until after the first Service Pack appears, but what happens if consumers discover that they cannot add enough memory to their existing computers to run Vista? Microsoft hopes they will buy new machines, but unless Vista offers them a considerably better experience, why should they? Will OEMs find themselves having to sell Windows XP on their entry-level machines and relegating Vista to their premium machines? If so, it could spell disaster for Microsoft.
Could the memory limits of the 32-bit x86 architecture turn out to be a show-stopper for Vista? Probably not -- but if Vista does indeed miss the mark, it seems to me that it may be a symptom of a much larger problem at Microsoft.
Like most human endeavors, software development is cyclical. Most software vendors have one or two hugely successful products. Those products compete effectively through several hardware life-cycles, but sooner or later, their code base becomes bloated and unwieldy. All it takes is a small paradigm shift for that bloated code to become unmanageable. At that point software vendors have the choice of spending vast amounts of money to rewrite their code base from the ground up to comply with the new paradigm, or they can keep the code base they've got and just patch it -- for far less money.
Invariably, they choose the latter. It happened to VisiCalc and to WordStar. It happened to Lotus 1-2-3 and it happened to WordPerfect. Yes, Lotus and WordPerfect are still around, but both were eventually bought out -- and in terms of market share, both are now shadows of their former selves.
Could this be happening to Windows? The signs are certainly there. Article after article talks about how wonderful Vista will be -- all the while bemoaning how much memory it requires, or how fast the processor must be to get acceptable performance.
If Windows Vista were to falter (still a long shot), what OS would fill the void? Despite the protests of many, I don't see any flavor of UNIX or Linux on the horizon robust enough to meet the needs of the average consumer. And, while Mac OS X is fully capable of filling that void, Apple is not in a position to move into the commodity hardware market and stay profitable. There are those in the thin client and web-services markets who envision themselves as players in such a 'new world order', but that discussion is for another post.
No matter what happens, Microsoft needs to watch its back -- and re-evaluate its bloated code base -- sooner rather than later. Or one of these days, a future release of Windows will face a serious challenge from a more nimble competitor with a great idea!