Microsoft knew this day was coming. This was the reason it desperately wanted -- no, needed -- to take down Netscape in 1996. Netscape wasn't just trying to build a program for reading text and photos across a network of connected computers. Netscape was trying to build a new platform - the ultimate platform - to run software and share information instantly and on a global scale. And no one understood that better than Bill Gates.
Gates had recognized a similar shift a little over a decade earlier when he first saw Steve Jobs' Apple Macintosh and its graphical user interface. Gates knew it would make his text-based operating system, DOS, irrelevant. So he created Windows and eventually stole Jobs' thunder.
It took Gates slightly longer to pick up on the power of the Web, but once he did he immediately grasped its potential to make Windows irrelevant. That's why he galvanized Microsoft to create Internet Explorer and drive Netscape into oblivion, by any means necessary. By 2000, Microsoft had pulled off the great reversal, capturing roughly 80% of the Web browser market, the same share Netscape had held just four years earlier.
All of this was based on the idea that the Web browser would become the universal computing platform. But it didn't happen overnight. It didn't happen in 1996. It didn't happen in 2000. It didn't even happen in 2007 - the year Windows Vista arrived while the tech world was fixated on Web 2.0 and "cloud computing."
There are a lot of reasons for the failure of Windows Vista, but in retrospect the biggest is that the OS simply didn't matter that much anymore. Most consumers ended up with Vista only because it came preinstalled on a new computer; the vast majority never actively chose it.
The group that did have a choice with Vista was businesses, and they chose to avoid it, although not because of any inherent inferiority in Vista. Vista has been perfectly usable since Service Pack 1 arrived and vendors finally updated their software and drivers to work with it in early 2008. The problem was that there was never a compelling reason to upgrade. It was the software equivalent of repainting a room and rearranging the furniture.
Now we have lots of techies singing the praises of Vista's successor, Windows 7, which will be released later this year. I just finished testing Windows 7 for two months, using it every day as my primary production machine at the office. I installed it on a high-powered 64-bit Hewlett-Packard desktop, loaded all my apps on it, and it worked fine. However, my conclusion on Windows 7 was, "So what?" There's nothing in Windows 7 that matters. In fact, the computer operating system has never mattered less than it does today.
As some commentators have suggested, plenty of IT departments may adopt Windows 7, but if they do it will be out of annoyance and necessity (if Microsoft finally phases out Windows XP), not out of a desire to benefit from any new advances in Windows 7. There are none.
It didn't use to be this way. Installing a new operating system used to be like getting a whole new computer. Installing Windows 95 over Windows 3.1? That was a huge improvement, bringing 32-bit applications and long filenames. Installing Windows 2000 on top of Windows 95? That was a big leap forward, with the stability of the NT kernel. There were real reasons to upgrade back then.
Part of what's going on here is that the computer operating system has achieved a level of maturity and efficiency. You could even say that work on the OS has reached a point of diminishing returns. How much more efficiency can we wring out of it? What other major innovations are waiting out there?
Some claim that touch-based interfaces are the next major leap forward for the OS. I would argue that touch will have limited, specific uses, mostly in scenarios involving short bursts of activity rather than prolonged work or data entry.
It's possible that a combination of voice and touch could revolutionize the user interface (and thus the OS), or that another major innovation could make it faster and simpler for humans to work with computers, but for now the keyboard and mouse are as efficient as it gets. And, as a result, the computer OS has stagnated.
And, of course, the other thing that's going on is that the Web browser is finally usurping the OS as the universal platform that was envisioned back in the mid-1990s. Please note that I'm not talking about cloud computing or software-as-a-service (SaaS). While applications and services delivered over the Internet are certainly part of the Web browser's ascendancy, they still have not reached critical mass in the business world, and the trend is bigger than that.
What we're seeing is that many businesses are using the Web browser as the front-end application for accessing private back-end systems, from databases to CRM to ERP to payroll to corporate portals. And why not? Since most users are familiar and comfortable with Web navigation and Web forms, these corporate systems can tap into that experience to offer applications with a gentler learning curve than Windows-based business apps and their unique menus and interfaces.
If you combine that with the fact that many users now keep their personal e-mail and files in Web-based systems such as Yahoo Mail and Google Docs, you have a situation in which the average user spends most of her computer time in a Web browser.
That's why tabs have become a standard feature in all of the major Web browsers: users now keep multiple Web sites open in much the same way they once kept multiple applications open in the operating system.
Today, when I go to a new system or reinstall the operating system on an existing system, the first two things I do are to install Firefox and then Xmarks (formerly Foxmarks), which syncs all of my bookmarks. Those bookmarks include links to all the Web-based applications and tools that I use at work. Once that's done, I can do 80% of my work without installing another application. And I can do those two steps on Mac OS X or Linux or Windows XP or Windows 7. It doesn't matter.
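As a rough illustration of how portable that two-step setup is, here's a sketch of a shell script that prints the appropriate Firefox install command for whatever OS it runs on (the package-manager commands shown are assumptions and will vary by distribution and setup; Xmarks itself is installed afterward as an add-on inside the browser):

```shell
#!/bin/sh
# Sketch: step one of the setup is the same everywhere, only the
# install command differs. These commands are assumptions; adjust
# for your own platform and package manager.
firefox_install_cmd() {
  case "$(uname -s)" in
    Linux)
      echo "sudo apt-get install firefox"      # Debian/Ubuntu-style systems
      ;;
    Darwin)
      echo "brew install --cask firefox"       # Mac OS X with Homebrew
      ;;
    *)
      echo "download the Firefox installer from mozilla.com"  # Windows, etc.
      ;;
  esac
}

firefox_install_cmd
```

Once the browser and the bookmark-sync add-on are in place, the rest of the work lives in Web apps, so the same two steps get you to that 80% mark on any OS.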
Now, I'm not saying that the OS never matters anymore. Clearly it still matters for netbooks, which didn't take off until they started offering Windows XP as an installation option - but that's because users are much more comfortable with XP than with the unfamiliar Linux interfaces that came on early netbooks. Since netbooks are mostly about Web browsing and e-mail, you could see Google Android become a popular netbook platform, especially if it's super-simple and has a lower price tag.
The other place where the OS still matters is the smartphone, but the smartphone is at the stage of development and adoption that the PC was at two decades ago - although its evolution is going to accelerate much faster.
Platforms such as the iPhone and Palm's forthcoming webOS have shown that there's still a lot of room for OS innovation in the smartphone market. But the biggest benefit of both those platforms is a better, more standard Web experience. As more smartphones adopt the same approach, the distinctiveness and importance of the smartphone OS will naturally diminish. The most important thing will be that a user can access Outlook, Gmail, Twitter, and online communities on a smartphone with the same ease as on a PC.
Twenty years ago, we thought the computer was the revolution, but it wasn't. The advent of the Internet - and the Web browser as one of the ways to harness it - has shown us that the revolution is actually in communications and the dissemination of information. The computer will be to the Information Revolution as the assembly line was to the Industrial Revolution. It will simply be one of the catalysts that helped make it happen.
In the same way, the computer OS simply doesn't mean as much as it once did, or at least as much as we once thought it did. But, then again, all of us (including Bill Gates) knew this day was coming.