Windows celebrated two birthdays this week. Windows XP was a decade old on October 25, and Windows 7 marked its second birthday on October 22.
Both operating systems have been insanely popular. In computing terms, XP is downright ancient, and yet it still accounts for roughly half the installed base of Windows users worldwide.
Meanwhile, Windows 7 is selling briskly. It’s earned overwhelmingly positive reviews, and the massive Windows user base is slowly but surely embracing it and moving inexorably away from XP.
Those two products represent high points for the Windows family, but there were plenty of low points in between. In fact, an unvarnished history of Windows over the last decade turns up its fair share of failures and big mistakes.
As a longtime Microsoft-watcher, I’m as fascinated by the company’s missteps as I am by its successes. Anyone who worked at Microsoft in the first decade of the 21st century knows the impact those wrong turns had on the company and its culture. How the company responded to those mistakes left an indelible mark on products that are on the market today and those that are planned for the future.
For this list, I deliberately ignored everything that happened before the public launch of Windows XP. That means, thankfully, we don’t have to rehash Microsoft Bob or Windows Me, nor do we have to go through the long and painful antitrust trial that ended earlier in 2001.
But that still leaves plenty of history. The ten case studies I've gathered here represent a mix of security gaffes, bad business decisions, and user experience failures.
They say every mistake is a teachable moment. So what has Microsoft learned from its miscues over the past decade?
Throughout the 1990s, Windows users had already been targeted by a multitude of viruses, many of which attached themselves to Microsoft Office documents. By 2001, the concept of a worm that could spread over networks was already well known.
Wisely, the designers of Windows XP included a firewall to protect users from network-based attacks. And then, in one of the great mysteries of our time, they decided to ship XP with Internet Connection Firewall turned off.
You can imagine what happened next. I remember it vividly. On August 11, 2003, Windows XP computers worldwide began shutting down. When restarted, they displayed an RPC error message and went into an endless reboot loop.
That was the Blaster worm at work. Microsoft issued a rare (for the time) and extremely detailed security bulletin describing the symptoms and cleanup steps:
On August 11, 2003, Microsoft began investigating a worm that was reported by Microsoft Product Support Services (PSS). ... Generally known as "Blaster," this new worm exploits the vulnerability that was addressed by Microsoft Security Bulletin MS03-026 (823980) to spread itself over networks by using open Remote Procedure Call (RPC) ports on computers that are running any of the products that are listed at the beginning of this article.
The Blaster worm, like the Code Red attacks of 2001, was aimed at code that was written before Microsoft got serious about security.
Months after XP shipped, in January 2002, Bill Gates wrote his now-famous Trustworthy Computing memo, which included this across-the-board order:
In the past, we've made our software and services more compelling for users by adding new features and functionality, and by making our platform richly extensible. We've done a terrific job at that, but all those great features won't matter unless customers trust our software.
So now, when we face a choice between adding features and resolving security issues, we need to choose security. Our products should emphasize security right out of the box, and we must constantly refine and improve that security as threats evolve.
The memo was met with scorn in some quarters. Wired characterized it as "no more than a public relations stunt" and CNET talked to a security expert who called it "a PR blitz, pure and simple." But the memo turned out to be a genuine catalyst, kicking off a retraining and reengineering effort that wasn't fully engaged until the end of 2004 and didn't begin to bear fruit for several more years.
In fact, one could argue that the emphasis on security caused some overreaction. (See UAC, a few years later.)