Before there was the iPad, there was the Tablet PC.
Bill Gates proudly introduced Windows XP Tablet PC Edition (a variant of Windows XP Professional) in 2002, and the operating system got a major update in 2005. Its features were rolled into Windows Vista in 2006, and the entire pen-and-touch input system was refined impressively in Windows 7 in 2009.
And then the iPad came out and made Tablet PCs look like something from a prehistoric time.
What went wrong? If you look closely enough, three problems emerge.
First, the hardware available in the early 2000s simply wasn't good enough to make the tablet experience fun or interesting. Tablets were heavy and hard to hold, and they didn't have enough battery life to get through a working day without being recharged.
Second, these alternative modes of input were considered features rather than the primary mode of interacting with a Tablet PC. Although a few brave OEMs tried to introduce slate designs, the most common tablet configuration was a convertible PC, which functioned as a conventional notebook most of the time and switched into tablet mode as needed. The result was a system that didn't do either task particularly well.
Finally, the biggest problem was a lack of developer support. Even tablet enthusiasts had a hard time finding apps that really took advantage of pen and touch input.
And so the entire Windows Tablet PC category was relegated to niche status, selling a microscopic number of units. Within a few months of the iPad's release, Apple had sold more iPads than Microsoft had sold Tablet PCs in the preceding eight years.
There's no question that Microsoft learned some painful lessons from the Tablet PC failure. There's also no question that its Tablet PC experience has given it a good head start—at least in technology terms—when it comes to Windows 8. For 2012, its challenge is to prove it can deliver a tablet that people will love. That's a tall order.
Photo: Michael Walsh, The Acer Guy
After XP shipped in 2001, Microsoft got right to work on the next release of Windows. It was an ambitious undertaking. Then-Windows boss Jim Allchin had a long list of groundbreaking features that would go into the upgrade, which was code-named Longhorn.
Paul Thurrott covered the Longhorn project extensively in those early days, putting together a detailed FAQ, multiple screenshot galleries, and extensive coverage of the many times Microsoft excitedly showed off new Longhorn features to developers and partners.
For Longhorn, the high point was the 2003 Professional Developers Conference (PDC), where Microsoft showed off everything it had done so far and whipped developers into a frenzy over what they could do with Avalon and Indigo and WinFS (Future Storage) and Next Generation Secure Computing Base, aka Palladium.
And then the wheels fell off.
In January 2004, Allchin sent an e-mail to Gates and Ballmer admitting failure:
I must tell you everything in my soul tells me that we should do what I called plan (b) yesterday. We need a simple fast storage system. LH (Longhorn) is a pig and I don't see any solution to this problem.
It took a few months, but by August the die had been cast, and the infamous "Longhorn reset" happened. A 2005 Wall Street Journal article has the ugly details:
Microsoft would have to throw out years of computer code in Longhorn and start out with a fresh base. It would set up computers to automatically reject bug-laden code. The new Longhorn would have to be simple. It would leave bells and whistles for later -- including Mr. Gates's WinFS ...
On Aug. 27, 2004, Microsoft said it would ship Longhorn in the second half of 2006 -- at least a year late -- and that Mr. Gates's WinFS advance wouldn't be part of the system. The day before in Microsoft's auditorium, Mr. Allchin had announced to hundreds of Windows engineers that they would "reset" Longhorn using a clean base of code that had been developed for a version of Windows on corporate server computers.
Nearly three years of work went down the drain, and a demoralized development team had to kick into high gear to turn out Windows Vista two years later. It's no wonder that Vista, despite its excellent foundational work, was a mess when it shipped.
Screenshot credit: Paul Thurrott
One of the great failings of Windows XP was a default security model that gave the primary user account full administrative powers over the operating system.
In its documentation for IT professionals, Microsoft recommended that administrators configure standard accounts for users, to limit the amount of damage they could do if they were tricked into installing a malicious piece of software. But many Windows programs were written under the assumption that the user had full administrative privileges and wouldn't run under a standard user account.
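To make that concrete, here is a minimal C sketch (not from any particular product; the ContosoApp key name is hypothetical) of the kind of XP-era pattern that assumed administrative rights. Writing settings to the machine-wide HKEY_LOCAL_MACHINE registry hive works fine for an administrator but fails for a standard user, which is exactly how such programs broke:

```c
#include <windows.h>
#include <stdio.h>

/* Hypothetical XP-era pattern: storing per-app settings in the
   machine-wide HKEY_LOCAL_MACHINE hive, which standard users cannot
   write to. Link against advapi32.lib. */
int main(void)
{
    HKEY hKey;
    LONG rc = RegCreateKeyExW(
        HKEY_LOCAL_MACHINE,          /* machine-wide hive; writable only by admins */
        L"SOFTWARE\\ContosoApp",     /* hypothetical vendor key */
        0, NULL, REG_OPTION_NON_VOLATILE,
        KEY_WRITE, NULL, &hKey, NULL);

    if (rc == ERROR_SUCCESS) {
        printf("Write succeeded: running with administrative rights.\n");
        RegCloseKey(hKey);
    } else {
        /* Under a standard user account this is ERROR_ACCESS_DENIED (5). */
        printf("Write failed with error %ld: no admin rights.\n", rc);
    }
    return 0;
}
```

An app built this way simply didn't work on a standard account, so IT departments handed out administrator accounts instead, and the cycle reinforced itself.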
So, for Windows Vista, Microsoft decided to get serious about tightening the screws on user account permissions. In the process, it went too far, alienating users and creating the single most mocked, misunderstood, and despised Vista feature of all: User Account Control.
During the darkest days of the Vista era, I wrote a lot of posts about UAC, including one extremely popular set of instructions for taming UAC. That post included this succinct description:
The biggest misconception I hear about UAC is that it’s just another silly “Are you sure?” dialog box that users will quickly learn to ignore. That’s only one small part of the overall UAC system. The point of UAC is to allow you to run as a standard user, something that is nearly impossible in Windows XP and earlier Windows versions. In fact, with UAC enabled (the default setting) every user account in Windows Vista runs as a standard user. When you try to do something that requires administrative privileges, you see a UAC consent dialog box. If you’re an administrator, you simply have to click Continue when prompted. If you’re running as a standard user, you have to provide the user name and password of a member of the Administrators group.
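One detail worth spelling out: with UAC on, even an administrator's programs start with a filtered, standard-user token and only get the full token after the consent prompt. As a rough sketch (my illustration, not code from the original post), a Vista-era program can ask Windows whether its own token is currently elevated:

```c
#include <windows.h>
#include <stdio.h>

/* Sketch: query whether the current process token is elevated.
   The TokenElevation information class was introduced with UAC
   in Windows Vista. */
static BOOL IsProcessElevated(void)
{
    HANDLE hToken = NULL;
    TOKEN_ELEVATION elevation = { 0 };
    DWORD cbReturned = 0;
    BOOL fElevated = FALSE;

    if (OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &hToken)) {
        if (GetTokenInformation(hToken, TokenElevation, &elevation,
                                sizeof(elevation), &cbReturned)) {
            fElevated = !!elevation.TokenIsElevated;
        }
        CloseHandle(hToken);
    }
    return fElevated;
}

int main(void)
{
    /* Without elevation this reports "not elevated" even for an
       administrator, because UAC hands out the filtered token first. */
    printf("Process is %selevated.\n", IsProcessElevated() ? "" : "not ");
    return 0;
}
```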
What went wrong? For starters, there were far too many consent prompts, some of them arriving in a cascade for what should have been a simple task.
And it didn't help when a Microsoft executive publicly and proudly admitted that the point of the feature was to "annoy users." David Cross, a product unit manager at Microsoft, made that admission in a speech at a security conference:
"The reason we put UAC into the [Vista] platform was to annoy users — I'm serious," said Cross, speaking at the RSA Conference in San Francisco on Thursday. "Most users had administrator privileges on previous Windows systems and most applications needed administrator privileges to install or run."
Cross claimed that annoying users had been part of a Microsoft strategy to force independent software vendors (ISVs) to make their code more secure, as insecure code would trigger a prompt, discouraging users from executing the code.
That might have been literally true, but the subtlety was lost on exasperated Vista users, who felt personally offended at being used as human targets in a sniping war with third-party software developers.
Microsoft toned down UAC dramatically in Windows Vista Service Pack 1 and gave it a complete overhaul in Windows 7. And the bad publicity did indeed shame the most egregious software offenders into cleaning up their act. But the damage was done. Today, UAC may be far less annoying, but its reputation has never fully recovered. Microsoft learned a key lesson: features with this much disruptive potential need to be designed carefully from Day 1.