
The Pentium at 10

The Pentium will soon be ten. It's been a decade of design improvements seasoned with a couple of dead ends.
Written by Rupert Goodwins, Contributor
COMMENTARY--They're getting the trifle and party hats ready at Intel; the marketing department may even be providing some clowns.

The Pentium will soon be ten. Announced in March 1993 and shipped in June, it's made Intel a brand synonymous with home computing.

Not that today's Pentium 4 is anything like the original hardware. The Pentium's decade has been one of design improvements seasoned with a couple of dead ends. Intel shouldn't forget the Pentium Pro, for example, which ran new software well and old software very badly: a mistake that may yet be echoed in the Itanium. Yet although a Pentium today can run software a couple of hundred times faster than the original part, from a software point of view many of the core ideas remain the same. One thing Intel got right is the sort of security feature that could have made our computing lives much less frustrating and more productive -- if only the software had had the gumption to use it.

One of the curiosities of digital computers is that at a very deep level, data and programs are indistinguishable. A pattern of ones and zeros might be a picture of a cute kitten if stored as a graphics file; the same pattern has an entirely separate life if you tell your processor to decode it as a program. Virus writers and worm wranglers love this dichotomy: if they can sneak something evil past you because it looks harmless, they've got you.
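The duality is easy to demonstrate. Here is a toy Python sketch (the byte values are arbitrary, chosen only for illustration) showing one pattern of bits leading three different lives depending on how it is interpreted:

```python
import struct

# The same four bytes -- their meaning depends entirely on interpretation.
pattern = bytes([0x48, 0x69, 0x21, 0x00])

as_text = pattern[:3].decode("ascii")        # read as ASCII characters
as_number = struct.unpack("<I", pattern)[0]  # read as a little-endian 32-bit integer
as_pixels = list(pattern)                    # read as grey levels in an image row

print(as_text)    # a tiny string
print(as_number)  # one large integer
print(as_pixels)  # four pixel values
```

A real processor makes the same non-judgment: point its instruction pointer at those bytes and it will try to execute them as opcodes, which is exactly the property malware authors exploit.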

Mainframe designers have known this since the 60s. There weren't any viruses or worms back then, but there were professional programmers writing commercial software -- which can be just as dangerous an activity. Now and again the wrong figures got fed into a sum, or there was a slip in the design or implementation of a piece of software. The results were varied, but sometimes it meant that a chunk of data ended up in the wrong part of memory, one where the computer was running its programs. The computer dutifully tried to execute Mrs. Bloggs' gas bill, and collapsed in a twitching mass of flashing lights.

Mainframe makers, sensitive to the needs of the huge corporations and their multimillion dollar deals, worried a lot about such things. These were the days when computers weren't supposed to crash. One of the solutions they came up with was protected memory, where the parts of the computer that did the processing would only run code from memory that was marked as being allowed to hold executable programs. That area would not accept new data during normal operations: if an errant program tried to overwrite it, the computer was safely diverted to a piece of software designed to cope with the problem. Also, areas of memory were marked as safe for data -- but only if written within certain boundaries. Any attempt by bad software to write data outside those boundaries was also safely denied.
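The hardware details varied by machine, but the effect of such bounds checking is easy to mimic in software. The following Python sketch is an analogy only -- the function and handler names are invented for illustration, not taken from any mainframe design:

```python
def handle_fault(offset: int) -> None:
    # Stand-in for the operating system's fault handler: the errant
    # write is reported and diverted rather than corrupting memory.
    print(f"write to offset {offset} denied")

def guarded_write(buf: bytearray, offset: int, value: int) -> bool:
    """Write value into buf only if the offset is within its boundaries.

    An out-of-bounds write is diverted to the handler instead of landing
    in whatever happens to live next door -- the software analogue of a
    protected-memory trap.
    """
    if 0 <= offset < len(buf):
        buf[offset] = value
        return True
    handle_fault(offset)
    return False

data = bytearray(16)           # a region marked safe for data
guarded_write(data, 8, 0xFF)   # within bounds: accepted
guarded_write(data, 64, 0xFF)  # outside bounds: safely denied
```

The hardware version does this check on every memory access with no per-write software cost, which is why it took dedicated circuitry to make the idea practical.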

It took a while for these ideas to become standard--there are performance implications, and both operating system and application software design needs to fully incorporate protected memory for it to be effective. However, by the 1970s they were seen as essential for any serious computer.

Ten years later, the mainstream microprocessor caught up. With 1986's 80386, Intel added workable protected memory hardware alongside many other ideas that were adopted wholesale into the Pentium six years later. For the first time, operating system writers could create software that would run on an Intel system and behave more or less like a grown-up computer. It didn't take long to move various Unixes across, and some people confidently predicted at the 386's launch that the forthcoming MS-DOS 5.0 would adopt the new, long-trouser model of secure computing. It didn't; nor did the first few versions of Windows.

The Pentium's vastly improved performance only increased expectations, but in vain. Even by 1996, Microsoft's flagship consumer operating system lacked bullet-proof memory protection. A deep desire for compatibility with previous application software at all costs, and for performance before security, kept things back -- although as the de facto monopoly the company could easily have forced a greater rate of change. It certainly didn't mind inflicting other unpleasantness on its users, as was apparent to anyone who noticed that you used Windows 95's Start button to stop the software.

Things are finally changing. Microsoft's .Net language, C#, follows the lead of Java and knows how to partition programs and data so that they can be guaranteed not to mess with each other. Server 2003 comes locked down tighter than a Silicon Valley executive's expense account, and isn't scared of showing badly-behaved legacy applications the door. XP has successfully merged some of the security ideas introduced with Windows 2000 with the untrammeled madness of the Windows 9x desktop.

If only the application software showed a similar bout of maturity. That version of Outlook server you're using, for example, is one of the messy, rule-breaking applications that won't run on Server 2003: it remains unequal to the task of fending off the spam, virus- and worm-infected torrent from the great outdoors.

Microsoft's response has been to point to the bright new future of NGSCB, the technology formerly known as Palladium, which will shut down anything that isn't approved from on high. Well, let's see.

It would be nice to say that, ten years on, the Pentium's solid security model backed by constantly increased performance has been equaled by the software that runs on it, but most of those ten years have been marked by wasted opportunities and questionable decisions. Send in those clowns.

Rupert Goodwins writes for ZDNet UK.
