Thinking about the computer industry from a 20-year perspective makes my head hurt, a lot. I can still wax nostalgic, but it takes more than three cups of coffee on a Sunday evening to pry the memories from the inner recesses of my 41-year-old brain.
For the technology industry, particularly as it relates to information technology and personal computing, 1991 was a year of transitions. By most accounts, nothing particularly important happened in 1991 per se -- no single event you could point to as a watershed moment that we are still living with today.
(Okay, the Web was first turned on by Tim Berners-Lee at CERN in 1991. But literally, there was nothing on it at the time. And yes, Linus started work on Linux.)
However, the year that preceded it, 1990, and the year that followed, 1992, are particularly notable: Windows 3.0 was released the year before, and Windows 3.1 would be released the year after.
Still, the transition itself -- to borrow the title of EMF's 1991 hit song -- was "Unbelievable," in the sense that a storm was brewing that would eventually change everything.
That storm was Microsoft and the GUI, and it would eventually bring us to the computing model we are using today. If you believe the Steve Jobs iPad snake oil, the era that began around 1991 -- the golden age of PCs and Microsoft Windows -- is the one that is now coming to an end.
You'll have to excuse me while I stare up in the sky with my reptilian eyes and look for the giant Apple-shaped fireball heading this way.
During summer jobs, I also worked with small minicomputers like the DEC PDP-11 and the multiuser XENIX-based Altos 386. When I was in school, I was even able to get my hands on powerful graphical workstations like the Apollo and the NeXT, which was way ahead of its time.
But now it was time to hunker down and become an adult.
One of the first jobs I had was being a bench tech for a large retail consumer electronics and computer store in Yonkers, New York. People would buy computers and need software installed on them. We also repaired systems.
So this exposed me to an awful lot of stuff out in the wild.
This was the same year CompUSA started selling computers in retail, and it had just begun to displace ComputerLand, the 1980s-era retail computer chain.
It really strains my brain when I think about the systems that were typical of that day. We were still on the Intel architecture, as we are today, but the state-of-the-art chip generation at the time was the i486 (80486). Back then, CPU clock speeds were still measured in megahertz (MHz).

So the fastest PC at the end of 1991, a 50MHz 80486DX, could execute about 40 million instructions per second, with a peak Dhrystone output of 50 MIPS. It had about a million transistors on the die, which was a huge achievement for the time.
That was on the very high end of the PC scale.
If they were up to date, companies were running 386-based systems at 33MHz, 25MHz, or less. And by and large, most places were still using 8088-based machines running at under 10MHz, like the original IBM PC and XT, or 286-based IBM PC/ATs. If you were doing engineering and CAD work, maybe you bought a 486.
A friend and colleague who was at NASA at the time told me they were just starting to replace the original PCs in their mail room with 12MHz 286 systems in 1991.
You'll notice, if you read the various ads in the back of those InfoWorld issues, that there were a ton of pricing games going on among the "white box" second- and third-tier vendors trying to squeeze out margins. Today, none of that mishegas really goes on: PCs are heavily commoditized consumer products, and there's very little room for pricing games even with business-class systems.
Compare these systems with today's fastest Intel Core i7 desktop chip, clocked at 3.4GHz, with 731 million transistors and about 159,000 MIPS of throughput. The 50MHz 486DX was over 700 times less dense, transistor-wise, and over 3,000 times less powerful than what you can put on your desktop today.
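If you want to check that arithmetic yourself, here's a quick back-of-the-envelope calculation using the transistor counts and MIPS figures quoted above:

```python
# Back-of-the-envelope comparison of the 1991 486DX-50 and a
# contemporary Core i7, using the figures quoted in the text.
i486_transistors = 1_000_000       # roughly 1 million transistors
i486_mips = 50                     # peak Dhrystone MIPS

core_i7_transistors = 731_000_000  # 731 million transistors
core_i7_mips = 159_000             # roughly 159,000 MIPS

density_ratio = core_i7_transistors / i486_transistors
speed_ratio = core_i7_mips / i486_mips

print(f"Transistor density ratio: {density_ratio:.0f}x")  # 731x
print(f"Raw throughput ratio: {speed_ratio:.0f}x")        # 3180x
```

Which is where the "over 700 times" and "over 3,000 times" figures come from.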
And the amount of memory and hard drive space on these things? Well, if you wanted to run Windows 3.0 halfway decently, you needed at least a meg of RAM. That would be a megabyte, yes. And if you really wanted to make it fly like a rocket ship and run a ton of apps -- say, Word, Excel, and PowerPoint at the same time -- you'd need something like 4 megs of RAM.
Typical IDE hard drives of the day had 20MB, 40MB, or 80MB capacities. If you were a serious power user, you might have a 100MB hard drive, or even 200MB -- but then you were in SCSI territory, big bucks.
CD-ROM technology was just in its infancy, and we were using 1.2MB and 1.44MB floppy disks to distribute software. A network operating system like NetWare 3.11 might come on nearly thirty floppy disks. Windows 3.0 came on six or seven floppies, as I recall, not counting the DOS install disks.
Very few PCs had CD-ROM drives, and multimedia software was nearly non-existent on the PC platform. Impressive sound and graphics were a niche that belonged to the Macintoshes, the Amigas, and the Atari STs of the world.
Gigabytes? That's the sort of storage you'd think about an entire company having online. It was unfathomable, from the perspective of a 1991 PC user. Enterprise minicomputers and UNIX boxes had hundreds of megs, maybe gigs of data, on SCSI-2 hard drives.
To put this in perspective, the typical Android, BlackBerry, or iPhone smartphone somebody carries today is at least 40 or 50 times more powerful than the most decked-out PC sold in 1991. My 600MHz first-generation Motorola Droid, with 16GB of internal storage -- considered about two years behind the times in smartphone horsepower -- would have been the stuff of science fiction in 1991.
Heck, if you had somehow been able to pass through a wormhole and show me, the 21-year-old Jason, an iPad, I would have stared at it mouth agape, thinking it was a stage prop from Star Trek: The Next Generation (which, for those of you keeping track, was in its fourth season at the time).
A "portable" computer of the period, something like the Compaq LTE 386, weighed six pounds, had an i386 CPU, a 9.5-inch display (smaller than the iPad's), and VGA graphics (640x480), carried a price of around $4,000, and looked something like this:
Compaq LTE 386, Circa 1989-1991
And that was state of the art at the time. Seriously. OK, enough of that. Suffice it to say we got by in 1991 with a whole lot less than we have now, and it cost a whole lot more.
The reason I referred to 1991 earlier as a year of transition is that while we had Windows, most people using PCs considered it brand spanking new; 3.0 was the first version considered actually usable. Windows wasn't the predominant application environment. No, that honor went to MS-DOS.
While Windows 3.0 -- arguably the first mass-market release of the product -- was being sold aggressively and starting to gain popularity (Microsoft sold 2.75 million copies of the software in 1990), the company sold 8 million DOS 5.0 licenses the following year.
In 1991, it was still a DOS world. And everyone knew it.
Excel? Nobody used that for serious spreadsheet work on a PC; it had a measly 12 percent market share. The app of choice was Lotus 1-2-3, or maybe Quattro Pro if you felt like being different. Word for Windows? Are you joking? The leader was WordPerfect 5.1. Presentation graphics? PowerPoint? Hell no. Harvard Graphics all the way, baby.
And developing apps? Visual Studio? Microsoft C++? What planet are you on? All of that action was owned by Borland with their Turbo C++ compiler, pushing object-oriented programming big time. And as my buddy David Gewirtz will tell you, Mac users were still crowing that HyperCard was the future of programming.
Networking meant running NetWare 3.x, and IPX/SPX was the high-performance LAN protocol; TCP/IP was a serious also-ran. In 1991, Microsoft LAN Manager (and its 3Com variant) was trying to put a dent in NetWare's death grip on the PC networking business.
Network Directory services? Banyan VINES was the industry leader, if your IT people were savvy enough to understand what directory services meant.
And don't forget that in 1991 people were still using coaxial-based Ethernet; 10BASE-T was still provisional, and 16Mbps Token Ring looked like it had a shot at being the long-term performance standard for business.
As to going online and surfing the Web? Hah! I'll let Scott Raymond and Steven J. Vaughan-Nichols educate you a bit on what early online communities were like. Their recollections of the period are fascinating reading.
At the time, Windows 3.0 was a separate product entirely from DOS. It had to be, so you could exit Windows to run your stuff that wouldn't work correctly running in it. And there was a lot of stuff that wouldn't work on it, especially games.
Heck, most people didn't even own mice. You actually had to go out and buy one separately, from Microsoft or IBM, if you wanted to run Windows.
A lot of the things we take for granted in operating systems today -- such as the ability to multitask programs, run background processes, and automatically manage memory -- didn't really exist on PCs in 1991. Memory management was an arcane art. Heck, we even had to manually configure all our peripherals: mice, sound cards, serial ports, and modems.
We'd run into things called IRQ conflicts if we didn't assign unique interrupts to each device. There wasn't any plug-and-play anything. You had to load drivers in DOS and Windows for every piece of hardware installed in the machine.
And if that wasn't bad enough, we had to figure out how to squeeze each device driver into memory so we wouldn't run out of "conventional" memory in large contiguous blocks. Otherwise, certain programs just wouldn't run. Like Windows. Or Lotus. Or any number of PC games.
Originally, DOS and the IBM PC were designed to address a maximum of 640K (kilobytes) of memory. But now users were stressing the limits of what DOS could do. As I mentioned earlier, the 286, 386, and 486 machines could have several megabytes of memory installed in them.

But all that extra memory, as far as the programs that used it were concerned, came in entirely different flavors. Lotus 1-2-3 used "LIM" or Expanded memory, and Windows used Extended memory.
Memory managers -- "DOS extenders," as we loosely called them -- allowed things like the mouse driver, the sound board driver, the DOS command interpreter, the CD-ROM driver, the SCSI driver, and the network drivers to be "loaded high" into the Upper Memory Blocks, so that the maximum amount of conventional memory could be freed up.
You had to free up this memory, or you literally couldn't start the application; it would kick you out with a "Not enough memory" error.
It didn't matter if you had 16MB of RAM installed in your PC. If your conventional memory was too low, you were hosed.
Memory optimizers, such as the ones that shipped with MS-DOS and Quarterdeck's QEMM, allowed us to shuffle free space around these memory areas so that all the drivers and resident programs -- which we referred to as TSRs (Terminate and Stay Resident) -- would fit correctly and not conflict with each other.
The problem was, if you configured your PC's memory a certain way to optimize it for Windows, it wasn't necessarily what Lotus or some game you liked needed to run. So many of us ended up with half a dozen (or more) boot floppies next to our desks, each configuring memory a certain way so the programs we used would run correctly.
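For flavor, here's roughly what the CONFIG.SYS on one of those boot floppies might have looked like on an MS-DOS 5.0 machine tuned for Windows. This is a sketch from memory; the driver paths and filenames are illustrative, not from any particular machine:

```
REM Load the Extended-memory manager, then use EMM386 to map
REM Upper Memory Blocks (NOEMS = no Expanded-memory emulation)
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
REM Move DOS itself into the High Memory Area and enable UMBs
DOS=HIGH,UMB
REM "Load high" as many drivers as will fit in upper memory,
REM keeping conventional memory free for applications
DEVICEHIGH=C:\DRIVERS\MOUSE.SYS
DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:MSCD001
FILES=40
BUFFERS=20
```

A floppy tuned for Lotus or a game that needed Expanded memory would swap `NOEMS` for a setting that provided EMS instead -- which is exactly why the pile of floppies kept growing.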
That meant, in many cases, exiting the program we were using at any particular moment (let's say Windows or Lotus), rebooting the computer with a separate floppy, and then running the other program. I'm laughing my head off just trying to relate this to the things we do and take for granted today. Unbelievable.
In 1991 DOS was really beginning to show its age. It was a 16-bit OS, but the 386 and 486 PCs themselves had 32-bit architectures. 32-bit and multitasking features as we know them today would not appear in Microsoft operating systems until 1993 when Windows NT was initially released for enterprise use on high-end workstation systems, and not really for consumers until Windows 95 was released, when DOS and Windows became one and the same.
Most consumers didn't get a fully pre-emptive, protected-mode 32-bit OS using Windows NT's technology until Windows XP came out in 2001.
Now, if you were really hardcore in 1991, like myself, you might have gotten hold of one of the beta versions of OS/2 2.0. Then you were really cooking with gas.
IBM's OS/2 2.0 was special. It was a 32-bit operating system that could run in the 386 or 486 processor's protected mode. That meant it could do full pre-emptive, multi-threaded multitasking. You could download megabytes of data from your favorite bulletin board in the background while working in a full-screen DOS or Windows spreadsheet, printing from another application, and running a ray-tracer like Persistence of Vision to render cool graphics in yet another window.
It was incredible, liberating even.
You could virtualize a DOS session for each application you needed to run, and have huge amounts of conventional memory left. You could even run multiple spawned sessions of Windows 3.0, each with its own applications and configuration settings running under a "Virtual DOS Machine".
The OS/2 2.0 WorkPlace Shell (1992)
32-bit OS/2 had an object-oriented GUI, known as WorkPlace Shell, that was years and years ahead of what Microsoft ended up releasing with Windows NT, and there are parts of its technology which are still considered by some to be superior to what is running on Mac and Windows systems even today.
In all respects, it was a better Windows than Windows. The problem was, IBM didn't really know how to market it out of a paper bag, and couldn't convince enough ISVs to write native applications for it, or hardware peripheral manufacturers to write enough device drivers for it. So compared to Windows 3.0 -- and later Windows 3.1 and Windows NT -- it was a commercial flop, except among certain kinds of customers that required its advanced capabilities.
But none of that mattered to me. I became a well-known OS/2 advocate and evangelist. I did demos at software stores, and formed OS/2 user groups. I even started one of the very first web sites covering OS/2, OS2Web. For about five years, up until around 1996, I wouldn't touch Windows as my primary OS after going OS/2.
OS/2 would become a footnote in the industry as the product which lost the "OS wars". Many professionals and enthusiasts might consider the time they spent with OS/2 as wasted years, devotion to a product that failed, swearing an oath of fealty to an OS empire that would never be.
But not me. Because OS/2 was a specialized skill set, it was my entry into the big IT environments of Wall Street and corporate America over the following years, and it jump-started my career doing the exact same kind of systems integration I focus on now at IBM, 20 years later.
Your average home user could never touch, care about or understand the potential of OS/2, and wouldn't spend the money on the systems that could fully exploit it, but big banks and trading floors in downtown Manhattan definitely did.
However, there's another important footnote about OS/2 I want to share with you.
It was because of OS/2 that I was introduced to the woman that became my wife. After a brief exchange of a few emails on the old Prodigy service, she agreed to meet me in person in 1994 when I was doing an OS/2 demo at an Egghead Software store in Paramus, NJ.