The competition to make history was tough in 1981: Ronald Reagan was sworn in as president, the US-Iran hostage crisis ended, the economy slid into a recession, and scientists discovered a mysterious disease that would later be known as AIDS.
Few could have imagined that an instrument of revolution was sitting innocuously on the shelves of local Sears Roebuck stores. Twenty years later, however, it's clear that the personal computer has changed life in modern society, from simple "word processing" to the introduction of the Information Age.
"I can hardly think of an element of society that hasn't been changed by the PC," said Charles B Kreitzberg, CEO of Cognetics and an expert on the relationship between people and computers. "Corporate culture has begun to shift into a new paradigm. The World Wide Web links us all in a way never possible before. None of this could have happened in the same way without the PC."
IBM's introduction of the 5150 PC on August 12, 1981, has been viewed alternately as a stroke of brilliant technological foresight and the biggest business blunder of the 20th century. Either way, it is safe to say that the world would be a vastly different place--though not necessarily a better one--had IBM not jumped into the PC market.
From a business perspective, the significance of the move has been dramatic. The introduction of the IBM PC, a machine that sold for as little as US$1,565 with 16 kilobytes of memory, transformed the founders of little-known companies Microsoft and Intel into billionaires and made software the most lucrative product in corporate America.
And the influence of the personal computer far transcended the computer industry, profoundly altering how people interact. It turned solitary nerds into perceived geniuses, and it paved the way for widespread acceptance of the Internet, which exponentially expanded communications on a global scale.
As it became a staple of the workplace, however, the new machine also carried some ugly consequences.
It created a cubicle culture of partitions and closed spaces across the floors of office buildings, providing a new excuse to avoid face-to-face interaction in a society already heavily divided along ethnic and socioeconomic lines. Some say the PC's arguable contribution to productivity contributed to a workaholic ethic in which many professionals feel tired and unhappy.
Two decades later, the personal computer is still changing culture. More than half of all US households have a PC, enabling the Internet to evolve from a government and academic niche to a mainstream communications medium. Therein lies its lasting legacy and greatest historical significance to technology veterans and sociologists alike.
"Just as we could have rode into the sunset," Intel chairman Andy Grove said Wednesday in San Jose, California, at a 20th anniversary party for the IBM PC, "along came the Internet, and it tripled the significance of the PC... It doesn't stand alone, and it has to be connected. Its utility now comes from what it's connected to."
As the media focuses on IBM's role in the phenomenon, it is interesting to note that Big Blue did not invent the desktop computer and has never claimed that distinction.
A renegade Silicon Valley start-up introduced a desktop computer, the primitive Apple I, in 1976. RadioShack had been selling TRS-80s (derisively dubbed "Trash-80s") since 1977. Computer scientists had been working on incremental ways to reduce the size of mainframes for more than a decade, though few believed consumers would buy such devices.
IBM did not even coin the term "personal computer". Hewlett-Packard claims those bragging rights, pointing to an advertisement from October 4, 1968, in Science magazine. The ad introduced "the new Hewlett-Packard 9100A personal computer", a US$4,900 calculator that could handle magnetic cards.
Few computer aficionados were overwhelmed by IBM's first attempt at making a PC. The 1981 edition came with a 4.77MHz Intel 8088 processor, 16 kilobytes to 256 kilobytes of memory, and an operating system called DOS 1.0. IBM managers used off-the-shelf parts, an operating system from Microsoft and chips from Intel, and they marketed it through independent distributors such as Sears and ComputerLand.
At the time, people who used PCs--mainly academics and scientists conducting mathematical research or laboratory simulations--said the IBM machine was inferior to the polished Apple II or minicomputers such as the Programmed Data Processors (PDP-11 and PDP-10) and the Virtual Address eXtension (VAX) from Digital Equipment Corp (DEC).
They complained that DOS 1.0 fell far short of a smoother, faster operating system called Control Program for Microcomputers (CP/M), created by Gary Kildall, a computer science teacher at the US Naval Postgraduate School. Kildall, a friend of the young Apple founders in Cupertino, California, created CP/M in 1972, nearly a decade before the IBM PC.
David Jones, professor of computer science at McMaster University in Hamilton, Ontario, still has the IBM PC that his father-in-law, an IBM veteran, gave him in 1981. As a first-year undergraduate at the University of Western Ontario, Jones spent US$1,500 for his first PC--a CP/M machine with two floppy disk drives and 128 kilobytes of RAM--a few months before receiving his IBM gift. He overwhelmingly preferred the CP/M machine.
"To a nerd like me, a computer science undergrad student who had played with Apples and CP/Ms, the IBM was technologically very unimpressive," said Jones, who went on to receive his PhD in computer science from Stanford University and is now president of Electronic Frontier Canada. "IBM outsourced it. Many technological people recognized this. They said, 'Look, this is just a vanilla machine that's not very impressive or superior to anything else.' It didn't have a great operating system or a great deal of software."
Even IBM didn't fully believe in the little machine. Senior executives greatly underestimated demand, originally planning to sell 241,683 PCs from 1981 to 1986. Instead, IBM sold about that many in the first full year and 3 million during the first five years.
"Part of it was just being conservative--the IBM culture," said David Bradley, who joined IBM in 1975 and was one of the original 12 engineers on the PC. "And the other part of it was, who understood the PC business in 1981? You had a company named after a fruit, the Trash-80 selling to hobbyists--either this was going to be the next big thing or it was going to be nothing."
With IBM's help, it became the next big thing. Big Blue lent a legitimacy that a group of 20-somethings in Cupertino could not. With the stamp of one of America's most respected brands, backed by a powerful marketing campaign, the IBM PC became a machine that people believed would change the world. Time magazine named the computer its 1982 "Machine of the Year". In the article, Time said the PC could "send letters at the speed of light, diagnose a sick poodle, custom-tailor an insurance program in minutes, test recipes for beer".
The article cited a poll in which 80 percent of Americans said they expected home computers to be as common as television sets or dishwashers.
IBM's entry into the PC market triggered an explosion in sales. In 1980, about 25 fledgling computer companies sold 724,000 PCs for US$1.8 billion, according to industry research firm Dataquest. In 1981, when IBM joined the fray, nearly 50 companies sold 1.4 million PCs for nearly US$3 billion. In 1982, roughly 100 companies sold 2.8 million PCs for nearly US$5 billion.
"I don't think IBM's entry made the difference, but it certainly was a statement that computers for individual people made sense," said Douglas A Davis, professor of psychology and an expert on the relationship between humans and computers at Haverford College. "That was something not everybody believed at the time."
In addition to flexing its marketing muscle, IBM fueled the PC market through a business strategy that academics and business leaders still debate as a masterstroke or a miscalculation: It licensed the operating system from Microsoft and the chips from Intel, permitting the bit players to sell their products to other companies.
In doing so, Big Blue made Bill Gates and Andy Grove phenomenally rich, and it spawned an industry of PC "clones" that would soon outpace IBM itself.
Some of the first to seize on that concept were Jim Harris and Rod Canion, Texas Instruments engineers who in 1982 ambled from their Houston offices to the nearby House of Pies restaurant to discuss a business plan for selling IBM clones. They emerged with a series of drawings depicting how their PCs should look, scribbled on a paper placemat because they forgot their notebooks. Later that year they founded Compaq Computer.
The key to Compaq's success was the reverse-engineering of one of the few pieces of proprietary software in the IBM PC--the basic input/output system (BIOS). Without knowing the BIOS, clones couldn't effectively run popular software such as the Lotus 1-2-3 spreadsheet.
"It must have worked," Canion laughed, recalling the laboratory detective work. "We got our funding."
In fact, Compaq set a record for the largest first-year sales of any American business. It reached the Fortune 500 list and surpassed US$1 billion in revenue faster than any other company. Nine years after Compaq was founded, it employed more than 10,000 people and operated in 65 countries.
Hundreds of clone makers jumped into the market, stealing customers from IBM with lower prices and more attractive machines. By 1986, more people were buying clones than IBM PCs.
Clones weren't Big Blue's only problem. The company, which unveiled its first computer in 1952 and enjoyed a 70 percent market share in the '60s and '70s, was also struggling to shake off the debilitating yoke of a federal antitrust investigation. In January 1969, the government began a crusade to break it into smaller companies that would compete against one another.
From 1975 to 1980, the parties called 974 witnesses and read 104,400 pages of transcripts, according to Emerson Pugh's 1995 book "Building IBM: Shaping an Industry and Its Technology." The 13-year investigation required IBM to retain 200 attorneys. The government abandoned the effort entirely in 1982--largely because clones had eroded Big Blue's dominance--but IBM staggered out of the courtroom with a shaken confidence; it was further slowed by a rigid corporate hierarchy that pre-dated the upstarts in Houston and Silicon Valley.
Preoccupied with the antitrust fallout and anxious to counter long-standing monopoly allegations, the people who developed the IBM PC said they didn't consider the ramifications of empowering Microsoft, Intel or the clone makers. After the antitrust ordeal, IBM wanted as many companies as possible to participate in the market.
"I never thought about any of that," Bradley said Wednesday. "Maybe I should have. But frankly, our goal was to have everyone in the industry participate. From that standpoint, we got exactly what we planned."
In 1987, as its market share eroded, IBM tried to recapture sales by creating the PS/2 computer, built with a proprietary operating system called OS/2 and hardware dubbed Micro Channel Architecture. But the PC industry would not support the new technology as a standard, and by early 1995 IBM had halted PS/2 development and begun to back away from OS/2 in the consumer market.
The resulting "open architecture" environment had legions of supporters--albeit few on Wall Street. IBM's stock plummeted throughout the late 1980s and early 1990s, hitting about US$10 per share at its nadir, the summer of 1993. (It now trades at more than US$100 per share, based mainly on the strength of revenue from notebook computers, services and popular software for e-commerce companies.)
Computer scientists and other academics, who don't view the world through a purely financial lens, said open architecture was a good thing, regardless of what it did to IBM's fortunes. Had a single company dictated the technology standard to the rest of the industry, the PC would be neither as affordable nor as useful as it is today, they say.
"I frankly think they did the world a big favor," said computer engineering professor Ken Kennedy, director of the Center for High Performance Software Research at Rice University. Kennedy was co-chairman of the President's Information Technology Advisory Committee and worked on supercomputers at IBM in 1978-79.
"I'm not a fan of all of Microsoft's business practices, but they showed the world you could actually make money selling software. They changed computing from a hardware to a software business," Kennedy said. "Little companies in garages sprung up all over the world making software. Software is what gives the box its real character, and we have IBM to thank for that."
The consequences beyond the industry are equally debatable. Although PCs make many tasks easier, some say computers have raised the stakes for catastrophe and have even paved the way for disaster.
At the 20th anniversary celebration Wednesday, Grove compared the PC to the automobile and the atom bomb--forces that have proven both beneficial and lethal.
"Computers cut both ways," said Davis, the Haverford professor. "Having patient records available at the click of a mouse saves lives at a hospital with electronic medical records. At the same time, when the computer goes down, you can't get anything done at all. You may be routinely saving lives all the time here and there, then you could have a sudden, catastrophic accident where the hospital's computer system is down. That possibility didn't exist before computers."
Others say the PC, for all its so-called advances, is still a machine that elicits frustration and infuriation more often than serenity and joy.
A study released in May confirmed that technology--from PCs to pagers--tires and annoys many US workers. Two out of five employees surveyed said they had to use their PCs or related devices during nonwork hours, causing them to fantasize about different jobs that are less reliant on technology.
Even industry veterans concede that the PC has, in many cases, eroded the quality of life. As a founder of pioneering spreadsheet company Lotus Development and teacher of transcendental meditation, Mitchell Kapor is uniquely qualified to speak on all facets of the complicated subject.
"It is terrible what small businesses go through," Kapor said. "The stuff is just too hard to do by yourself. People think they're stupid because they can't get stuff to work. People scream at their computers every week."
Frederick Kohun, associate dean of the School of Communications and Information Systems at Robert Morris College, was more blunt. He said that the PC, combined with the Internet, has made people "dysfunctional."
"We've got too much information and not enough tools to sort through it," Kohun said. "Some people require lots of information to make a decision; others make great decisions on minimal information or summaries. Either way, we have way too much information right now. We can't process it all. That's what the PC did for us."