Once several million PCs moved into offices it immediately became apparent that some better means than "sneaker net" (carrying floppies between machines) needed to be found for users to be able to share access to documents.
At that time (early 1986) the choices were:
- The PC-AT could be connected to mainframe networks using an SNA board to act as a terminal and remote job entry station but there was little ability to use the mainframe as a file sharing switch for PCs.
- Running Xenix on the PC gave it access to Unix networking (by then well established in both the already declining Usenet/UUCP form and the growing internet form) using either serial or TCP/IP connections.
Adding Xenix was expensive: network boards typically cost about $1,195, Microsoft's Xenix ran to $795 for the base package, and the machine needed additional RAM at $1,125 for the 512K upgrade.
The pre-configured IBM PC-AT based multi-user system ran around $13,995.
Equivalents from companies like Altos, Tandy, DEC, and NCR were considerably cheaper and ran standard Unix, but people generally bought the PC first and only then discovered networking and multi-user issues.
- Switching to the Mac, where Appletalk and Appleshare were widely used and highly effective; or,
- Adopting Novell software on a "server" with Novell network cards and client software on the PCs.
Novell cards and software worked with PC-DOS on the standard PC-AT and so quickly gained market share, reaching 90% of the market by 1989, with IBM's Token Ring based LAN Manager taking most of the rest.
The basic Novell software worked extremely well, in part because it used a lightweight networking protocol (IPX/SPX, a marginally modified form of Xerox's XNS protocols rather than of TCP/IP), and in part because it did very little - it merely turned a server into a shared disk and printer manager accessible by users on network attached personal computers. That, however, was enough to turn Novell into a billion dollar company by mid 1990.
The response IBM and Microsoft co-developed was a networking technology known as NetBEUI (NetBIOS Extended User Interface). By extending NetBIOS - the network API originally shipped in adapter ROM to make the PC "network aware" - it laid the foundations for a PC network architecture known as SMB/NetBIOS that is still in use in many corporate Microsoft Windows installations today despite having been officially deprecated years ago.
File and print sharing through a server led to demand for more shared services - particularly shared databases and, from this, the PC version of client-server (processing on the client, data on the server), started to evolve.
At about the same time, however, there was a growing rift between Microsoft and IBM. The success of GUIs on Atari, Macintosh, and Unix computers had demonstrated their potential and, of course, both IBM and Microsoft wanted to own the inevitable PC version.
As a result Microsoft revived a project it had earlier dead-ended: a partial windowing environment modelled on a GEM application for the Tandy 2000, the only 1983/4 Intel machine with the power to run anything remotely like a GUI, in this case using a 12MHz 80186. The result, a partially graphical interface for DOS 3.1 called Windows 2.1, was issued in December of 1987 - just over three years after Windows was first announced and four years after Windows 1.0 was effectively abandoned.
At that time Microsoft also released Word and Excel for the Macintosh as well as a version of Excel for the PC combining the application binary with a limited set of run-time graphics libraries to produce a Windows-like effect when the application was launched from MS-DOS. Sales of this product were extremely limited in large part because the PCs of the time were unable to run it effectively - leading many Microsoft reviewers and evangelists to stage their Word and Excel demonstrations on Macs.
By mid 1988 Computerworld and most commentators outside the PC press agreed that IBM's forthcoming OS/2 signified the end of Unix and would own the desktop by 1990. When OS/2 release 1.1 shipped in December of 1988 it proved to be a superior product, but most of Microsoft's applications wouldn't run on it, and most of the clone makers turned out to have ironclad contracts with Microsoft preventing them from adopting it.
As a result IBM was left without a market for OS/2 outside its own loyalists in the mainframe community.
When Microsoft Windows 3.0 shipped on May 22, 1990 it turned out to be the same kind of MS-DOS application the December 1987 Release 2 had been, but with one significant difference: essentially all of the internal APIs and libraries had been changed.
As a result, work done on Windows 2.1 by leading software developers like WordPerfect, then the market leader in office automation, had to be written off and their applications rebuilt from the ground up to work with the new APIs and other components. Microsoft, however, had ported its Word and Excel products (introduced on the Mac in 1985) from the Macintosh to the PC under Windows 3.0, and so had a substantial head start that gave it dominance of these markets before competitors could get their products out.
|Enter Windows NT|
| Windows NT (New Technology) started out as a version of Windows 3.0 that did not require MS-DOS to run, and was initially known and tested as Windows 4.0. This product failed because many PC machines wouldn't boot it, although its core elements were eventually morphed into Windows CE.
When IBM's OS/2 work seemed to threaten Microsoft, the NT project was revived. Dave Cutler, a key VMS designer from DEC, was given more latitude; the product was renamed NT 3.1 to surpass IBM's pending OS/2 release 2.1; and development re-started on what was then intended as a kind of second generation VMS.
Unfortunately the fastest Intel based PCs then available (the i80486 was state of the art in late 1992 and early 1993) were too limited to run it even with additional memory, and this attempt too failed in public beta. NT Version 3.51, a stripped down variant designed to accommodate processor and memory limits, worked well enough to ship and gave rise to an NCD software project that later became the basis for Citrix Systems, but was in practice so slow and unstable an OS that public support for Microsoft started to erode.
NT 4.0, really little more than a port of DEC's VMS to Intel, was released in July of 1996 as a substitute and was the first Microsoft OS to fully integrate TCP/IP capabilities. This was an immediate hit as critics of 3.51 scrambled to get back on board with Microsoft and the usual publications made the usual announcements about the death of Unix.
The hype was so overblown that serious people predicted 30% annual growth for Microsoft until past 2010 while others trumpeted the end of brick and mortar retailing at the hands of the Microsoft E-Commerce machine. Unfortunately what the enthusiasm really led to was 500:1 price earnings ratios for some internet companies and billions in later losses for investors.
When Windows 3.1 appeared in April of 1992 it was a place holder for the much more significant release 3.11, delayed until October of 1992. This later version included the SMB (Server Message Block) based peer to peer networking promised for 3.1 and changed the industry. Windows 3.11, known as Windows for Workgroups, brought local area network file sharing into the Microsoft product line; it was therefore the first product to nail down the Microsoft client-server paradigm, and so set the stage for the all-Microsoft Office environment we often see today.
Windows for Workgroups, like Windows 95, 98, and ME, ran as an MS-DOS application, thereby maintaining backward compatibility with older hardware and inadvertently re-creating the two tiered architecture of the original CP/M product - this time with MS-DOS acting to serialize user input coming from applications running under the graphical shell.
|SQL stored procedures|
| Every database application has to have some logical processing, at a minimum to format data, but more commonly to do fairly complex things; for example, multiplying hours worked by hourly rates to get gross wages due.
In mainframe and mini-computer applications this logic is handled in the application code outside the database because that is the simplest and most natural way of doing it. Underlying that design, however, is the assumption that the operating system will resolve database serialization and other multi-user issues to make it possible to share one application among many users.
In the original Microsoft client server world you had many users, but no multi-user operating system. Since serialization had to be handled somewhere, it devolved to the database engine which then had to handle application logic as well as data storage and retrieval. The stored procedure idea was developed to accommodate this.
In a payroll example, an application request would start a script written in SQL (Structured Query Language, see Databases in Chapter 5) and executing within the database management system's memory area. This script would create a temporary payments table, read the hours from one table and rates from another, do the multiplication, store the results in the temporary table, and then send the requesting application client the contents of that temporary table.
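The steps above can be sketched in miniature. What follows is an illustrative simulation only, not period code: Python's built-in sqlite3 stands in for the client-server database engine, and all table and column names (hours, rates, temp_payments, and so on) are invented for the example rather than taken from any real payroll system.

```python
import sqlite3

# An in-memory database stands in for the shared engine; the schema
# below is invented purely to mirror the payroll example in the text.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE hours (emp_id INTEGER, hours_worked REAL)")
db.execute("CREATE TABLE rates (emp_id INTEGER, hourly_rate REAL)")
db.executemany("INSERT INTO hours VALUES (?, ?)", [(1, 40.0), (2, 35.5)])
db.executemany("INSERT INTO rates VALUES (?, ?)", [(1, 20.0), (2, 30.0)])

def gross_pay_procedure(conn):
    """Mimic the stored-procedure steps: build a temporary payments
    table inside the engine, do the multiplication there, and ship
    only the finished results back to the requesting client."""
    conn.execute("DROP TABLE IF EXISTS temp_payments")
    conn.execute("""
        CREATE TEMP TABLE temp_payments AS
        SELECT h.emp_id, h.hours_worked * r.hourly_rate AS gross_pay
        FROM hours h JOIN rates r ON h.emp_id = r.emp_id
    """)
    return conn.execute(
        "SELECT emp_id, gross_pay FROM temp_payments ORDER BY emp_id"
    ).fetchall()

print(gross_pay_procedure(db))  # [(1, 800.0), (2, 1065.0)]
```

The point of the pattern is the last step: only the contents of the temporary table travel to the client, with all logic executing inside the database engine's memory area.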
Operationally, stored procedures tend to be extremely inefficient, but are the only option in a PC client-server model where the database is the only multi-user component.
Notice that because all connections are handled via TCP/IP, the database running the stored procedure need not be on the same machine as the one storing the data - it only needs to be the one the application connects to first.
As a result a three tiered database access architecture - PC client, stored procedure server, and data server - quickly developed to support multi-user applications on Windows sized servers.
The local area networking capabilities within Windows 3.11 were severely limited, but adequate for the simple file and print sharing characteristic of most small offices.
Third parties, including Sun Microsystems, then ported what became known as TCP/IP "stacks" (because the software consisted of multiple layers) to Windows 3.11 to give it some internet access capabilities.
As a result Windows LANs became Windows WANs (wide area networks) that ran parallel to the LANs which, of course, continued to use SMB/NetBIOS networking. In combination these facilities enabled departmental workgroups to share files, obsoleted the previous "sneaker net" approach, gave users access to internet resources, and quickly enabled the development of stored procedure execution engines within client-server databases.
In this configuration a number of Windows PCs running a Windows specific piece of software known as a client connect to the database using TCP/IP. The database engine then handles all required serialization and is, therefore, the multi-user component that turns a collection of single user PCs into a multi-user system.
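That serialization role can be demonstrated with a small sketch - again a simulation rather than period code: SQLite stands in for the shared database engine, threads stand in for the network-attached single-user PCs, and the table name and row counts are invented for the example.

```python
import os
import sqlite3
import tempfile
import threading

# A file-backed SQLite database stands in for the shared server engine;
# each thread plays the part of a separate single-user PC client.
path = os.path.join(tempfile.mkdtemp(), "shared.db")
init = sqlite3.connect(path)
init.execute("CREATE TABLE log (client INTEGER, n INTEGER)")
init.commit()
init.close()

def client(client_id):
    # Each "PC" opens its own connection; the engine, not the clients,
    # serializes the concurrent writes.
    conn = sqlite3.connect(path, timeout=30)
    for n in range(50):
        conn.execute("INSERT INTO log VALUES (?, ?)", (client_id, n))
        conn.commit()
    conn.close()

threads = [threading.Thread(target=client, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

count = sqlite3.connect(path).execute("SELECT COUNT(*) FROM log").fetchone()[0]
print(count)  # 4 clients x 50 rows, and none lost to collisions
```

No client coordinates with any other, yet every row survives, because the one multi-user component - the engine - queues the writes.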
Oracle was the first major company to support this approach to multi-user access for isolated single user systems, developing and selling its own TCP/IP "stack" and related tools, known as SQL*Net, to do this. Sybase, however, took the next step - that of optimizing the database for stored procedure execution rather than database management - and so quickly became the pre-eminent PC database supplier.
There have been thousands of changes to the Windows architecture since, but none are technically significant. Today's latest Windows for Workgroups style LAN is implemented using vastly more powerful hardware running Windows Server 2003 instead of Windows 3.11, uses TCP/IP instead of SMB/NetBIOS, and connects to a client running Windows XP instead of Windows 3.11 - but is fundamentally the same in terms of its structure and operation.
- These excerpts don't (usually) include footnotes, and most illustrations have been dropped as simply too hard to insert correctly. (The WordPress HTML "editor" as used here enables only a limited HTML subset and is implemented in a way that forces frustrations like the CP/M line delimiters inherited by MS-DOS.)
- The feedback I'm looking for is what you guys do best: call me on mistakes, add thoughts/corrections on stuff I've missed or gotten wrong, and generally help make the thing better.
Notice that getting the facts right is particularly important for BIT - and that the length of the thing plus the complexity of the terminology and ideas introduced suggest that any explanatory anecdotes anyone may want to contribute could be valuable.
- When I make changes suggested in the comments, I make those changes only in the original, not in the excerpts reproduced here.