With Wednesday's release of its next major operating system milestone -- Release Candidate 2 of Windows 2000 -- Microsoft acknowledged that it has moved its component load balancing technology out of the base product.
But Microsoft claims that is the only piece it has eliminated from RC 2, and that cluster-aware applications will not be negatively affected by the packaging decision.
Microsoft maintains that Windows 2000 is now in the final fit-and-finish phase, with no new features being added or eliminated. The company is still targeting the end of 1999 for delivery of the much-delayed operating system, although partners say it recently slipped its expected release-to-manufacturing date from November to December. Starting Wednesday, Microsoft will begin burning RC 2 CDs for its 650,000 Beta 3 testers; the first discs should start reaching testers next week.
The changes Microsoft says it has made to RC 2 since RC 1 include new software and hardware drivers, improved management tools, an update to Windows Media Services and support for Microsoft's Web Telephony Engine (the successor to Microsoft's Web IVR product for creating and running telephony applications).
With RC 2, Microsoft released the system requirements for its forthcoming Windows 2000 SKUs. To run Windows 2000 Professional, Microsoft recommends a 133MHz or higher Pentium, a minimum of 64MB of RAM and 650MB of free disk space. For both Windows 2000 Server and Advanced Server, it recommends a 133MHz or higher Pentium CPU, 256MB of RAM and 1GB of free hard disk space.
COM+, the object plumbing used by Windows 2000, includes a component load balancing feature that distributes business logic across multiple CPUs at the middle tier. The feature was in RC 1 of Microsoft's Advanced Server SKU and was also expected to be part of the high-end Windows 2000 Datacenter SKU, which has yet to go to beta test. For beta testers who need the facility, Microsoft is providing COM+ load balancing as a free, downloadable option from its Product Support Services division.
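Conceptually, this kind of middle-tier load balancing works like a router that hands each component-activation request to the next machine in a pool. The following is only an illustrative sketch of that general idea in Python -- it is not the COM+ API, and the class, method and server names are invented for the example:

```python
# Illustrative sketch of middle-tier component load balancing.
# NOT the COM+ API: requests to activate business-logic components
# are simply spread round-robin across a pool of application servers.
from itertools import cycle


class ComponentRouter:
    """Routes component-activation requests across a server pool."""

    def __init__(self, servers):
        self._pool = cycle(servers)  # simple round-robin policy

    def activate(self, component_name):
        server = next(self._pool)
        # A real system would create the component remotely on 'server';
        # here we just report where the request would be dispatched.
        return f"{component_name} -> {server}"


router = ComponentRouter(["app1", "app2", "app3"])
print(router.activate("OrderProcessor"))  # -> OrderProcessor -> app1
print(router.activate("OrderProcessor"))  # -> OrderProcessor -> app2
```

Real implementations such as COM+ load balancing use response-time measurements rather than a fixed rotation, but the routing role is the same.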
Microsoft denies that it pulled the COM+ load balancing technology because it was unstable or unfinished. "Beta testers said they need the ability to better manage it [COM+ load balancing] and deploy apps using it," says Michael Gross, Microsoft COM+ product manager. Gross also denies that the decision to move COM+ load balancing means the company is lessening its commitment to Windows 2000 clustering in any way. He says Microsoft will still deliver in Advanced Server and Datacenter, as promised, the network load balancing technology based on the WLBS product it acquired when it bought Valence Research last year.
Microsoft will still provide the ability to balance Web server and e-commerce front-end traffic via this network load balancing capability. It will also deliver two-way clustering in the base Advanced Server SKU and four-way clustering in Datacenter, as it has said it would, Gross says.
With the latest tweak, is Windows 2000 finally ready to go? The answer, according to Microsoft senior vice president Jim Allchin, is yes. In an interview with ZDNet sister publication Sm@rt Reseller, Allchin notes, "We will not RTM until we have proven that Windows 2000 is more reliable than NT 4 with SP4/5 and all customer production deployments are proven out."
How will Microsoft achieve these lofty goals? Allchin explains the company is "running a stress test each day covering thousands of machines which randomly tests deeply all parts of the system.
"Second," he adds, "we have long running duration tests where we measure server activity. Third, we measure Uptime precisely on production machines running different versions of NT. We do this within our own IT organisation (e.g., Web servers, file services, directory servers, etc.) as well as partners who have allowed us to run Windows 2000 on their servers. This is done on hundreds and depending on the build sometimes thousands of production machines." From this feedback, Microsoft knows "of every reboot, how long it was down, and for what reason," Allchin claims.
"Fourth," he continues, "we are using a variety of sophisticated new tools to analyse the source code for issues. These tools have uncovered latent problems that have existed that are not fixed in current versions of NT.
"Fifth, we have a dedicated penetration team that has done extensive code reviews of the system investigating potential security issues. We have also hired outside firms to do this." These problems are then analysed, addressed, and fixed. Not every application's troubles with the new operating system can be taken care of in this manner.
"Server application compatibility is also in good shape," Allchin says. "However, we have uncovered some application errors that we can't work around -- because they are reliability issues or security issues. In those cases we have worked with the ISVs (independent software vendors) to provide a patch."