Strict implementation and enforcement of standards have proven to be a powerful tool for IT organizations seeking to reduce end-user operational costs. To enable this, enterprises must limit variation within their fleets and maintain a high level of PC hardware consistency.
META Trend: During 2003/04, client standardization and managed build/distribution processes will enable an adaptive and cost-effective end-user computing environment, with focus placed on certification rather than technology homogeneity. Through 2005/06, IT staff will face challenges in managing pervasive client devices. By 2007/08, client computing models centered on IT group device ownership will yield to managed subscription services across corporate and personal devices.
In the current tough marketplace, companies are carefully examining every aspect of their operations, with a specific focus on reducing and controlling costs. The end-user environment has long been seen as a candidate for cost reduction, and indeed, a great deal has been written during the past several years about the total cost of ownership (TCO) for supporting end users. Although companies have made significant strides in reducing the operational and capital costs associated with outfitting their end users, pressure is increasing to squeeze even more out of these segments. Through 2004, we expect the end-user environment to fund more than its share of cost reductions, and we do not expect this pressure to ease up before 2006.
Since 1998, large enterprises have significantly reduced end-user operational costs by implementing best practices with respect to process automation, procurement policies, application architecture, centralization, and service-level adjustments. However, the greatest factor in these reductions has been the implementation and enforcement of strong standards in outfitting users, in particular the shift to certified image builds. The creation and maintenance of a well-tested image is now standard practice in more than 80% of Global 2000 (G2000) organizations. Standardized images are most beneficial during software upgrades and migrations. Our research indicates that organizations with a few heavily standardized builds have significantly lower migration costs than more heterogeneous companies with many PC builds. This is especially true for large-scale OS migrations (e.g., to Windows XP). Moreover, the efficiency with which IT organizations can roll out new applications and systems has increased dramatically.
The downside of standardized images is their fragility when confronted with hardware variations. If a new PC does not match the original master machine, the operating system and applications on the new machine are likely to fail. Windows 2000 and Windows XP have reduced some of the complexity of testing PC hardware permutations to certify builds through their enhanced ability to recognize and install different drivers for minor hardware variations, which results in more forgiving images. However, each build must still be recertified (i.e., fully tested and retuned) for even minor variations that crop up in the PC fleet. Our research with G2000 organizations indicates that properly certifying a PC image requires approximately one full-time equivalent (FTE) annually, or $100,000 for a fully burdened employee. Because switching vendors virtually guarantees the need to rework all images, this is one of the primary switching costs when enterprises change PC vendors or models. Currently, 65% of the G2000 have a PC fleet built primarily on a single vendor for client systems, and we expect this to rise to 80% by 2005. For standardized images to work without burdening the desktop engineering group, systems must remain consistent for as long as the enterprise purchases the same model (typically 12-15 months).
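The recertification trigger described above can be expressed as simple tooling logic. The following sketch (component names and values are illustrative, not drawn from any vendor inventory API) compares a candidate machine's hardware inventory against the certified baseline and flags only those variations that would force a build recertification.

```python
# Sketch: flag image-impacting hardware drift against a certified baseline.
# Component names and values below are hypothetical examples.

# Components whose variation typically forces image recertification;
# memory and disk sizes, by contrast, rarely affect the image.
IMAGE_IMPACTING = {"chipset", "bios", "graphics", "nic"}

def drift_report(baseline: dict, candidate: dict) -> list:
    """Return the image-impacting components that differ from the baseline."""
    return sorted(
        comp for comp in IMAGE_IMPACTING
        if candidate.get(comp) != baseline.get(comp)
    )

certified = {"chipset": "865G", "bios": "A05", "graphics": "integrated-865G",
             "nic": "PRO/100 VE", "memory": "512MB", "disk": "40GB"}
incoming  = {"chipset": "865G", "bios": "A07", "graphics": "integrated-865G",
             "nic": "PRO/100 VE", "memory": "1GB", "disk": "40GB"}

changes = drift_report(certified, incoming)
if changes:
    print("Recertification required; changed components:", changes)
else:
    print("Build remains certified for this configuration.")
```

Note that the larger memory in the incoming unit is ignored, while the BIOS revision change is flagged; this mirrors the distinction the research note draws between benign and image-impacting variations.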
Ensuring consistency begins with product specification. The items most often used to specify systems (e.g., processor, memory, disk) have little impact on the image, whereas items that have traditionally been left to the vendor (chipset, graphics adapter, basic I/O system [BIOS], network interface card, etc.) have a significant impact. Notebook computers generally present even more serious consistency issues, owing to the larger number of embedded components (e.g., modems, network interface cards, power management, PC card controllers). Short product life cycles and the rapid churn of mobile technology further exacerbate the situation. Enterprises must change the way products are specified by adding much greater detail to configurations as part of purchasing arrangements, going beyond model number, processor, memory, and disk size. Buyers should focus more attention on selection of the chipset than on selection of the processor. Currently, organizations should select chipsets with the “G” extension (e.g., 865G), since these have the longest and most stable product path.
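The specification discipline described above can be checked mechanically. This sketch (field names and the sample configuration are hypothetical, not a procurement standard) validates that a purchase specification pins down the image-impacting components rather than leaving them to the vendor.

```python
# Sketch: verify a PC purchase specification goes beyond the traditional
# fields (model, processor, memory, disk) and pins down the components
# that actually affect the certified image. All field names are illustrative.

TRADITIONAL = {"model", "processor", "memory", "disk"}
IMAGE_IMPACTING = {"chipset", "bios_family", "graphics_adapter", "nic"}

def missing_detail(spec: dict) -> set:
    """Return image-impacting fields the specification leaves to the vendor."""
    return IMAGE_IMPACTING - spec.keys()

spec = {
    "model": "hypothetical-workhorse-1",
    "processor": "P4 2.6GHz",
    "memory": "512MB",
    "disk": "40GB",
    "chipset": "865G",                  # pinned: "G" extension chipset
    "bios_family": "A0x",
    "graphics_adapter": "integrated 865G",
    "nic": "PRO/100 VE",
}

gaps = missing_detail(spec)
print("Specification complete" if not gaps else f"Underspecified: {sorted(gaps)}")
```

A specification listing only the traditional fields would report all four image-impacting components as gaps, making the exposure visible before the purchasing arrangement is signed.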
To assist with consistency issues, first-tier PC manufacturers provide “workhorse” models that do not change for extended periods of time. With these models, vendors identify specific configurations for which they ensure that the BIOS, chipset, and basic drivers will not change for 12 months (or sometimes longer). During this time, the vendor may offer optional updates that let the buyer decide whether to implement the change, but arbitrary or mandatory changes are eliminated. Organizations should focus on purchasing workhorse models that carry a consistency guarantee. Furthermore, buyers must communicate the importance as well as the terms of consistency to the vendor (and any channel partners), because no vendor will provide a truly consistent platform unless it is specifically requested. Buyers should also minimize the number of PC vendors in the environment, maintain a strong working relationship with a strategic Tier 1 supplier, and avoid switching vendors frequently. In addition, buyers should insist that the vendor provide early, proactive, and complete disclosure of image-impacting changes; we recommend 30-90 days' notice before changes take effect. Such notifications should be automatic, timely, and complete enough that a customer can prepare for the change.
Beyond workhorse models, PC vendors are also providing tools and programs to aid in the maintenance of images. Under its ThinkVantage Technologies brand, IBM has released ImageUltra, a service offering that guarantees images will work on any IBM hardware, including future hardware releases, by creating a single image containing the drivers and applications for all supported hardware and software permutations. This eliminates the need to recertify images for minor changes. ImageUltra can also assist with software compliance by ensuring that only designated users get certain software packages installed with their image. In addition, customers can use the new ImageUltra Builder tool to extend this functionality to non-IBM devices, though this usage is more limited.
Extending consistency enablement further up the supply chain, Intel has recently announced its Intel Stable Image Platform Program (SIPP; formerly code-named Granite Peak). Under this program, Intel guarantees that selected chipsets (and their associated drivers) will remain stable for 15 months, with the platform updated annually. This gives enterprise buyers that standardize on systems built on that platform three months for evaluation and 12 months of stable delivery. Intel is providing this program on both desktop and mobile platforms. SIPP is one of the key messages associated with Centrino and is being delivered with the 865G chipsets (formerly code-named Springdale).
Buyers must also be realistic and create processes that can accept some level of inconsistency, since product refreshes are sometimes necessary (e.g., for parts availability or patches). Insisting on maintaining an aging configuration will result in higher purchase costs, impaired functionality, and potential reliability issues. System builds should be designed to minimize dependencies on specific hardware pieces (e.g., via use of plug-and-play devices), and known dependencies should be well documented. In addition, customers should investigate new software utilities and services that can assist with the image management process (e.g., Dell X-Image, IBM's ImageUltra). Organizations must also exercise caution when timing the switch from one model to another: buying at the end of a product's life cycle guarantees shortages and consistency problems, while buying too early in the life cycle should likewise be avoided, because products inevitably undergo minor revisions during the first two months.
Business Impact: Lowering total cost of ownership for end-user platforms involves implementing a wide variety of diverse best practices, rather than focusing on a single component.
Bottom Line: Organizations must make PC system consistency a priority and adjust purchasing processes to minimize the likelihood of product changes, even when this means buying a slightly more expensive system.
META Group originally published this article on 8 August 2003.