Through 2007, ill-defined IT deliverables, misaligned business expectations, and unknown performance levels, combined with economic pressures, will force more than 30% of IT organizations (ITOs) to implement formal quality improvement initiatives. Successful IT groups will leverage the successful quality initiatives implemented within their lines of business (LOBs; e.g., Six Sigma), modifying those initiatives to make them more IT relevant.
META Trend: Through 2008, IT operations groups seeking to effectively develop and enhance their operational processes must formalize their efforts, focusing on process definitions, performance measurement, and analysis of potential refinements - ultimately creating a culture that embraces continuous improvement. Although most IT operations groups' efforts are still in their infancy, significant gains will be made by leveraging the process refinement practices experienced by both IT (e.g., ITIL) and non-IT oriented (e.g., Six Sigma) organizations.
Seventy-five percent of internal and external IT suppliers are experiencing difficulties in providing evidence of technology performance and return on investment (ROI) to their customers. Although many IT operations groups (ITOGs) have made strides in the development of IT service catalogs and modest improvements in IT performance reporting, more than 80% of ITOGs are simply unprepared to provide “evidence” of performance. Unfortunately, this problem makes it difficult not only to sustain value-based relationships with LOBs, but also to justify staffing levels and delivery costs.
Following recent economic pressures, enterprises are increasingly demanding that ITOs be held to the same levels of consistency and performance as their LOBs. This issue has been compounded by ITOGs that have sought to satiate IT customer expectations by delivering unsupported service-level agreements (SLAs) and business relationship management (BRM) initiatives. Although fewer than 10% of ITOGs have shown interest in quality improvement initiatives, we believe this number will grow to about 30% by 2006. Through 2006, the failure of ITOGs to effectively structure, staff, and set expectations for their quality management initiatives will result in the failure of more than 50% of those initiatives. During this period, Six Sigma will emerge as the leading quality management initiative, though modification to the methodology will be required for ITOGs to succeed. Significant risks to quality management success (e.g., the establishment of performance baselines; cultural issues; process integration inconsistencies) will be reduced via executive support and the inclusion of knowledge transfer/tools.
The Six Sigma Quality Methodology
Six Sigma is a measurement-centric approach and methodology for eliminating defects in any process. Although it is one of the most popular quality management disciplines in the manufacturing environment, its purpose and basis are still relatively unknown in the IT operations world. As a result, some ITOs have blindly headed into a Six Sigma quality management initiative without fully setting expectations, modifying the methodology to best fit an IT environment, or effectively communicating the goals of the initiatives.
There are two primary submethodologies within Six Sigma: DMAIC (define, measure, analyze, improve, control) and DMADV (define, measure, analyze, design, verify). The DMAIC process is an improvement method for existing processes for which performance does not meet expectations, or for which incremental improvements are desired. It requires ITOGs to define a particular performance area, establish a performance baseline, analyze and determine the root cause(s) of the defects, and modify the process to reduce defects.
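The first three DMAIC phases can be illustrated with a minimal sketch. The ticket records, category names, and failure flags below are hypothetical examples, not data from this article; the point is only the shape of the cycle: define the defect, measure a baseline, then analyze for root causes before improving.

```python
# A minimal sketch of the first DMAIC phases applied to IT ticket data.
# All records and category names here are hypothetical illustrations.

from collections import Counter

# Define: the process under study and its defect (a "failed" ticket)
tickets = [
    {"id": 1, "category": "config error", "failed": True},
    {"id": 2, "category": "config error", "failed": True},
    {"id": 3, "category": "hardware",     "failed": False},
    {"id": 4, "category": "scheduling",   "failed": True},
    {"id": 5, "category": "hardware",     "failed": False},
]

# Measure: establish the performance baseline (defect rate)
defects = [t for t in tickets if t["failed"]]
baseline = len(defects) / len(tickets)
print(f"Baseline defect rate: {baseline:.0%}")

# Analyze: rank the root-cause categories behind the defects
causes = Counter(t["category"] for t in defects)
print("Top cause:", causes.most_common(1)[0][0])

# Improve and Control would follow: change the process to address the
# top cause, then re-measure against the baseline to verify the gain.
```

Re-measuring against the recorded baseline after each change is what distinguishes DMAIC from ad hoc tuning.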
Six Sigma is an IT-appropriate process-improvement methodology, though the fundamental objective is to reduce errors to fewer than 3.4 defects per million executions (regardless of the process). Given the wide variation in IT deliverables (e.g., change management, problem management, capacity management) and roles and tasks within IT operational environments, ITOGs must determine whether it is reasonable to expect delivery at a Six Sigma level. To put this issue into context, a Six Sigma performance level equates to 99.99966% accuracy. In most IT operational environments, 99.999% accuracy is often regarded as the pinnacle of performance and is typically seen only in environments that are highly automated and reserved for very discrete technologies (e.g., server availability). As a result, one should not necessarily expect IT performance to be held to the same defect rates as certain manufacturing groups. We believe that through 2006, ITOGs engaged in Six Sigma efforts will follow the process-refinement strategies within the methodology (i.e., define, analyze, and improve) but will not adhere to strict Six Sigma performance levels.
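The accuracy figures above follow directly from the defects-per-million-opportunities (DPMO) definition; a short calculation makes the gap between Six Sigma and "five nines" concrete. The function name is our own, but the arithmetic is standard.

```python
# Convert defects per million opportunities (DPMO) into the percentage
# of defect-free executions, to compare Six Sigma with "five nines".

def accuracy_from_dpmo(dpmo: float) -> float:
    """Return the percentage of defect-free executions for a given DPMO."""
    return 100.0 * (1_000_000 - dpmo) / 1_000_000

# Six Sigma: 3.4 defects per million opportunities
six_sigma = accuracy_from_dpmo(3.4)

# "Five nines" (99.999%), often the ceiling in IT operations, allows
# about three times as many defects: 10 per million
five_nines = accuracy_from_dpmo(10)

print(f"Six Sigma accuracy:  {six_sigma:.5f}%")
print(f"Five nines accuracy: {five_nines:.3f}%")
```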
Effectively Setting Improvement Targets
The objective of the Six Sigma methodology is the implementation of a measurement-oriented strategy focused on process improvement and defects reduction. A Six Sigma defect is defined as anything outside customer specifications. Because the outputs of IT are not as consistent as they are in manufacturing processes (and should not necessarily be expected to be), it will be the responsibility of the ITOG to define the processes, expected outputs, and defects. As a result, through 2007, ITOGs that implement quality management initiatives (e.g., Six Sigma) must effectively do the following three things:
Through 2006-08, ITOGs will begin standardizing platform-agnostic, sustaining-work activities, as well as the integration points among these activities, bolstering ITOs’ ability to enhance performance across the IT delivery life cycle and reporting/improvement activities (e.g., quality, cost reduction, reporting structures). By 2006, 60% of Global 2000 organizations will deploy first-generation process bundles (e.g., change/production acceptance, customer advocacy, asset management) as virtual centers of excellence, formalizing/tuning these process groups as required through 2008.
In many cases, ITOG process development initiatives will serve as the basis for many improvement/Six Sigma initiatives. Conversely, ITOGs that expend effort on process definition/development without incorporating baseline measurement intended to manage ongoing performance will fail to maximize process delivery.
Major Customer Pains
Numerous major pain points will challenge customers interested in implementing Six Sigma and other IT quality initiatives through 2006. Among these pain points will be the ITOG's ability to incorporate and manage the Six Sigma initiative in addition to its day-to-day activities (similar to other ITOG project work efforts). Indeed, success will be attained by fewer than 25% of organizations taking on Six Sigma. Project setup, project prioritization, cultural misalignment, and ongoing management of new efforts will be among the quality-related issues that can potentially cripple success. Of those organizations that do realize benefits, success will depend on the following:
User inexperience with Six Sigma will drive many users to seek assistance for Six Sigma project setup and management. Numerous vendors (e.g., Microsoft, Niku, Instantis, Breakthrough Management Group) released select Six Sigma management tools in the past 12 months. Although the market is still nascent, Six Sigma tool vendors will continue to proliferate through 2006, releasing relatively low-cost products featuring the following:
Will Six Sigma Initiatives Save Cost?
Cost savings should not be the sole focus of a Six Sigma undertaking. Indeed, the current state of the IT environment is not well defined from a process perspective, and therefore it is quite difficult to establish performance and cost baselines. In fact, a recent META Group survey reported that fewer than 55% of ITOs regularly track performance metrics. As a result, the majority of ITOGs will experience initial cost increases associated with setting performance watermarks.
Similar to the IT product portfolio philosophy, ITOGs implementing Six Sigma must balance cost drivers when optimizing components of their IT products/services. To effectively manage the ultimate cost of a delivered product, it must be recognized that the final deliverable is often tied to multiple processes, hardware, software, etc. As a result, it may be necessary to increase the cost of certain variables (e.g., specific automation) to attain a reduction in the final cost of the item delivered. Therefore, it is critical that the ITOG identify the ultimate deliverables and all the cost drivers associated with them; otherwise, stakeholders may come to expect cost reductions across each of the supporting processes individually.
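The cost-driver tradeoff can be sketched with a toy model: raising spend on one component (automation) lowers the total delivered cost by cutting manual effort and rework. All figures, rates, and the cost function itself are illustrative assumptions, not data from this article.

```python
# A hedged sketch of the cost-driver tradeoff: increasing the cost of
# one variable (automation) to reduce the final delivered cost.
# All figures below are hypothetical illustrations.

def delivery_cost(automation_spend: float, manual_hours: float,
                  hourly_rate: float, rework_rate: float) -> float:
    """Total cost of one delivered service instance (toy model)."""
    labor = manual_hours * hourly_rate
    rework = labor * rework_rate   # defects force a share of repeat work
    return automation_spend + labor + rework

# Before: cheap tooling, heavy manual effort, frequent rework
before = delivery_cost(automation_spend=50, manual_hours=10,
                       hourly_rate=80, rework_rate=0.30)

# After: pricier automation cuts both manual hours and the rework rate
after = delivery_cost(automation_spend=200, manual_hours=4,
                      hourly_rate=80, rework_rate=0.05)

print(f"before: ${before:.0f}, after: ${after:.0f}")
```

Judged on its own line item, automation cost quadrupled; judged against the ultimate deliverable, total cost fell by roughly half.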
Because most ITOG participants learned to perform their IT roles via trial and error, few (if any) true cross-platform process standards exist in the IT operational environment. Indeed, fewer than 5% of ITOGs have their processes defined to the “task” level. Multivariant process performance dependencies, the identification of cross-process relationships, and effective automation of those workflows via tools have not historically been managed across the IT operational environment. As a result, many individuals often feel threatened by any impending quality management initiative. In fact, issues associated with corporate culture, such as the ability of the individuals participating in the project to establish project timelines, goals, deliverables, etc., and the ability of the group to coordinate across their domains, may be the most significant detriment to an organization’s Six Sigma efforts. ITOGs will find it increasingly difficult to manage to Six Sigma levels, especially given the current status of process definitions and performance measurement.
To combat some of the potential negative impacts brought by culture, ITOGs should ensure that improvement initiatives are finite enough to prevent the project’s scope from encompassing any more than 10% of an individual’s “whole job.” This will reduce the potential “threat” the Six Sigma project may pose to an individual and limit negative impacts from personal interests (e.g., intentionally not sharing knowledge, not working with project managers).
Real Versus Artificial Quality Gains
Unfortunately, there is a real risk associated with setting any performance target. There is often a temptation to present performance figures that are technically accurate (e.g., availability of an application running on the server) and that promote a positive impression of the area in question, yet are irrelevant to the overall performance of the business. This issue has plagued IT performance reporting for years (with negative results) and holds true for IT Six Sigma initiatives.
Through 2006, ITOGs that are serious about improving performance must have a management group that is willing to report true IT performance (both good and bad) to the business, effectively baseline expectations, and correctly communicate performance improvement over time. This may come more easily to IT groups that have been “required” to implement Six Sigma by their top-line business executives than to those pursuing a “self-imposed” improvement initiative.
High-Level Six Sigma Project Example Targets: IT Change Management
Bottom Line: Through 2007, IT groups engaging in quality initiatives (e.g., Six Sigma) will be required to first define their quality targets (e.g., processes) and identify realistic improvement goals, managing customer quality expectations accordingly. Pressure to attain manufacturing-quality levels and cultural difficulties will cripple more than 15% of IT quality improvement initiatives.
META Group originally published this article on 10 April 2003.