During 2001/02, IT organizations (ITOs) focused on more effective use of existing storage assets and various cost-cutting measures. With the improved economic climate, ITOs are now taking a more strategic look at storage infrastructure development. This Practice provides a template as a starting point for creating and articulating a storage strategy that aligns technology with changing business requirements, treating strategy as a process rather than an event.
META Trend: Through 2005, 2Gb Fibre Channel (FC) will remain the dominant storage-area network (SAN) topology for Global 2000 data centers, driving storage utilization improvements of 25%+ versus direct-attached solutions. Beginning in 2004, emerging 10Gb Ethernet and FC, iSCSI, and common InfiniBand-based I/O subsystems will interconnect and coexist with FC in SAN backbones. By 2005/06, maturing network storage management tools will drive 10x+ improvements in storage administrator efficiencies (e.g., 6TB to >60TB/administrator).
Storage management has evolved into a discrete IT activity requiring an increasingly specialized skill set and organizational focus. Although storage specialists have been part of mainframe data center environments for many years, we estimate that fewer than 25% of Unix and Windows ITOs have a storage management team per se. Due to the increasing cost and complexity of storage systems, more than 75% of Unix/Windows ITOs will create such a team by 2005/06. Storage hardware costs will decline by 35% through 2008/09 on a per-unit basis; however, for total storage spending to remain flat, growth cannot exceed 54%.
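The flat-spending arithmetic above can be checked directly: if per-unit costs fall 35%, capacity can grow by 1/(1 − 0.35) − 1, or roughly 54%, before total spending rises. A minimal sketch (the function is a generic illustration, not a META Group model):

```python
# Maximum capacity growth that keeps total storage spending flat,
# given a per-unit cost decline (35% figure taken from the text).
def max_flat_spend_growth(unit_cost_decline: float) -> float:
    """Return the capacity growth rate at which total spend stays constant."""
    return 1 / (1 - unit_cost_decline) - 1

growth = max_flat_spend_growth(0.35)
print(f"{growth:.0%}")  # → 54%
```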
Unfortunately, because computer science programs offer few courses in storage management, ITOs will be compelled either to develop appropriate skills internally or to hire storage managers from other organizations. Through 2007/08, storage management skills will be in disproportionately high demand compared to other skill groups (e.g., for servers, DBMSs, and application development). Moreover, as new storage management products are offered through 2005/06, ITOs will need to invest in training/learning activities for storage administrators. This burden will range from 80 to 240 hours per person.
It is the combination of these three issues (cost, complexity, and skills) that makes persistent strategic storage planning an imperative activity. Currently, most storage planning is limited to hardware/software selection and configuration management. Yet storage deployment decisions have pervasive effects on overall IT initiatives (e.g., high availability, application performance, business continuance, compliance). Therefore, the view of storage management practices must be set within the context of total business management. This primer is designed to provide ITOs, and storage managers especially, with a template that can be used in creating a storage strategy document. The end product is intended not only to be a usable document for the ITO, but also to communicate the strategy to non-IT managers in terms that are understandable at a business level, further promoting senior managers’ confidence in the ITO. A successful strategy document must contain the components discussed in the sections that follow.
In most cases, a storage strategy document is intended to foster executive approval for upcoming expenditures. Unfortunately, storage is an inherently “techie” topic that few people outside the IT industry understand. Therefore, storage planners should not expect non-IT professionals to read beyond the executive summary, apart from a cursory scan of the rest. Consequently, the success or failure of the plan may depend on the executive summary.
A good executive summary should be no longer than two pages and digestible in fewer than five minutes. The reader requiring more detail can then delve into the full text of the document. Fundamentally, the executive summary must answer the following questions: 1) What? 2) Why? 3) When? and 4) How much (e.g., cost, staffing)? Although this section should speak directly to the recommended course of action, it should also contain enough discussion of alternatives to demonstrate that the planner has thoroughly weighed the options. However, any discussion of technology should be in high-level business terms. For example, rather than stating, “We recommend that a storage-area network be implemented to improve manageability,” the planner might say, “We recommend that storage be consolidated into a common pool, so that assets can be more adaptive and fully used.”
Although hardware/software configuration is a component of the current-state analysis and must be thoroughly documented, the key question to answer in this section is, “What’s wrong?” An analysis of strengths, weaknesses, opportunities, and threats (i.e., a SWOT analysis) is often the most effective means of communicating these issues (see Figure 1).
Up-to-date storage system benchmarks are also an important part of the current-state analysis. Examples of these benchmarks include storage per administrator, average uptime, and average recovery time. Although these metrics can be compared to industry standards or best practices, such comparisons are often either unavailable or meaningless, due to uncontrollable variables. These metrics should be used instead as a “setup” for the measurable results and success criteria presented later in the report.
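Baseline metrics of this kind are straightforward to compute from operational records. The figures and field names below are illustrative assumptions, not a standard schema:

```python
# Illustrative current-state benchmarks (sample figures, not real data).
managed_tb = 48            # total managed storage, in TB
administrators = 8
uptime_hours = 8700        # observed uptime over the year
total_hours = 8760         # hours in a (non-leap) year
recovery_times_hours = [2.5, 4.0, 1.5, 6.0]  # per-incident restore times

tb_per_admin = managed_tb / administrators
avg_uptime = uptime_hours / total_hours
avg_recovery = sum(recovery_times_hours) / len(recovery_times_hours)

print(f"Storage per administrator: {tb_per_admin:.1f} TB")
print(f"Average uptime: {avg_uptime:.2%}")
print(f"Average recovery time: {avg_recovery:.1f} hours")
```

Tracked period over period, these figures become the “setup” baseline against which the success criteria later in the report are judged.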
The Business Environment and Business Drivers
Making the connection between business drivers and technology is the key not only to an effective storage strategy but also to establishing organizational credibility. Completing this section of the report is more of a due-diligence exercise than a presentation of in-depth research. The business environment and drivers are macro factors, and as such, they do not drive specific storage deployments. Instead, these macro factors (e.g., industry maturity, volatility, profitability) will drive deployment philosophies. For example, organizations in mature industries with low gross margins (e.g., food suppliers, utilities) are more likely to adopt only proven technologies with a lower risk of deployment. On the other hand, companies in developing or volatile industries (e.g., technology firms, aerospace) may be willing to take a chance on promising but unproven technology in hopes of gaining a competitive edge. Neither choice is right or wrong per se, but the decision must be made consciously so that it is understood by all participants and consistent in its application across the organization.
Internal business factors will also drive strategy decisions. For example, organizations that are geographically dispersed have different needs than those that are centrally located. Plant expansion or contraction, as well as merger/acquisition intentions, will affect architectural decisions. Determining such intentions or eventualities can probably be accomplished through a few conversations with business managers.
The second aspect of aligning business with technology is forecasting the technology. Whereas technology assets typically have a three-year useful life, evolutionary and revolutionary technologies may be introduced more frequently. Knowing which emerging technologies may be beneficial (and when) will help storage planners monitor technology adoption and avoid pre-adoption hype.
In the same way that it is critical to understand the organization’s business environment, storage planners must assess the state of the technology they are evaluating. For example, mature technologies (e.g., storage arrays) are relatively less risky to purchase than immature technologies (e.g., storage management software or virtualization). Mature technologies can be planned and purchased on a more strategic (long-term) basis, while immature technologies should be purchased in light of the tactical horizon (e.g., two years).
Although many organizations do not have formal storage policies or service-level agreements (SLAs), most have these at least on an informal or intuitive basis. Complete storage strategies will state policies and SLAs explicitly. These statements must be noted in business terms so that business managers can make decisions regarding which service level is needed for operation (see Figure 2).
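Stating tiers explicitly, even as a simple table, lets business managers choose a service level without parsing technology details. The tier names, targets, and cost ratios below are hypothetical examples:

```python
# Hypothetical service tiers stated in business terms (targets are examples).
sla_tiers = {
    "Platinum": {"uptime": 0.9999, "max_recovery_hours": 1,  "relative_cost": 3.0},
    "Gold":     {"uptime": 0.999,  "max_recovery_hours": 4,  "relative_cost": 2.0},
    "Silver":   {"uptime": 0.99,   "max_recovery_hours": 24, "relative_cost": 1.0},
}

def tier_for(required_uptime: float) -> str:
    """Return the cheapest tier meeting a required uptime level."""
    eligible = {name: t for name, t in sla_tiers.items()
                if t["uptime"] >= required_uptime}
    return min(eligible, key=lambda name: eligible[name]["relative_cost"])

print(tier_for(0.995))  # → Gold
```

Framing the choice this way makes the cost/service trade-off a business decision rather than a technology debate.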
Analysis of Alternatives
The analysis of alternatives section provides further detail related to the executive summary and ensures rigor in developing the strategy. Its main function is to facilitate discussion among business managers if they wish to delve into the various options. A modified “Ben Franklin” diagram is the easiest way to communicate this information (see Figure 3). The analysis should always include the “do nothing” alternative, which is almost always a viable option. Furthermore, although numerous alternatives may be available, only the two to four most serious options generally need to be presented.
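A modified Ben Franklin comparison can also be scored numerically to keep the discussion concrete. The alternatives, criteria, weights, and scores below are placeholders for illustration only:

```python
# Placeholder weighted comparison of alternatives, always including
# "do nothing" as a baseline. Weights and scores are illustrative.
criteria_weights = {"cost": 0.4, "manageability": 0.3, "risk": 0.3}

# Scores run from 1 (poor) to 5 (good) per criterion.
alternatives = {
    "Do nothing":         {"cost": 5, "manageability": 2, "risk": 3},
    "Consolidate to SAN": {"cost": 3, "manageability": 5, "risk": 3},
    "Incremental DAS":    {"cost": 3, "manageability": 3, "risk": 4},
}

def weighted_score(scores: dict) -> float:
    return sum(criteria_weights[c] * s for c, s in scores.items())

for name, scores in sorted(alternatives.items(),
                           key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.1f}")
```

The point of the exercise is not the arithmetic but forcing each option, including inaction, to be judged against the same criteria.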
Discussions of storage strategies are inherently technical in nature. Therefore, the audience for the architectural section of the strategy is the IT organization/managers. Diagrams of the architecture are useful here, but a connection must be made between the architectural decision and the service-level delivery, including any calculations that will help support improvement claims, such as uptime/recovery benefits.
In addition to hardware layout, the architecture discussion should include software strategy. Laying out a storage management taxonomy will help determine which components are necessary, which are not, and which may be added over time.
An implementation plan should map external events (e.g., business drivers such as new plants or new regulations, and emerging technologies) and the ITO’s associated actions against a third element: time. Actions are those items the IT organization must accomplish within the specified time frame. The implementation plan should be as detailed as is realistically possible; however, most of the detail will apply to the first year. As the timeline extends, future events become less certain, and planning is correspondingly less detailed.
Two other elements, discussed below, must also be addressed in the implementation plan.
No plan can be approved or considered complete without an estimate of costs. Cost estimation is a well-understood exercise that is not discussed in detail here. It is important to note, however, that alternatives must be compared over the planning horizon in order to make an objective choice.
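Comparing alternatives over the full planning horizon, rather than by first-year outlay alone, is easy to sketch as a present-value calculation. The cash flows and discount rate below are assumptions for illustration:

```python
# Present-value cost comparison over a planning horizon (illustrative figures).
def npv_cost(cash_flows, discount_rate=0.08):
    """Discount a list of annual costs (year 0 first) to present value."""
    return sum(cost / (1 + discount_rate) ** year
               for year, cost in enumerate(cash_flows))

# Alternative A: high up-front cost, low ongoing cost.
# Alternative B: low up-front cost, high ongoing cost.
alt_a = [500_000, 100_000, 100_000]
alt_b = [200_000, 300_000, 300_000]

print(f"A: ${npv_cost(alt_a):,.0f}")
print(f"B: ${npv_cost(alt_b):,.0f}")
```

In this sketch the option that looks more expensive in year one is cheaper over the horizon, which is exactly the distortion that horizon-level comparison removes.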
Summary of Assumptions and Contingencies
All planning exercises include assumptions that affect the various components of the plan. For example, it is generally assumed that the economy will remain steady, the business will stay healthy, vendors will deliver as promised, etc. Yet few of these assumptions are actually accurate over time. A solid plan identifies key assumptions and develops contingency plans for both good and bad outcomes. Although organizations tend to focus on negative events such as business downturns, management of unexpected business expansion can be just as challenging.
The evaluation plan should answer the key question, “What constitutes success?” Success may include subjective criteria as well as objective criteria, such as those described in the preceding “Current State” section. Moreover, evaluating the plan requires that the participants periodically sit down to review the plan, grade performance, and make adjustments. This type of review transforms storage planning from being a one-time event into an ongoing process.
Business Impact: Storage is a significant component of the IT budget. Developing a strategic planning process is the most effective mechanism for controlling costs while also meeting or exceeding organization requirements.
Bottom Line: A comprehensive storage strategy will provide an action plan for IT organizations as well as provide non-IT managers with a means of understanding the importance of storage for the organization and of knowing how to assess it.
META Group originally published this article on 17 September 2003.