In the face of a challenging economy, IT organizations are under unprecedented pressure to cut costs.
At the same time, they are tasked with the smooth operation of the data center, which grows more complex all the time as a result of continual consolidation and change, escalating virtualization, and exploding data volumes.
Understandably, organizations are reluctant to spend any more money than absolutely necessary, which has limited or delayed the acquisition of IT equipment that otherwise might have been routine. Forecasts vary widely for the depth and length of any recession, but it is clear that 2009/2010 budgets are uncertain at best.
For most IT organizations, the focus will be on optimizing existing systems and extending their useful life. The good news is that many organizations have 50 percent or more available storage capacity, which may accommodate their needs for the foreseeable future.
Plus, many technologies exist that can help optimize existing systems.
During periods of economic growth, organizations may be tempted to take the "quick fix" to storage management problems. The incremental cost of adding storage is relatively small and can be absorbed by the budget. Such a shortcut may speed project roll-out, but it also leads to underutilized storage. Many organizations operate at only 30 to 40 percent utilization; according to InfoPro, the average is 35 percent.
Accurate storage allocation is difficult because data growth rate information is incomplete or unavailable. Consequently, storage allocation does not correlate to consumption. New applications, with no historical trend data, receive storage allocation on a "best estimate" basis. If the allocated capacity is too high, then the excess capacity may languish unused for the life of the array.
Needless spending is the primary consequence of benign neglect. Having an array only 50 percent utilized is like paying twice as much for the storage needed. Idle capacity also consumes power, increases cooling costs, and unnecessarily consumes floor space and maintenance dollars with no return on the investment. Moreover, storage array software licenses are typically based on total (or raw) capacity, not utilized capacity, thereby needlessly driving up the cost of software.
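The cost penalty described above is simple arithmetic. A minimal sketch, using hypothetical prices (the $500-per-terabyte figure is an assumption for illustration, not a quoted market price):

```python
def effective_cost_per_tb(raw_cost_per_tb: float, utilization: float) -> float:
    """Cost per terabyte of data actually stored, given utilization (0-1)."""
    return raw_cost_per_tb / utilization

# A hypothetical array priced at $500 per raw TB:
print(effective_cost_per_tb(500, 0.50))  # 1000.0 -- twice the raw price at 50%
print(effective_cost_per_tb(500, 0.35))  # ~1428.6 at the 35% industry average
```

The same division applies to power, cooling, floor space, and capacity-based license fees: every cost that scales with raw capacity is inflated by the inverse of the utilization rate.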
Storage management: Turning things around
To make better use of storage resources, organizations can leverage storage management technologies. Storage resource management (SRM), for example, enables IT to navigate the storage environment and identify old or non-critical data that can be moved to less expensive storage. These tools can also be used to predict future capacity requirements.
Managing storage without an SRM tool is like going on a journey without a map. Having a clear plan and objective before taking action is the best assurance of rapid progress and success.
Storage managers should ask some questions before cost-cutting:
- What is the average utilization rate?
- What is the utilization rate by application?
- Which applications are growing fastest? Slowest?
SRM technology can help companies make this assessment by providing an enterprise-wide view of the storage environment, which helps identify problem areas and consolidation opportunities and create a prioritized list of solutions.
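The kind of report an SRM tool produces to answer those questions can be sketched in a few lines. The per-application figures below are hypothetical placeholders, not data from any real tool:

```python
# Hypothetical per-application capacity data: allocated vs. consumed GB
# and a measured monthly growth rate.
apps = [
    {"name": "ERP",    "allocated_gb": 2000, "used_gb": 900, "growth_pct": 4.0},
    {"name": "E-mail", "allocated_gb": 1500, "used_gb": 600, "growth_pct": 7.5},
    {"name": "CRM",    "allocated_gb": 1000, "used_gb": 250, "growth_pct": 1.0},
]

# Average utilization across the environment.
total_alloc = sum(a["allocated_gb"] for a in apps)
total_used = sum(a["used_gb"] for a in apps)
print(f"Average utilization: {100 * total_used / total_alloc:.0f}%")

# Utilization by application, ranked fastest-growing first.
for a in sorted(apps, key=lambda a: a["growth_pct"], reverse=True):
    util = 100 * a["used_gb"] / a["allocated_gb"]
    print(f"{a['name']}: {util:.0f}% utilized, {a['growth_pct']}%/month growth")
```

Ranking by growth rate is what turns raw capacity numbers into a priority list: fast-growing, well-utilized applications need capacity planning, while slow-growing, underutilized ones are reclamation candidates.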
In addition, thin provisioning can be used to improve storage capacity utilization. These tools allow space to be easily allocated to servers on a just-enough and just-in-time basis.
Thin provisioning can enable higher capacity utilization by allowing applications to share a pool of available storage that reduces the amount needed for any individual application. Storage is allocated to applications dynamically as needed, resulting in higher utilization.
Thin provisioning also eliminates the guesswork in provisioning new applications: rapidly growing applications can draw space as needed, while low-growth applications do not hoard empty space.
Furthermore, thin provisioning can reduce capital expenses because it requires less up-front storage than a "stove pipe" environment and permits "just in time" storage allocation.
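The mechanism can be illustrated with a toy model (assumed behavior, not any vendor's API): each volume advertises a large virtual size, but physical extents are drawn from a shared pool only when data is actually written.

```python
class ThinPool:
    """A toy shared pool that backs virtual volumes on demand."""

    def __init__(self, physical_gb: int):
        self.physical_gb = physical_gb
        self.consumed_gb = 0

    def write(self, volume: dict, gb: int) -> None:
        """Back `gb` of a volume's virtual space with real capacity on demand."""
        if self.consumed_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted -- buy disks now, not before")
        self.consumed_gb += gb
        volume["used_gb"] += gb

pool = ThinPool(physical_gb=1000)            # 1 TB of real disk
vol_a = {"virtual_gb": 2000, "used_gb": 0}   # both volumes *appear* as 2 TB
vol_b = {"virtual_gb": 2000, "used_gb": 0}

pool.write(vol_a, 300)
pool.write(vol_b, 150)
print(pool.consumed_gb)  # 450 -- only written data consumes physical capacity
```

Note that 4 TB of virtual capacity is presented to servers while only 1 TB of physical disk is purchased up front; capacity is added to the pool only when actual consumption approaches the physical limit.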
Data deduplication: Don't store it in the first place
Data deduplication is another technology that has gained wide acceptance as a tool to streamline the backup process. Deduplication eliminates duplicate data even when such data is unrelated, greatly reducing the multiplier effect of redundant copies.
For example, if a Microsoft PowerPoint presentation is stored on multiple file servers, deduplication ensures that only one copy is retained no matter how many full or incremental backups occur. Organizations may consider specialized appliances to provide backup-to-disk and deduplication functions. However, these appliances add complexity to the data center with more devices to manage, and they add capacity to the environment rather than using what already exists more efficiently.
Data archiving: Out with the old
Thin provisioning and data deduplication are strategies for reducing the growth rate and space consumption of new data or finding more efficient ways of storing it. These strategies must be combined with addressing unnecessary data storage in order to fully utilize existing assets. The largest container of unnecessary and obsolete data is unstructured data.
E-mail is the biggest unstructured information pain point today and a top target for data reduction via archiving. The Radicati Group estimates that the volume of e-mail will increase by 30 percent from 2006 to 2010.
Although storage costs continue to fall on a per-unit basis, the same e-mail is often stored many times: on the e-mail server, on the user's PC, in Microsoft Exchange or IBM Lotus Notes files, on file servers, in SharePoint, and in backups. Because of the excess storage consumed, power and cooling costs are commensurately higher.
Across industries and public sector organizations, IT professionals are being called on to address a common management concern around e-mail and unstructured information: resource management. Archiving technology acts as an online archive for older items, moving them from primary application storage according to company-defined policies. It also leverages single-instance storage and compression to further reduce the data footprint.
By controlling the size of the message store, the applications and servers hosting it remain focused on real-time transactions. The online archive also enables organizations to rationalize their storage resources and dedicate primary storage to dynamic and transactional data. Older, less frequently accessed content can be moved to a secondary or tertiary storage device, freeing money for more strategic purposes.
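A policy-driven archive sweep of the kind described above can be sketched as follows. The policy name, the 180-day threshold, and the message records are all hypothetical:

```python
from datetime import date, timedelta

POLICY_AGE_DAYS = 180  # assumed company-defined retention window

def sweep(primary: list, archive: list, today: date) -> None:
    """Move items older than the policy window off primary storage."""
    cutoff = today - timedelta(days=POLICY_AGE_DAYS)
    for item in list(primary):  # copy the list so removal is safe
        if item["received"] < cutoff:
            primary.remove(item)
            archive.append(item)

primary = [
    {"subject": "Q3 forecast", "received": date(2009, 1, 5)},
    {"subject": "Lunch?",      "received": date(2008, 3, 2)},
]
archive: list = []
sweep(primary, archive, today=date(2009, 2, 1))
print(len(primary), len(archive))  # 1 1
```

In practice such sweeps run on a schedule, and archived items are replaced with lightweight stubs so users can still retrieve them; the effect is that primary storage holds only the recent, active message set.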
During the current economic downturn, IT organizations must take steps to optimize existing assets. Storage managers no longer have the luxury of cutting management corners. However, the situation presents an opportunity to complete existing projects while implementing processes, procedures, and simple technologies to improve the storage cost profile.
Phil Goodwin is the senior manager of storage management, Symantec Corporation. This article first appeared in ZDNet Asia's sister site, TechRepublic.com.