Do you know what your data is really worth?
Although all of a business's information might be valuable, not all of it is mission-critical to operations. Yet it is not uncommon to find businesses with all of their application data residing on a single, costly high-end storage tier.
The surge of IT-related investment in recent years has caught the eye of cost-conscious executives. The mission of IT has always been to help organisations maximise the value of their information at the lowest total cost. Yet today, 70 percent of IT resources are spent maintaining and supporting the current infrastructure, and only 30 percent on innovating around the ideas and processes that can transform the business and drive new sources of revenue, profitability and customer satisfaction.
How can this trend be reversed? There are various strategies one can implement, but customers must not lose sight of the importance of data classification.
Data classification combines value judgments made by the business with analysis of the behaviour of the information itself to create a multi-dimensional view that is critical to effective management and action. In today's budget-conscious times, data classification is one of the most productive steps IT executives can take to manage their business information.
A new way to manage data
While data classification is a critical step on the path to effective information lifecycle management, it is not just a means to an end. Data classification is a fundamental process that drives value throughout an organisation by enabling the alignment of information to best address business needs while using the lowest cost storage for the job.
Initially, the process of data classification was used to help companies better address business continuity and disaster recovery. With volumes of information being created, organisations wanted to be sure the right set and copy of data could be retrieved if a disaster occurred. This was accomplished through the development of a framework for data policy, which has since become known as data classification.
The idea of developing data policies is not new. Its roots can be traced back to the mainframe objectives of years ago. Back then, hierarchical storage management (HSM) was the precursor to what we call information lifecycle management today. Nowadays, the rudimentary data policies that supported HSM have given way to a broader expression of data classification that must take into consideration today's complex, open IT environments.
Nuts and bolts of data classification
Data classification is designed to give businesses a full understanding of the value of data for seamless operations. To be successful, three key factors must be addressed: value, time, and cost.
Data classification aligns data value with business drivers such as performance and availability, regulatory compliance, information protection, budgets, and new directions for business growth.
The time continuum, the period from data creation to disposal, must also be considered, along with the varying degrees of value associated with the data at each stage of its lifecycle. Last but not least, data classification involves mapping data to a logical and physical architecture; cost comes into consideration in relation to the architecture chosen to support the data.
Two key exercises, which each take four to six weeks on average to complete, are fundamental to data classification. The first, application alignment, is the process by which businesses identify applications, understand how they meet specific business requirements, and then determine if they are on the right storage platform.
Done with customised classification modelling templates and tools from a qualified consultant, application alignment is all about asking the right questions, such as: how often does the information need to be accessed? How much is budgeted for a new architecture? Posed to line-of-business representatives and other stakeholders, these questions yield answers that are essential in defining the specific rules that govern the alignment of applications to appropriate storage tiers.
Rules created for specific tiers of storage, such as defined levels of performance or recovery times, can also be used to build a service catalogue. A powerful strategic tool, a service catalogue aids IT in securing sponsorship to carry out specific storage objectives. For example, if the information management needs of a line of business point to tier-one storage but there are budget constraints, IT management can use the service catalogue to negotiate the next option, which might be a less costly tier with lower service level objectives.
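The mapping described above, matching an application's requirements against tier rules and then choosing the cheapest tier that still meets them, can be sketched in a few lines of Python. The tier names, service level figures and costs here are illustrative assumptions, not figures from the article:

```python
# Sketch of a service catalogue: each tier pairs service-level objectives
# (access latency, recovery time) with a relative cost per gigabyte.
# All names and numbers are hypothetical examples.
TIERS = [
    # (name, max_latency_ms, recovery_time_hours, cost_per_gb)
    ("tier-1", 5, 1, 10.0),
    ("tier-2", 20, 4, 4.0),
    ("tier-3", 100, 24, 1.0),
]

def cheapest_tier(max_latency_ms, max_recovery_hours):
    """Return the lowest-cost tier that still meets the requirements."""
    candidates = [
        t for t in TIERS
        if t[1] <= max_latency_ms and t[2] <= max_recovery_hours
    ]
    if not candidates:
        return None  # no tier satisfies the service level objectives
    return min(candidates, key=lambda t: t[3])

# An application needing sub-25 ms access and same-day recovery does not
# require tier-1: the catalogue points IT to the cheaper tier-2 instead.
print(cheapest_tier(25, 8))  # -> ('tier-2', 20, 4, 4.0)
```

The point of the sketch is the negotiation the article describes: when the strictest tier is unaffordable, relaxing the service level objectives passed in makes a cheaper tier eligible.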
Interviewing the right people is also crucial to successful application alignment. Using formalised qualification criteria is the best way to target the right accounts and uncover the most appropriate contacts to work with in those accounts. A meeting of minds with senior-level management is important for capturing the broadest perspectives that must be considered. And don't overlook the chief financial officer in information-gathering activities. Industry research by some analysts has indicated that 40 percent of the time the CFO is the one making or signing off on an IT decision.
In addition, with the growing influx of new regulations, many companies are now employing compliance officers. Representing another valuable source of input, they should also be included in data classification fact-gathering exercises. This is particularly important in light of the fact that compliance is fast becoming a key data classification driver.
A second, equally important exercise in the data classification process is assigning specific policies to the data created by applications. An example might be a Microsoft Exchange e-mail application that requires all the horsepower and online accessibility of high-end, tier-one storage. However, if information generated by the application is rarely accessed after a period of time but must be retained for regulatory compliance, a policy should be created to migrate that data to a less costly archived storage solution.
This not only saves money but also frees up high-end storage resources for growing business-critical needs. Such policies are also the conduit for tapping into future information lifecycle management capabilities.
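The kind of age-based policy just described can be expressed very simply: if an item has not been accessed within some retention window, it belongs on the archive tier. The 90-day threshold and tier names below are assumptions for illustration only:

```python
from datetime import datetime, timedelta

# Hypothetical policy: data untouched for 90 days is migrated from
# high-end storage to a cheaper archive tier (threshold is illustrative).
ARCHIVE_AFTER = timedelta(days=90)

def placement(last_accessed, now):
    """Decide where a data item belongs based on its last access time."""
    if now - last_accessed > ARCHIVE_AFTER:
        return "archive"
    return "tier-1"

now = datetime(2024, 6, 1)
print(placement(datetime(2024, 5, 20), now))  # recently used -> tier-1
print(placement(datetime(2024, 1, 1), now))   # stale -> archive
```

In practice such rules would be evaluated continuously against access metadata, which is what makes the policy a stepping stone to the automated data movement of information lifecycle management.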
Customers face key challenges in managing information due to the relentless growth of data, the increasing strategic importance of information, and a newly recognised fundamental characteristic of information: its fluctuating value. Data classification helps businesses address these challenges directly by aligning applications to appropriate storage tiers and defining rules and policies for optimum data and resource management.
A key enabler for furthering IT initiatives, data classification also brings businesses a step closer to information lifecycle management readiness. This is the ultimate goal: a storage infrastructure aware of both applications and the lifecycle of information, for the automated and intelligent movement of data across all operations.
Mark Heers is national product marketing manager for EMC Australia and New Zealand.
If you would like to become a ZDNet Australia guest columnist, write in to Fran Foo, Editor of Insight, at firstname.lastname@example.org.