
How firms justify storage spend

Adding more disks may never have been cheaper, but the trick is working out the true cost of that storage
Written by Manek Dubash, Contributor

Storage may seem cheap, but purchase price is a small part of the cost. That is why IT managers are having to find new ways of measuring the economics of storage, says Manek Dubash.

You might think there has never been a better time to buy storage. But if you are an enterprise IT manager, your budget is flat, and you are being asked to do more with less.

Yet demand for storage has never been greater. Driven by the huge growth in the use and storage of video, and by legislative requirements to retain ever-growing mountains of data, that demand continues to increase. The precautionary ethos often becomes 'store everything forever, just in case'.

Constant price falls encourage this attitude: the raw cost of storing a terabyte of data on rotating media is now well below £100. Increasingly, too, the potential of solid-state disks (SSDs) is starting to be realised, although their cost per gigabyte seems destined to remain well above that of rotating media. So can new technology solve the problem of never-ending demand?

Acquisition costs
So far, so familiar. But buying more storage is increasingly viewed as unsustainable at a time of flat or falling IT budgets, because hardware acquisition costs are only the visible tip of an iceberg of storage spending.

According to IDC analyst Nick Sundby, managing the rapid growth of data while keeping it secure, protected, compliant and resilient is a challenging task that is subject to a complex set of changing user priorities, service-level agreement requirements and regulatory directives.

"Add in the budget and staff restrictions that are now commonly seen and the scale of the challenge is even greater. Faced with these restrictions, many customers can no longer buy storage capacity 'by the yard' to meet the company's growing demands," Sundby says.

So how do you make sense of today's storage market, when the technology is changing rapidly, when value-for-money equations seem founded on shifting sands, and when the problem of managing demand for storage seems destined never to be resolved?

Econometric approach
Sundby says enterprises need to adopt an econometric approach to storage management. This technique combines the return on investment (ROI) and total cost of ownership (TCO) approaches to IT purchasing.

However, Sundby says if such approaches were simple to execute, they would be widely used to evaluate proposed storage investments. "The reality is that for many companies, the IT manager may be adept at evaluating new hardware and software technologies, but less confident in presenting a detailed financial model of the solution," he says.

In other words, IT managers need to find new ways of measuring the economics of storage in the knowledge that purchase price constitutes only 20 percent of the total cost of ownership. Those other costs include maintenance, management, security, and power and cooling.
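The 20 percent rule of thumb above can be turned into a rough total-cost picture. The sketch below uses entirely hypothetical figures and assumed cost shares, purely to show how acquisition price scales up to a full TCO estimate:

```python
# Illustrative TCO breakdown, assuming the rule of thumb above:
# purchase price is roughly 20% of total cost of ownership.
# All figures and cost shares are hypothetical.

purchase_price = 20_000  # hardware acquisition, in pounds

# If purchase is ~20% of TCO, the other 80% is spread across
# ongoing costs over the asset's life (assumed split, for illustration).
assumed_shares = {
    "maintenance": 0.25,
    "management": 0.30,
    "security": 0.10,
    "power_and_cooling": 0.15,
}

total_cost = purchase_price / 0.20  # scale up from the 20% share
for item, share in assumed_shares.items():
    print(f"{item}: £{total_cost * share:,.0f}")
print(f"total cost of ownership: £{total_cost:,.0f}")
```

On these assumptions, a £20,000 array really costs £100,000 to own, which is the gap the econometric approach is meant to expose.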

As an example of how IT managers should be thinking, Sundby points to the approach of Hitachi Data Systems. "HDS has evolved an approach to storage economics based on some hundreds of TCO and ROI analysis studies that it has conducted with high-end storage customers around the world.

"[This approach] allows stakeholders to consider the financial benefits of a multi-tiered, thinly provisioned and virtualised storage architecture, which hitherto would have been a significant undertaking for some users."

David Merrill, chief storage economist at HDS, says the key objective is to measure the real cost of storage, and "the simplest metric is unit cost per terabyte-year".

"You need to find your own sweet spot for price, performance and capacity," he says. "Each technology has a sweet spot: virtualisation, thin provisioning, deduping. And the determination of that is to do with the age of the business, the application, geography and so on."
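Merrill's "unit cost per terabyte-year" metric is simple arithmetic: all-in cost over the period, divided by capacity stored times years held. A minimal sketch, with hypothetical figures:

```python
# Sketch of the "unit cost per terabyte-year" metric described above.
# Figures are hypothetical, for illustration only.

def cost_per_terabyte_year(total_cost: float, terabytes: float, years: float) -> float:
    """Cost of keeping one terabyte of data for one year."""
    return total_cost / (terabytes * years)

# e.g. £100,000 all-in cost for a 50 TB array kept for four years
print(cost_per_terabyte_year(100_000, 50, 4))  # 500.0 per TB-year
```

The same formula lets two very different proposals, say a cheap array with high running costs and a dearer one with low ones, be compared on one number.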

Carbon footprint
As an example, Merrill cites one very large software company that was interested in metric tonnes per terabyte. "What they cared about was their carbon footprint," he says.

He applies the econometric approach to the question of whether organisations should consider SSDs. "The best metric for SSDs is price per I/O operation per second, per terabyte. It's better on physical disks but the gap is closing. But I wouldn't recommend SSDs to a web-hosting company or those doing credit-card processing, for example."
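That metric, price per I/O operation per second per terabyte, can also be computed directly. The device prices and IOPS figures below are made up, just to show the shape of the comparison:

```python
# Comparing media on the metric quoted above: price per I/O
# operation per second, per terabyte. All figures are hypothetical.

def price_per_iops_per_tb(price: float, iops: float, terabytes: float) -> float:
    return price / (iops * terabytes)

# Hypothetical 1 TB devices
ssd = price_per_iops_per_tb(price=400, iops=50_000, terabytes=1)
hdd = price_per_iops_per_tb(price=50, iops=150, terabytes=1)

print(f"SSD: £{ssd:.4f} per IOPS per TB")
print(f"HDD: £{hdd:.4f} per IOPS per TB")
```

Which medium wins depends entirely on the figures you plug in, which is Merrill's point: the right answer varies by workload, not by technology alone.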

Sundby sees growing pressure to adopt this approach, driven by closer scrutiny of energy costs, of capacity lost to RAID parity data, of overprovisioning and of white space in databases, and by operating expenditure forming an ever-larger proportion of the TCO.

However, there are some technologies that can act as a virtual Band-Aid, according to Merrill. Among them is deduplication of backup data, which saves shared data only once. For example, if you are backing up 100 PCs running Windows, your backups have 100 copies of exactly the same code. Deduping means only one copy is saved to disk.
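The dedup idea, keeping identical data once however many machines hold it, can be sketched with content hashing. This toy store is a simplified model of the principle, not of any real backup product:

```python
import hashlib

# Toy content-addressed store illustrating backup deduplication:
# identical data from many machines is written to disk only once.
class DedupStore:
    def __init__(self):
        self.blocks = {}  # content hash -> stored data

    def save(self, data: bytes) -> str:
        key = hashlib.sha256(data).hexdigest()
        self.blocks.setdefault(key, data)  # store only if not seen before
        return key  # caller keeps a reference, not a copy

store = DedupStore()
os_file = b"identical operating-system file on every PC"

# Back up the "same file" from 100 PCs
refs = [store.save(os_file) for _ in range(100)]

print(len(refs))          # 100 backup references...
print(len(store.blocks))  # ...but only 1 stored copy
```

Real products dedupe at block or chunk level rather than whole files, but the saving comes from the same mechanism: matching hashes mean the data is already on disk.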

Deduplication pitfalls
This approach is claimed by some to be capable of up to 500:1 compression, although others are more conservative: "There are, of course, some pitfalls [to deduping] but there are huge cost and space benefits to be had," says Peter Williams of Bloor Research. Williams reckons some of the wilder space-saving claims may be true, but depend on the methodology you use.

Similarly, reclaiming white space in databases can result in a 40 percent capacity clawback in big systems, according to Merrill.

But he says the big win is success in changing consumer behaviour. Rather than buying more storage, IT managers need to ensure that users are asked if they really need more storage, and to introduce a charge-back system that covers all the costs of additional storage, not just the price of a hard disk.
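A charge-back scheme of the kind described above amounts to billing users the fully loaded cost of capacity rather than the disk price. A minimal sketch, where the overhead multiplier is a hypothetical figure derived from the purchase-price-as-20-percent-of-TCO rule mentioned earlier:

```python
# Toy charge-back calculation: bill users the fully loaded cost of
# extra capacity, not just the raw disk price. The multiplier is a
# hypothetical figure based on purchase being ~20% of TCO.

DISK_PRICE_PER_TB = 100   # raw hardware cost, in pounds
OVERHEAD_MULTIPLIER = 5   # 20% purchase share implies 5x loaded cost

def chargeback(terabytes_requested: float) -> float:
    """Fully loaded charge for the requested extra capacity."""
    return terabytes_requested * DISK_PRICE_PER_TB * OVERHEAD_MULTIPLIER

# A user asking for 10 TB sees the true cost, not the £1,000 disk price
print(f"£{chargeback(10):,.0f}")  # £5,000
```

Confronting users with the loaded figure, rather than the hardware price, is what changes the "just buy more" behaviour the article describes.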
