Cloud archives start to melt

Who would have thought it? Archiving to the cloud sounds like one of the dullest applications going but a fire has suddenly been lit underneath it.

As Jack Clark highlights in his blog, Amazon's new Glacier service allows you to archive data to its datacentres for a measly $0.01 per GB per month. He also shows how it can cost 10 times as much to use Glacier as it would to run your own tape library and its associated hardware and software.

Quantum supplied the figures for the cost of hardware rather than a service and, of course, Quantum has a large footprint in the tape library market.

Additionally, as some commenters point out, the volume of data on which this calculation is based, 10PB over five years, is huge. Most companies, small to medium-sized enterprises in particular, are more likely to want to archive off maybe 10 to 50TB: two to three orders of magnitude less. At this level, Glacier is likely to work out cheaper. Much cheaper.
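Taking Amazon's quoted $0.01 per GB per month at face value, a quick back-of-the-envelope sketch shows why the volume matters. Note this covers storage fees only; Glacier's retrieval and data-transfer charges, and the cost of any on-premise kit, are deliberately left out.

```python
# Back-of-the-envelope Glacier storage cost at Amazon's quoted rate.
# Storage fees only: retrieval and transfer charges are ignored.

PRICE_PER_GB_MONTH = 0.01  # USD, Amazon's quoted Glacier rate


def storage_cost(tb: float, months: int) -> float:
    """Storage-only cost in USD for `tb` terabytes held for `months` months.

    Assumes 1TB = 1,000GB for simplicity.
    """
    return tb * 1000 * PRICE_PER_GB_MONTH * months


# An SME archiving 50TB for five years...
print(f"50TB, 5 years:  ${storage_cost(50, 60):,.0f}")
# ...versus the 10PB, five-year scenario behind Quantum's comparison.
print(f"10PB, 5 years: ${storage_cost(10_000, 60):,.0f}")
```

At 50TB the five-year storage bill is around $30,000; at 10PB it is in the millions, which is the territory where owning a tape library starts to look attractive.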

I could ask Quantum to do the maths for me for this volume of data but I see that Quantum has just launched its own Glacier-like service in the form of Quantum Q-Cloud.

Q-Cloud's pricing is identical to that of Amazon Glacier when storing above 72TB but, instead of sending your archived data directly to the datacentre, you stage it on a Quantum DXi deduplicating disk appliance, which deduplicates it and synchronises it to a Quantum datacentre in the background. This has the advantage of allowing fast local retrievals, as long as the data is recent enough to still be held on the appliance.

This is brave stuff. You do need additional hardware and software components, available from Quantum naturally, but you can also get your data back more quickly. Amazon quotes a retrieval time of several hours, while Quantum reckons you can have your data back much faster. Looks like Quantum is hedging its bets.

These won't be the last cloud archiving initiatives we see. With Amazon's entry, the bar has been raised, and it remains only to wait and see who else tries to leap over it.