Managing data at Melbourne IT

Summary: Managing data can be difficult, especially if you have almost 500 terabytes of storage and spend $10,000 a month on backup tapes. This case study looks at how Melbourne IT, one of Australia's biggest web hosting companies, handles storage.


The Web 2.0 problem
Going back a year or so, Melbourne IT chief architect Glenn Gore had a problem. His company's datacentre demands were growing exponentially: 120 per cent year-on-year. Melbourne IT's "virtualisation farms", groups of virtual servers used by customers, were growing even faster: 400 per cent year-on-year.

The cause? Web 2.0 applications. "If you look at the use of IT, particularly online applications, there has been a maturing in the way people are using [the web], particularly for promoting your business online and having your customers interact [with] businesses through the internet," Gore tells ZDNet.com.au in a recent interview.

Glenn Gore, Chief Architect, Melbourne IT (Credit: Melbourne IT)

This meant serious changes behind the scenes. "What that meant at the back end of suppliers supporting those sites was that the amount of processing power and storage required for that went up exponentially," says Gore.

With 660 staff primarily based in Australia, Melbourne IT has a long history in web hosting. Beginning with registering ".com.au" domains in 1996, and gaining ICANN accreditation to provide registrar services in the .com, .net and .org domains in 1999, Melbourne IT had grown to host six million domains by 2008.

However, the company still found itself surprised by the onslaught of storage and processing requirements that came with Web 2.0 applications.

"Every year we are quadrupling the amount of processing power we have on tap, and that's because of processing the number of advanced internet applications we have," Gore says.

Big storage means big problems
Gore's first approach to these problems was to virtualise across both storage infrastructure and processor farms, and learn to dynamically move load depending on customer demands.

"Those farms are constantly evolving. Every day we are adding servers and removing servers. The system itself is moving load around, depending on what's going to get the best performance for customers," Gore says.

This meant a system that could not only handle redundancy and failures, but also make autonomous decisions.

Gore says the systems are "actually making automated decisions based on business rules, how it deals with things like what happens when a server runs out of memory or when the CPU becomes a bottleneck". In short, the systems will work out how to move customers to different infrastructure, based on business rules and in real time with no outages.
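The kind of business-rule-driven migration Gore describes can be sketched in outline. This is an illustrative example only, not Melbourne IT's actual system: the thresholds, class names and rule logic below are assumptions standing in for the "business rules" mentioned above, covering the two triggers Gore names (a server running out of memory, or the CPU becoming a bottleneck).

```python
# Illustrative sketch only -- hypothetical thresholds and names, not
# Melbourne IT's actual rule engine.
from dataclasses import dataclass

@dataclass
class HostMetrics:
    free_memory_mb: int
    cpu_utilisation: float  # fraction of capacity, 0.0 to 1.0

# Hypothetical business-rule thresholds.
MIN_FREE_MEMORY_MB = 512
MAX_CPU_UTILISATION = 0.85

def should_migrate(metrics: HostMetrics) -> bool:
    """Rule check: move guests off a host that is out of memory
    or whose CPU has become a bottleneck."""
    return (metrics.free_memory_mb < MIN_FREE_MEMORY_MB
            or metrics.cpu_utilisation > MAX_CPU_UTILISATION)

def pick_target(candidates: list[HostMetrics]) -> HostMetrics:
    """Choose the least-loaded healthy host as the migration target."""
    healthy = [h for h in candidates if not should_migrate(h)]
    return min(healthy, key=lambda h: h.cpu_utilisation)
```

In a real system the migration itself would be a live move handled by the virtualisation layer, so that, as Gore says, customers shift to different infrastructure with no outages.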

To create such a system, Melbourne IT needed to define how to best utilise its hardware to meet its customer needs. "We use off-the-shelf software for the management component," says Gore. "What we did create ourselves is the IT and the business rules. What we have really done is merge the software management infrastructure with the physical hardware management infrastructure."

Server virtualisation was not the hard part of building this kind of intelligence. "Everyone knows how to do server virtualisation, VMware has been banging that drum for a while and Microsoft has entered that space," Gore says.


However, storage virtualisation proved to be more of a challenge. "If you're using a big chunk of storage, and you need to move to something more powerful and newer, it's normally been a big process requiring down time," the IT architect says. "It's also quite risky. Basically it ends up becoming so hard to do you don't end up doing it."

Melbourne IT has a lot of storage. Gore estimates its total current storage at "just under half a petabyte," or roughly 500 terabytes. Most of this storage was acquired recently.

"We made a very large storage purchase last year, of 330 terabytes in one hit," says Gore. "We are getting to the point where we now are actively looking at purchasing more storage because we have exhausted that purchase."

Gore also faced the issue that customers' storage demands varied considerably throughout the year. "We could have customers that have sports activities, so they are only used for certain parts of the year, or only when games are on. We have financial institutions that get busy at month's end," he says.

So big storage meant big problems, but storage tiers allowed Melbourne IT to migrate dynamically when customers hit peak demand.

"We invested in storage virtualisation, so now we are able to monitor the amount of storage being offered to a customer, and based on that work out whether we should move that storage dynamically to different performance tiers," the IT architect says.

"We can move storage up the tier, and we can also move it back down the tier. That's really important to us."
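The tiering policy Gore outlines, monitoring a customer's storage demand and moving it up or down performance tiers, might look something like the following sketch. The tier names and IOPS thresholds are assumptions for illustration; the article does not describe the actual tiers or rules Melbourne IT uses.

```python
# Illustrative sketch -- assumed tier names and thresholds, not
# Melbourne IT's actual tiering policy.

TIERS = ["sata", "sas", "ssd"]  # cheapest/slowest to fastest (assumed)

# Hypothetical thresholds on average I/O operations per second.
PROMOTE_IOPS = 1000
DEMOTE_IOPS = 100

def next_tier(current: str, avg_iops: float) -> str:
    """Promote a volume when demand is high, demote it when demand
    subsides -- "up the tier" and "back down the tier"."""
    i = TIERS.index(current)
    if avg_iops > PROMOTE_IOPS and i < len(TIERS) - 1:
        return TIERS[i + 1]
    if avg_iops < DEMOTE_IOPS and i > 0:
        return TIERS[i - 1]
    return current
```

The point of the demote branch is the economics Gore hints at: a seasonal customer, say a sports site, only pays for fast storage while games are on, and drifts back down to cheaper tiers for the rest of the year.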

Battle of the titans
With a view to a long-term solution, Melbourne IT recently went out to tender for a storage vendor that could meet the needs of a massive IT environment. "We narrowed it down to three vendors: IBM, EMC and Network Appliance," Gore says.


Network Appliance was ruled out first as too expensive, though Gore says not everyone will find that. "It was just true for the characteristics of what we were buying," he says. This left two serious contenders. "To be honest it was a really close decision," he adds.

"I think ... if you talk just about the raw storage capabilities, EMC storage is probably better than the IBM storage. But you have to take that holistic view of, 'how do all the components of my storage fit together?'"

This led Gore and Melbourne IT to its final decision.

"We were already an IBM storage shop and that really put them across the line," the IT architect says. "We also thought that the IBM storage technology had been out to market a lot longer, they had a lot stronger customer base, so we thought it was the right solution for us."

"The IBM storage product was a lot more mature, it has been out to market for years compared to months ... you're only really as good as that weakest link."

