Organisations that are slow in adopting virtualisation should stop messing around and just do it, according to the University of Melbourne's technology services team leader Drew Poynton.
Poynton and his team of five cover all Microsoft services used internally, including Active Directory, Microsoft Exchange, and the university's Windows Server fleet, which 110,000 staff and students depend upon. He also looks after the university's VMware-based virtual environment.
Melbourne University spent several years virtualising parts of its IT infrastructure. It has two on-site datacentres, with 10 physical servers running 600 virtual machines, and the Microsoft team has virtualised 75 percent of its IT environment to date. So far, virtualisation has saved the team money and put existing hardware to better use.
Poynton believes that organisations should adopt virtualisation without hesitation.
"Some people take it too slow and put it in thinking, 'OK, we'll play with it and test it'," he told ZDNet at TechEd 2012. "It's proven technology now — three to four years ago people were still a bit iffy about it, but now it's proven and nearly everything has some form of virtualisation in it."
The key is planning before deployment of a virtualised environment, Poynton said. This includes ensuring that the staff members using it understand the technology.
"These days, it's quite simple to set up the environment, and it's easy to fall into the traps if you don't know exactly what you are doing, so you can hit bottlenecks quite quickly," he said. "So as long as you set up the best practice and have people that are trained well enough in the environment, there's no need for a long testing period.
"Just get in there and give it a go."
Growing pains of virtualisation for Melbourne uni
Melbourne University's Microsoft services team jumped head first into virtualisation, and has learned a lot since moving to a VMware environment three years ago.
As more people began using the virtualised platform, the team found itself hosting tier-one and tier-two applications, and reached a point where it could not afford an outage in the virtual environment. VMware's built-in alerting, however, was inadequate for detecting and reporting the critical issues that could cause outages.
"The built-in tools in vCenter were not very configurable, and weren't really set up like a dashboard, so it can be just email alerts," Poynton said. "It's not very good at doing metrics and working out things over long periods of time, so we started looking at monitoring tools."
He brought in Veeam's monitoring pack to fix the problem, and has been running it for a few years.
Another challenge of operating a virtualised environment actually has less to do with technology, and more to do with the mentality of the staff members running applications on it.
"They still ask for large virtual machines, and that's something you have to try to get them to rethink," Poynton said. Application staff members were gradually educated about this, and the situation has improved.
Many application vendors also previously didn't provide specifications for running their software in virtual environments, he said.
"But I think they've got to a point where they have specs for virtual environments and are actually differentiating between physical and virtual deployment," Poynton said.
Moving forward, the Microsoft services team will be testing out and comparing the new Microsoft Hyper-V technology with the existing VMware product.
"The new features in Hyper-V 2012 match those of VMware, and my team is for Microsoft technology, so it makes sense to go down that path," Poynton said. "There are also some features not available in VMware, like dynamic memory control, which would be advantageous for us to pursue."
Spandas Lui attended TechEd 2012 as a guest of Microsoft.
Updated September 5, 5:41 p.m. (12:41 a.m. PT): Corrected number of physical servers. Clarified Microsoft services team's involvement with Hyper-V.