Avoid big bang Vista upgrades — Gartner

Summary: Contrary to conventional opinion, the best approach to Vista migration may be temporarily supporting more than one version of Windows in your organisation

Rather than trying to migrate all your users in one go, a phased approach to Microsoft's new OS may actually prove more cost effective.

That's the view of Gartner principal research analyst Annette Jump. Speaking at the Gartner Midsize Enterprise Summit in Paris, Jump told the audience of IT professionals that the widely held view that supporting multiple operating systems is costly and complex is not an absolute truth.

"How do you move to Vista? Generally the conventional wisdom suggests supporting all users on one operating system is cheaper than multiple systems, but that doesn't include the cost of getting to one operating system," she said. "For around 60 percent of large enterprises, managed diversity makes sense."

Jump outlined a staggered migration path for a typical organisation that was running 50 percent Windows 2000 and 50 percent XP in 2004. By 2005 the company should have migrated to 25 percent Windows 2000 and 75 percent Windows XP. 2006 should see the company on 100 percent XP, which should continue until the end of 2007. Then by 2008, the company should be looking to run around 75 percent XP and 25 percent Vista.
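
Purely as an illustration of how such a plan might be tracked (the data structure below is my own framing, not something Gartner proposed), the schedule reduces to a simple year-by-year OS mix that can be sanity-checked in a few lines of Python:

```python
# Illustrative only: the article's example OS mix for a typical organisation,
# expressed as a year -> {OS: percentage share} plan.
migration_plan = {
    2004: {"Windows 2000": 50, "Windows XP": 50},
    2005: {"Windows 2000": 25, "Windows XP": 75},
    2006: {"Windows XP": 100},
    2007: {"Windows XP": 100},
    2008: {"Windows XP": 75, "Windows Vista": 25},
}

# Check that each year's shares add up to 100 percent before planning around them.
for year, mix in sorted(migration_plan.items()):
    assert sum(mix.values()) == 100, f"{year}: shares must total 100"
    print(year, mix)
```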

Gartner claims that operating systems take around 12 to 18 months to mature, so with Vista timetabled for release in January 2007, the analyst group is expecting mainstream adoption to start in the middle of 2008.

However, Jump warned that Vista could slip even further back than the January 2007 shipping date — to March or even later — as Microsoft is already committed to missing the vital Christmas sweet spot when many consumers choose to buy new PCs. The exact release date is largely irrelevant to most medium and large organisations, which will probably wait at least a year before adopting the OS.

When and how companies choose to migrate to Vista also depends on what operating systems they are currently using, Jump said. "Migration depends on where you are in terms of your existing OS. If you're on Windows 2000 then you have got three to four years to migrate, and so should start testing now," she said.

"But if you're on XP, then you can take it much more leisurely; you can wait till Vista ships and then migrate through hardware attrition or through a big bang upgrade or even wait for point release in mid-2008 which should have WinFS." Windows Future Storage (WinFS) is Microsoft's next-generation file system.

Jump also raised the issue of hardware compatibility, and claimed that only machines bought in 2007 will probably be around long enough, or have enough of their "useful life" left, to run Vista, based on a three-year life cycle.

"The specs show that you need at least 512MB of RAM and a modern processor, so most machines sold now will be able to run Vista. The bigger question is whether those machines will actually ever see Vista, if you're looking at mid 2008 to adopt it," Jump said.

Companies should consider the migration process as beginning not when a machine is physically placed on a user's desktop, but when the inventory of existing machines begins, to discover which applications are in use and how personal settings can be moved to the new machines. "Migration is a very painful but important process," said Jump.
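
As a hedged sketch of the application-discovery half of that inventory step, assuming a Windows machine where the standard Uninstall registry key is readable (the helper name and output are illustrative, not part of any Gartner methodology), installed applications could be enumerated like this:

```python
# Sketch: list installed applications from the standard Windows Uninstall
# registry key, as a starting point for a pre-migration inventory.
# Windows-only; requires rights to read HKEY_LOCAL_MACHINE.
import winreg

UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

def installed_applications():
    apps = []
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY) as root:
        for i in range(winreg.QueryInfoKey(root)[0]):  # number of subkeys
            with winreg.OpenKey(root, winreg.EnumKey(root, i)) as subkey:
                try:
                    name, _ = winreg.QueryValueEx(subkey, "DisplayName")
                    apps.append(name)
                except FileNotFoundError:
                    pass  # entries without a DisplayName are not user-visible applications
    return sorted(apps)

if __name__ == "__main__":
    for app in installed_applications():
        print(app)
```

On 64-bit systems a second pass over the Wow6432Node branch of the same key would be needed to catch 32-bit applications; capturing personal settings is a separate job for a tool such as Microsoft's User State Migration Tool.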

Desktop Linux is still only used by a small minority of companies, and businesses should make sure they evaluate it thoroughly in terms of the cost associated with a migration, said Jump. "Despite the hype around Linux it remains niche and we see no real increase in volume over the next 12 months," she said.


Andrew Donoghue



Talkback

1 comment
  • By now most IT departments should have learned that the "one size fits all" approach doesn't work overall. Exceptions to the rule aren't an exception. They underline flexibility, the ability to adjust, customer awareness, business process enabling, a pro-active attitude, etc, etc.
    Hype words like centralization, consolidation, virtualization, outsourcing, outtasking, etc. make for big fat bonuses for some, and yet another way of doing things added to the mix of things to maintain for the rest. And more often than not, one failure or disaster (and/or a couple of 'compromises' to 'keep things moving ahead') some time later (usually in something that was 'overlooked' because it wasn't that easy to centralize, consolidate or virtualize) vaporizes all the cost savings promised on paper beforehand.

    Exceptions are the rule. Department A can mostly suffice with locked-down, terminal-like machines. Department B needs all sorts of industrial equipment hooked up to various local ports. Some managers require mobile equipment. Some key users require software that's written so dirty you just have to run it locally to protect the rest. Others work from home a lot, so their home PCs need babysitting. Etc, etc. Six months later things are turned around. And six months after that, new ways of doing the same things will have been introduced. Six months after that, how and where things are done have mixed up again. And so on, and so on.
    Strategic managers focus more on the overall and later than on the over-focused now. And they keep their options open. Better safe than sorry.

    So since IT departments should already be able to deal efficiently with a whole range of different situations and configurations, along with maintaining general organizational policies and the exceptions to them, Gartner isn't being very realistic in its advice. Mainly because adding yet another platform in various configurations to the mix is something that is bound to happen anyway, so IT departments should be prepared for it. And because of that, adding something other than just Microsoft shouldn't be that difficult either.
    If that is somehow a problem, however, then your biggest worry really is whether your IT department (management, staff and tech) is really up to the task ahead in efficient and effective ways, and most certainly in (disaster) situations when all kinds of underlying faults, issues and problems suddenly start showing their ugly heads (you don't want to be beaten some more when you're already down). And thus whether it's perhaps time to make organizational changes in positions that matter because, frankly, they (and you) might not have a real clue about IT overall and just do as the salesman orders. Which, you can bet, will be something completely different every so often and highlighted in many management magazines (the 'make the buzz, get the bizz' tactic). But is that more essential to the core business than getting the basics in order (which includes being able to deal with damage and change)? And once you have the basics in order, would you sacrifice that to move the latest hype in fast, knowing that the next hype is waiting around the corner? Nah, most likely not.
    anonymous