Linux: Making the change

The idea of getting a robust, scalable operating system for free hasn't clicked with many enterprises -- until now.
Written by David Braue, Contributor

Moving to Linux used to be a big deal. Sure, it was cheaper, more reliable, and more flexible -- but who did you turn to when things went wrong? In an enterprise world that had grown up with the idea that Unix needed to be complex and expensive -- and that Windows was a quick-and-dirty Plan B -- the idea of getting a robust, scalable operating system for free just didn't click for many years.

Fortunately for Linux, the support structure that gradually built up around this rogue operating system -- now the favourite son of one-time Unix diehards HP, IBM, Novell and Sun Microsystems -- has dispelled that fear. Supported by integrators and buoyed by ever-improving technology, all kinds of organisations are happily using Linux for a range of mission-critical services.

The enduring popularity of Linux is reflected in its market share statistics: IDC's latest Worldwide Quarterly Server Tracker figures showed Linux servers accounting for 13.6 percent of all server revenue, for a total of US$1.8 billion during the second quarter alone. That's a 19 percent year-on-year increase in revenue, and solid confirmation that the platform continues to go from strength to strength.

Equally important, however, is the changing role of Linux: having advanced well beyond its origins as a file-and-print server, Linux now manages services including mission-critical databases, enterprise applications, virtualisation of other operating system images, and massive compute clusters built from large numbers of commodity servers. With its accessible code base and stronger ISV support, Linux has truly become everyman's operating system.

Or has it? Despite years of enthusiastic predictions, Linux still has yet to make a dent in Australia's desktop market -- even though the meteoric success of Ubuntu has made it a household name (in some households, at least). Efforts to locate a new desktop Linux customer to profile for this feature were met with chagrined resignation, suggesting that companies using desktop Linux are still being secretive about it -- or that there just aren't that many out there.

That doesn't take away from the importance of Linux as a server platform, however -- and, with Web-based applications now so central to everyday operations, odds are that you could do with a bit of server rationalisation. After all, why pay exorbitant licence fees just to keep up with peaks in demand for your Web site?

Of course, there will always be certain applications that just aren't available on Linux: anything based on .NET, for example, while graphics-intensive tools are likely to run best on Windows. So while there is a compelling case for moving many of your backend services to Linux, it's also important to work out a strategy for keeping the two environments in sync -- for example, by keeping your data in a separate environment, such as a storage area network, neutral file server or platform-agnostic database, that is equally accessible from all platforms.
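
As a minimal sketch of that neutral-data idea -- the hostname, credentials and table below are hypothetical -- the same data-access code runs unchanged on a Windows or Linux client, so neither platform "owns" the data:

    # Minimal sketch: Windows and Linux clients share one neutral data store.
    # The hostname, credentials and table are hypothetical.
    import os

    import psycopg2  # any cross-platform database driver works the same way

    conn = psycopg2.connect(
        host=os.environ.get("SHARED_DB_HOST", "db.example.internal"),
        dbname="shared",
        user=os.environ["SHARED_DB_USER"],
        password=os.environ["SHARED_DB_PASSWORD"],
    )
    with conn, conn.cursor() as cur:  # commits on success, rolls back on error
        cur.execute("SELECT id, name FROM customers ORDER BY id")
        for customer_id, name in cur.fetchall():
            print(customer_id, name)
    conn.close()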

It still requires planning and careful execution, but the wealth of experience around Linux means it's no longer an unsupportable, risky proposition for most companies. It's probably premature for most companies to simply cut their ties with Windows altogether, since ultimately platform decisions are made on what's best for the business rather than some sort of religious determination. But by embracing Linux in the right places -- and, yes, even the desktop can be a "right place" for many companies -- you can take out some of the chaff in your IT environment and see benefits you never would have imagined just a few years ago.

Case study: Linux proves a firm foundation for Macmahon

Linux manages more than half of Macmahon Holdings' new IT infrastructure -- but the way Macmahon got there is just as interesting as what it got.

Snapshot on www.macmahon.com.au

  • Industry: Mining and construction contracting
  • Employees: More than 3,000 in Australia, New Zealand and Malaysia
  • Operations: More than 50 sites across mining, civil construction and maintenance
  • Financials: AU$966 million in revenue last year

Everyone is eager to see the possibilities of a new job, and Jason Cowie, chief information officer at contracting company Macmahon Holdings, was no exception. But when he began his role at the mining, civil construction and maintenance giant in 2003, Cowie soon realised the magnitude of the challenge before him.

With a 56Kbps modem as its Internet backbone and a melange of ageing Microsoft Windows and Novell NetWare servers clustered under a desk in a crowded server room that Cowie initially took for his office, it was clear from the beginning that the company -- which employs more than 3,000 people in Australia, New Zealand and Malaysia and turned over AU$966 million in revenues last year -- was long overdue for an IT infrastructure refresh.

The new infrastructure was about much more than just technology, however: with over 50 sites distributed in several countries -- many of them in remote areas where telecommunications was deficient and IT support sporadic at best -- Macmahon needed a more-with-less approach that would support its growth. That growth was significant: given the booming resources industry, user numbers at Macmahon have increased from 400 when Cowie started to more than 1,500 now.

Macmahon's IT team, however, is relatively small, with just three network operations staff and 10 people handling IT altogether.

Rather than rush into any particular architecture, Cowie first took the time to meet with business managers -- nearly 60 of them -- to have them spell out their business goals and their expectations of IT. Several key themes emerged: the new infrastructure needed to tie all IT services back to a single username and password, to consolidate all data into a single infrastructure, and to be accessible from anywhere.

"Because we are in so many locations, the biggest frustration for people was them saying 'because we move our employees around a lot between projects, every time they move we have to reconfigure their systems'," Cowie says. "When that all sunk in, there was no infrastructure I wanted to keep. We were starting from nothing."

Building on pillars
Armed with those findings, Cowie began planning a technical infrastructure to match the business requirements his consultations had identified. Business goals were grouped into a "four pillars" philosophy in which the project would be built around four key functions: mobility, centralisation/virtualisation, integration, and collaboration.

To deliver the new environment -- christened "Global One" in 2003 and planned as a five-year rollout -- Cowie identified four key infrastructure areas: communications, network core, delivery, and infrastructure and systems. Four key vendors were chosen to supply these components: Telstra for communications, Novell for the network core, Citrix Systems for delivery, and IBM for infrastructure and systems.

As part of the selection process, Cowie made clear from the beginning that each vendor would need to work closely with the others to deliver and integrate its offerings into Macmahon's new systems infrastructure. "We needed top commitment from everyone," he explains. "This is where we created the very specific partner model with vendor responsibilities, our responsibilities, joint responsibilities, and outcomes that would deliver us a win/win scenario."

This model demanded unconventional thinking in areas such as budgeting and margins, with Cowie expecting -- and getting -- openness from vendors in exchange for a commitment to ensure a fair go for all involved. By taking this approach, Cowie not only ensured that vendors had skin in the game, but was also more willing to embrace new technological approaches, knowing the vendors would be right behind the company and its users.

The Linux step
That willingness to embrace novel solutions became a real benefit for Cowie as discussions with Novell turned to the topic of Linux, to which Novell has been intensely committed since it purchased SUSE Linux four years ago.

Novell's range of identity management, desktop management and other tools all run on top of what the company now calls Open Enterprise Server (OES). And while some Macmahon consultants were still sceptical about Linux, Cowie says the careful four-pillars approach gave him the comfort to embrace the platform.

"Novell wanted to talk about getting Linux into the environment, and we had no issue with that," says Cowie. "At the time of the decision, a lot of advisors said going with Linux was a gamble and that we were going too cutting edge. But we believed in the partnering model and that it would get us through whatever issues we would encounter."

Citrix Systems, for one, was told it would have to make its Windows-based product work with the Linux-based Novell portfolio. And IBM, which has significant Linux experience of its own, was called upon to ensure that infrastructure including servers, storage area network equipment, tape libraries and more was suited to Linux. This proved immensely useful throughout the project, since all three infrastructure players had the enthusiasm and the experience to make Global One a reality.

Close collaboration around Linux led to some very real tests of the partner model: at one point, for example, Macmahon was having trouble with support for some redundancy features in its network. After escalating the issue to Novell and IBM, a conference call was organised within hours, bringing in Novell's lead SUSE developer to help resolve the issue.

"Everyone has wanted to see this succeed, and has bought into the vision and into the fact they have their own piece of it," says Cowie. "If we hadn't had the strength of the partnering model, our choice to go to Linux would have been a gamble: it was new, and other companies were supported out of the US. But the vendor pillars have been the key reason why this has succeeded."

Proof of life
The switch was flipped on Global One on 1 July this year, on budget and six months earlier than initially expected. Since then, the company has enjoyed a robust infrastructure that has centralised user authentication and desktop delivery, using Citrix Presentation Server to give employees access to their own desktops no matter where they travel.

Telstra has supported the rollout with a variety of infrastructure, ranging from 6Gbps fibre links between Macmahon's two datacentres and 1Gbps office and Internet links, to Next-G routers for remote offices and Next-G data cards for mobile users. A future upgrade will use this infrastructure to move Macmahon's voice traffic to voice over IP.

The benefits have quickly added up, according to Cowie. Company data is stored on one network, and backed up continually with full datacentre disaster recovery methods in place. New software can be rolled out within minutes thanks to the thin-client environment, which has extended the useable life of desktop equipment and specialised thin terminals. And, because connectivity is no longer a problem, users have been able to collaborate more effectively than ever.

OES -- which now runs every user and desktop management product in Novell's portfolio as well as Macmahon's DNS servers, access gateways, Web site, file-and-print and other servers -- has played a major part in making all this happen.

In the long term, the company will convert at least 90 percent of its business to thin terminals, which are gradually replacing notebook and desktop PCs. Within a year, Cowie believes the company's entire network core will be running under Linux, as alternatives to the few Windows-only servers are steadily identified and implemented.

"We've had great stability from the Linux platform," Cowie explains. "The kit runs and works well on the Linux clustering, and we've had no issues from file and print servers and so on. The performance we've received out of Global One has exceeded expectations, and we've got a cohesive network. People in remote locations have said we've delivered more than they expected we would, so it has worked out well."

Case study: Linux teaches QUT the easy way to scale

With tens of thousands of users who can be highly taxing on server infrastructure, managing IT in a university environment has always been a challenge. At Queensland University of Technology (QUT), however, this challenge has become easier in recent years thanks to a concerted shift away from proprietary Unix systems onto more scalable clusters of standard Linux servers.

Snapshot on www.qut.edu.au

  • Industry: Education
  • Operations: Teaching and research serving more than 40,000 students and staff

QUT's more than 40,000 students and staff pursue research in a range of disciplines, many of them technical and therefore involving large volumes of data processed by purpose-built applications. Since each project has different requirements, the Information Technology Services (ITS) division found itself haphazardly adding, and administering, all sorts of servers running different operating systems: Digital Tru64 UNIX, HP-UX, IBM AIX, Sun Microsystems Solaris, other flavours of Unix and several shades of Windows all dotted the university's datacentre.

Although they were effective computing platforms, these servers had intrinsic processing, memory and storage limits; when those limits were reached, the only solution was a very expensive one: "We were putting applications on midrange machines, but every time we got close to running out of server capacity, we had to buy another big box," says Joe Dascoli, associate director of ITS. "This wasn't an effective way to use the dollar of the university."

Strength in clusters
It didn't take long before the QUT team hit upon an approach that would provide a more cost-effective, flexible alternative to the university's proprietary Unix systems. That approach was based on Red Hat Enterprise Linux 4, loaded onto standard Intel processor-based servers and co-ordinated by Oracle's RAC (Real Application Clusters) technology.

RAC, an adjunct to Oracle's ubiquitous database, manages large numbers of standard servers as a single pool of computing and storage resources. As servers are added or removed, the system automatically redistributes the computing and data load across the available servers, providing a highly fault-tolerant and scalable computing environment.
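
To the application, the cluster appears as a single database service. As a hedged sketch using the python-oracledb driver -- the hostnames, service name and credentials here are invented -- a client can ask the listener to balance its sessions across the nodes:

    # Hypothetical sketch of connecting to an Oracle RAC service.
    # LOAD_BALANCE=ON lets the listener spread new sessions across the
    # cluster's nodes; hostnames, service name and credentials are invented.
    import oracledb

    dsn = (
        "(DESCRIPTION="
        "(ADDRESS_LIST=(LOAD_BALANCE=ON)(FAILOVER=ON)"
        "(ADDRESS=(PROTOCOL=TCP)(HOST=rac-node1.example.edu)(PORT=1521))"
        "(ADDRESS=(PROTOCOL=TCP)(HOST=rac-node2.example.edu)(PORT=1521)))"
        "(CONNECT_DATA=(SERVICE_NAME=studentsys)))"
    )
    conn = oracledb.connect(user="app_user", password="app_password", dsn=dsn)
    with conn.cursor() as cur:
        # Each new connection may land on a different node; the application
        # code is identical regardless of which server picks it up.
        cur.execute("SELECT instance_name FROM v$instance")
        print("Connected to instance:", cur.fetchone()[0])
    conn.close()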

For QUT, this approach was ideal from the start -- particularly given that the university was already using Oracle Financials and the Oracle database in a limited way. This meant the transition from its hodgepodge of technology standards to a fully Linux-based environment could proceed relatively quickly and painlessly, without the complex configuration and management issues inherent in the previous architecture.

"We've dabbled in the clustering technology well before this," says Dascoli. "What wasn't quite there for us was the way the application could quite legitimately sit over the multiple boxes in clustered form. Now that Oracle has invested a lot in that area, it's no longer a story; it's for real."

QUT introduced a policy favouring Linux as its preferred operating system, and over time the university has gradually reduced the presence of proprietary Unix servers within its datacentre. Just "two or three traditional IBM boxes" are still chugging away, with Linux-based clusters filling up the bulk of the datacentre's occupied real estate.

The new strategy has made it much easier to specify requirements for new projects -- and much easier to pay for them as well. In one current project -- development of a new student system -- the tender included a requirement for 30 to 40 Linux servers that would be used in an Oracle RAC cluster.

"It sounds like a big number, but the reality is that the cost of those 30 to 40 servers is a fraction of the cost of the old systems," Dascoli says. "In the old days you would have to pay AU$600,000 for a high-end server; now, the Linux servers are AU$3,000 each. This is the whole idea of being able to truly plug and play [new servers]."

The university's shift to Linux-based clusters has proven particularly useful in accommodating severe peaks in demand that are typical of university environments. With 40,000 students hammering the uni's servers for schedules in the weeks leading up to exams, and then again hitting the systems for results, those servers need to be able to quickly grow and shrink.

Because the cluster automatically integrates additional servers, ensuring performance during peaks only requires new systems to be installed -- and, because those systems run the free Linux operating system, doing so doesn't incur any additional cost apart from the hardware.

"If you notice that the peak has hit, you can add some more boxes to the cluster," says Dascoli. -It's not hard to pull boxes away from other services and redeploy them [in areas of need]; you don't have to physically relocate them, but can reallocate them virtually to some other app."

"It's like buying a tractor with 100hp but only ever using 2hp," he continues. "If you haven't got that 100hp once a year when you need it, your business goes broke. Being able to buy lots of little pieces, then bulk them together in whatever configuration we want, has been of strategic importance for this uni."

The Linux world
As at most universities, Linux was nothing new for QUT, having been used in scattered projects across the institution for some time. However, the shift from other operating systems onto a mission-critical Linux core still represented a major change, particularly given the traditional price and performance premium of high-end Unix servers.

"At the end of the day you're using a tool," Dascoli explains. "You do get that super-efficient functionality in some of those proprietary operating systems, but when it boils down to it you probably don't need those most of the time. The Linux platform is a very cost-effective option to any business that wants to go down that path; if it works 99 percent of the time and gives you bang for your buck, it has done the job."

Although Linux has become the standard for enterprise applications and services, the university still maintains some Windows servers for running .NET and other Windows-specific applications. In the main, however, the uni's move to Linux has proved a valuable direction for the future, letting it rely on Oracle's RAC technology -- rather than the hooks of complex high-end operating systems -- to provide the scalability, configurability and performance it requires.

"If you take an enterprise architecture view, and you have a look at the technology stack, we typically consume what's in our enterprise architecture on the basis of what's tried and proven," Dascoli explains. "Linux has certainly been that, and it is now embedded, explicitly and implicitly, in the enterprise architecture. We keep on using it because it works."
