Red Hat CEO talks virtualisation, Linux and the cloud

Red Hat chief executive Jim Whitehurst discusses why he thinks rival VMware will fail, how the financial crisis will be good for open source, and why cloud computing will be the future

At a relatively sprightly 40, Jim Whitehurst joined Red Hat from US carrier Delta Air Lines, officially taking the reins on 1 January of this year.

The Red Hat chief executive has previously named cloud computing as his number-one priority. When ZDNet.com.au caught up with him recently, he added virtualisation to the top of that list.

Q: When you were last interviewed by CNET News.com [ZDNet.co.uk's sister site], you said your central priority was cloud computing. Why do you see this as important?
A: I do believe we are at an inflection point in the IT architecture. I am not sure it is quite as big as going from mainframe and terminal to client/server, but it's certainly a significant change in architecture.

Basically, if you look at the economics, there is no question that it is cheaper to generate a computing cycle in a centralised farm of commodity server hardware than it is in the current client/server environment. That includes utilisation, the cost of hardware, the maintenance and support, and so on. I'm a big believer that economics will win in the long run, as we see more and more functionality move to large server farms that are centrally managed.

I guess whether they are internal private grids or clouds, or whether those are public or semi-private, is still to be determined, because a lot of that comes down to specific business models. However, there is no question that the economics will drive a move to more cloud-type environments. I think building a new architectural cloud around proprietary software is inadvisable at this point in time. If you're building a new architectural paradigm, why would you do it on proprietary hardware and risk lock-in?

You offered virtualisation in RHEL [Red Hat Enterprise Linux] 5, but you aren't known for it. How do you plan to compete with VMware, Citrix and now Microsoft in this space?
VMware, no question, was first to the market. I frankly have run into no customer yet who is running XenSource.

We actually have a lot of traction. We are gaining quite a bit of momentum with our virtualisation. This is one place where Microsoft and Red Hat agree. Virtualisation should be part of the operating system, not a separate layer. I say that for several reasons.

First off, a big, big chunk of virtualisation is hardware enablement, and hardware enablement is something already being done by us — and the Linux community. The Linux community, along with hardware manufacturers and chip makers, already spends a lot of time writing drivers and enabling hardware in Linux and in Windows.

So adding a whole [additional] layer to enable hardware, and getting involved in those technology roadmaps, just doesn't make a whole lot of sense.

Secondly, a lot of long, hard work has gone into a whole bunch of areas within the operating system that are inherited by virtual guests. Take a simple example: security. The security regime in [Security-Enhanced] Linux was heavily contributed to by the NSA, and it is widely considered the most secure operating system. The Russian military has certified RHEL as the most secure operating system available, period.

A virtual guest coming out of an RHEL [instance] inherits those [security] characteristics. The idea of trying to go back and completely rewrite a security paradigm to work with a hypervisor — I guess that's possible, but it'll take five or 10 years and thousands, or tens of thousands, of man hours. […] The Linux kernel is 9,000,000 lines of code; the average hypervisor is 35,000 lines. Our belief, and I think Microsoft agrees, is that all the work and engineering of over a decade or more has already gone into the operating system. To now decompose it all, throw away the operating system and recreate those capabilities in all kinds of layers, is crazy.

It's a bit like, 10 or 15 years ago, people used to buy a separate TCP/IP stack, but these days you don't buy that, you expect that to be in the operating system.

I think in the next two or three years you will hear people stop talking about hypervisors, because they will just be ubiquitous in operating systems. I think Microsoft and we are really the only two players who can already offer a broad, full suite around virtualisation and the management tools, and the benefits in the operating system, security, and so on.

How do you strike the balance between profits and the community?
It's not a balance. Open source is obviously a very powerful development model. Why are we the only profitable, public open-source company?

I think the simple reason is that too many companies try to build a business model around selling free software. Frankly that doesn't work. Selling free is difficult.

We have built a business model that works on top of open source. What I mean by that is: in open source, the code is free, freely available and copyable. So it's not about the functionality; what we provide our customers is enterprise-class software.

We take open source, and the phenomenal software that is developed in open source, and we make it consumable for the enterprise. We freeze the spec, we support it for seven years, we performance-tune it, we put it through batteries of tests, we certify hardware and software against it, we do localisation and documentation, we do service-level agreements, and finally package that with support.

Then there is the whole loop of working with customers to get stuff [into the Linux kernel] upstream. We also offer a software assurance programme for our users. All of those things — we make it enterprise-ready and consumable for the enterprise.

Because we have a model that is built around open source, we don't run into conflicts. Some people are always going to use something that is free. Take a prime example: CentOS, probably the closest derivative of RHEL. Even that community, if you go to their website, will say that if you are running a mission-critical application, go and buy RHEL. It's not the same bits, and if you need the certification and support, you buy from Red Hat.

The more people using Linux, and the more people who are part of the community, the better. If you see that as a conflict, you don't get the model. I think that's why so few other companies have been able to succeed in open source: they are trying to sell software, and when the software is free, it's really hard to sell.

Since we're right in the thick of it, tell me your thoughts on the financial crisis — what it means for tech, and for open source?
In the 21st century, capital investment for most western companies is IT. So the bad news is when things get tight, people stop investing as much in the future. I would expect to see a slowdown in spending for new functionality. That's the bad news.

Last week I met with a couple of CIOs who I probably would not have been able to meet with before this crisis, who both said: "We use no open source, we are a Microsoft shop at the operating-system level, we have our Java app server in place, but we're getting pressure now to reduce costs. We have to take costs out of our infrastructure. We need you to help put together an open-source strategy to reduce costs, now."

We see customers who are interested in replacing [Oracle's] WebLogic with JBoss, who are looking to stop server sprawl by moving more onto zSeries mainframes in logical partitions running Red Hat. Obviously, the big question for us is: [with] the net of fewer projects and the expansion of open-source use in infrastructure, where does that leave us? That's the big question that none of us can answer.

But what I do know is that open source will be in much better shape coming out of [the financial crisis] than going into it, relative to our proprietary competitors. I certainly think it is going to get a lot more companies, who may not have spent much time looking at open source before, to look at it and consume a lot more of it. I think there is no question.

If you look at the basic operating costs of IT infrastructure, they have just gone up and up, basically because there has been this creep of additional costs. This is the other thing that I emphasise strongly with customers — and I think it resonates, because I think it is true — that we have developed a business model that is fundamentally more customer-friendly.

What I mean by that is, there is no vendor lock-in. By far our biggest competitor is not Microsoft; it's people who stop paying us but continue to use our software. If you think about that, our mission is to keep people paying the subscription, and that means you need to deliver customer service, and you need to add functionality that they really, really want. Because if not, why pay us?

That is fundamentally different from a proprietary software company. The average proprietary software company has to drive an upgrade cycle every few years, because that's how they get their revenue. So they'll add new functionality, whether the customer wants that functionality or not, because they have to drive an upgrade cycle. I hear CIOs complaining all the time — not about the licensing costs, but about having to go in and replace a piece of software that is running just fine with a new one, because they are forced to do it.

But that's part of the gig with the average software licence model. The other one is, if you're a proprietary software company, the most powerful thing you have working to your advantage is lock-in. So you work really hard to make sure your products lock people in.

Our model is the exact opposite of that. Because we can't lock you in, we make sure we build choice for you. Because we don't have to worry about lock-in, we can really fight for standards and work to standardise and commoditise.

Not only is open source just cheaper, subscription costs versus licence costs, it also helps to promote choice at other layers. The classic example is Unix to Linux: you can look at the cost of the operating systems, but the big value is being able to go from proprietary RISC hardware to commodity x86 hardware.

We are doing that at multiple layers, and for subscribers who buy into the subscription and understand the value of partnering with Red Hat, there are significant savings in areas in which Red Hat doesn't even compete, because we generate choice at those other layers.

Red Hat lives on servers, while Canonical is trying to push Linux on the desktop with some success. Do you think Linux will ever be mainstream on the desktop?
Over time, I think it will be. I get asked about that a lot. At Red Hat we have some major, major instances of companies using an enterprise desktop, and we're certainly in the enterprise desktop space. I think the big question — and I don't have the answer to this — is: 'What happens in the consumer space?'

The reason I say that is, I don't understand why consumers should pay for Linux; Linux is free. The average consumer at home has got used to the blue screen of death with Windows, right? If you're not running something that's mission-critical, why should you pay for it? Go and download Fedora or Ubuntu.

The big question is not whether it will be successful over time — certainly Linux desktops will take more share. The question is: 'Is that a commercial model for anyone?'
