
Discussion with Red Hat's Joel Berman and Nick Carr - 1st take

Written by Dan Kusnetzky, Contributor

I recently had the opportunity to speak with Red Hat's Joel Berman, Product Management Director for Virtualization products, and Nick Carr, Director of Public and Analyst Relations. First of all, thanks guys for generously giving of your time to bring me back up to date on Red Hat's views of virtualization.

I'm still digesting what we discussed, but here are my first few thoughts. As I'm able to further integrate and analyze what I've heard, I'll post more on this topic.

The first point I took away from our discussion is that Red Hat takes a very comprehensive view of virtualization, one that encompasses everything from virtual access software all the way through to virtual storage software. This view is based upon real-world usage of Red Hat Enterprise Linux rather than simply upon what's hot in the media today.

Red Hat clearly understands that the mix of applications, and the architecture of those applications, differs between Linux/Unix and Windows, and it has tried to focus on the tools that help organizations get the most out of their Linux/Unix configurations.

Although what I'm about to say is no longer completely true, organizations still implement Windows applications as a set of services, assigning each service to a separate machine or set of machines. For example, a database management system is unlikely to be hosted on the same machine as application services. The database engine, however, might be replicated on another machine to increase either the performance or the reliability/availability of the data management function.

I must point out that Microsoft has made great strides in improving the reliability of Windows over the years. Directors of IT and IT architects, however, have long memories and are chartered to "keep things going no matter what." They learned years ago, with both Windows NT and Windows 2000, that it was wise to implement Windows-based applications as a set of functions and assign each function to a specific server. This is the basis of much of Microsoft's virtualization strategy: a strategy largely focused on virtual machine software that allows many functional servers to share the same physical machine, improving machine utilization while also improving the reliability and scalability of individual Windows-based functions.

Those same IT managers and architects learned that Linux, like its friend Unix, offers different reliability and performance characteristics. Because of those characteristics, it was safe to "load up" machines with all of the functions necessary to support a complete solution and, possibly, to assign several complete solutions to a single large-scale machine. This is the basis of much of Red Hat's collective thinking on virtualization. In Red Hat's world, clustering and virtual storage software are of greater use.

This is why much of Red Hat's focus is on virtual access to computing solutions, high-performance virtual application environments, clustering and availability software, and virtual storage software. The goal is to help Red Hat users achieve performance, reliability, and availability in a Linux/Unix environment.

Red Hat is heavily involved with the Xen virtual machine software project and has included it in their most recent version of Red Hat Enterprise Linux. This software, however, is not their sole focus. It is part of a larger effort to help customers abstract complete solutions away from the underlying hardware.
