
Andrew Morton opens up at LinuxWorld

The Linux kernel’s No. 2 developer said there are no plans to embrace GPL 3 or change the kernel development process.
Written by Paula Rooney, Contributor
The Linux kernel’s No. 2 developer said there are no plans to embrace GPL 3 or to change the kernel development process, and he tried to dispel concerns that the Linux kernel will fork.

In his 90-minute talk and extended Q&A session opening LinuxWorld Expo 2007 Monday afternoon, Linux kernel maintainer Andrew Morton said Linus Torvalds liked the latest draft of the General Public License, or at least no longer considers it as “crazy” as the original proposal. But there isn’t much motivation within kernel.org to support GPL 3 unless there is a challenge to licensing, and that’s unlikely, said Morton, a kernel maintainer at the Linux Foundation.

The GPL 3 and the Microsoft-Novell accord that inspired some of its provisions ignited a great deal of controversy but mean little to developers and end users from a practical standpoint, Morton suggested. “I haven’t seen any consequences of the Microsoft-Novell deal,” Torvalds’ second in command said when asked. Is he right? Are the GPL 3 and Microsoft-Linux deals insignificant events that have little impact on day-to-day Linux business?

Morton also said he will resist any effort to change the kernel development process. At the Linux kernel summit in Cambridge, Mass., next month, for example, Torvalds’ right-hand man said he plans to stand up and publicly recommend against any change to the process.

Despite growing concerns about stability, Morton said, the Linux kernel organization made the “correct decision” to abandon what had evolved into a two-to-three-year release cycle in favor of a more fluid, incremental method of updating the Linux kernel in the 2.6 era.

Morton said the kernel group continues to debate the tradeoff between stability and releasing cutting-edge technology. But he insists the current rate of change for the Linux kernel is the right model, one refined after much experience. The value of open source, he said, is that it allows distribution vendors and end users to fix, modify and customize code as they see fit.

“We have elected to go for the high change rate rather than sacrificing change for stability. Back in the 2.4 days, kernel developers were trying to develop new kernel technology and offer a stable productized kernel. This didn’t work,” Morton told hundreds at the Moscone Center in San Francisco, where LinuxWorld 2007 officially opens on Tuesday.

“We were trying to release product in the Linux 2.4 series and also work on developing the kernel for the 2.5 series. It didn’t work with a three-year release cycle … the stable kernel was far too old [when it was ready for deployment], years too old. Multi-year release cycles forced the Linux distributors to back-port thousands of 2.5 patches into 2.4 kernels,” Morton pointed out.

But others say the interoperability snafus growing in the open source world, due in part to the proliferation of different kernel variations, could pose an even bigger problem for enterprises than the API wars among proprietary software vendors in the ’90s. And that’s not a matter of Windows-Linux interoperability. The proliferation of security patches and application updates that target different versions of the Linux kernel has led to an increase in applications breaking in the field, maintains Chris Maresca, a partner in the Olliance Group of Palo Alto. In addition, end users who download the updates needed to fix those glitches often end up with a modified Linux operating system that is not supported by their distribution vendor.

"The lack of integration is the biggest issue facing employers of open source today," Maresca reports. "There was never good integration between proprietary software since each vendor - Oracle, Microsoft and IBM - tried to be a silo. Open source has the same problem and may be worse because there's no commercial incentive for open source vendors to work together."

"It's a complex and common issue," he reports. "OpenLogic and SpikeSource go some of the way to fix that issue because they provide integrated, tested stacks, but it's not a complete solution." Is that an accurate assessment?

Morton also sought to dispel rumors that the Linux kernel could fork.

By his estimate, no rogue software vendor would have enough clout to fragment the industry. The only way a fork could be sustained is if a large group of organizations contributing to the kernel – at least 30 to 40 percent of the total – decided to split off and work together on a separate project. But there is no prospect of that happening, and no one is conspiring to do it, Morton maintains. So why worry?

Additionally, and contrary to popular thinking, the debate over whether open source virtualization engines will fragment the industry is moot, since the kernel supports and will support all open source solutions – be it Xen, KVM, OpenVZ or VMware, Morton said.

Is Morton right? What are your thoughts? Are there factors that could lead to the forking of the Linux kernel in the near term? Do tell.
