
Open source development is parallel processing in action

Knowing what tasks should go to what sized groups, and riding herd lightly on what those groups do, is as much engineering as it is conventional management.
Written by Dana Blankenhorn

About 20 years ago, engineers around the world began demonstrating what is still the greatest computer innovation of my lifetime: parallel processing. (For more on the concept, see the course at the Cornell Theory Center offered by Saleh Elmohamed.)

Until then, people were limited to computers based on the von Neumann architecture, a single integrated system. Parallel processing techniques let a job be split into pieces and finished faster, despite the executive overhead of coordinating the work.
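As a rough illustration (my sketch, not from the original column), here is what that split-and-combine pattern looks like in a few lines of Python. The chunking, the worker pool, and the final reassembly stand in for the "executive overhead" of coordinating the pieces.

```python
# Minimal sketch: split a job into pieces, hand each piece to a worker
# process, then combine the results. The pool setup and result collection
# are the coordination (executive) overhead.
from multiprocessing import Pool

def piece_of_work(chunk):
    # Each worker handles its own slice of the job independently.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]   # split the job into 4 pieces

    with Pool(processes=4) as pool:           # overhead: spawning and coordinating workers
        partial_results = pool.map(piece_of_work, chunks)

    total = sum(partial_results)              # overhead: reassembling the pieces
    print(total)
```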

Since then we have seen parallel processing techniques executed in many forms. In stacks of Macs. At Google. In "distributed computing" systems like the famous SETI@Home project. And now in chips.

What Matt Asay's piece on the mythical (open source) man month reminded me is that open source development is parallel processing in action. Projects that fail to scale are forked. Big projects are broken down into pieces. Time-intensive processes like bug tracking are farmed out to the community.

For parallel processing to work, central control must be light. This is almost never the case at a proprietary company, which has to serve a schedule, grow earnings on a regular basis, and manage an integrated product line.

As Bob Porras of Sun noted on his blog earlier this year, kids of the Internet generation parallel process routinely. They call it multitasking. As his son explains it to him, it's done by limiting the attention a kid pays to his central management function. But there's a limit to how many tasks even a millennial can handle at once before performance degrades.

An open source development process, with light central control, lets you manage this challenge organically, in the way a grid computing stack does. Products like OpenQRM from Qlusters do it in an enterprise computing system. Management theorists should look closely at how.

An open source management style handles this well in the real world. When I wrote, jokingly, about the challenges of managing the Linux kernel, I was really watching Linus Torvalds try to manage a parallel-processing challenge.

Some tasks, such as the creation of new modules, are off-loaded to individuals or small groups, then integrated into a release. Other tasks, like bug tracking and security management, are off-loaded to large groups, to the community at large.

Knowing what tasks should go to what sized groups, and riding herd lightly on what those groups do, is as much engineering as it is conventional management. It's a continually changing crossword puzzle, as one programmer put it to me, or a constantly evolving game of Tetris.

But the key to understanding it remains parallel processing. An open source process with light central controls, rather than a proprietary process with tight ones, seems to be the way to go.

That's a lesson even a proprietary vendor can learn from.
