I still don't know what to call this decade, yet it's nearly over.
The 1900s were called the "aughts." This one is called the 2000s, which is silly because it will still be the 2000s when our descendants are partying like it's 2099.
Decade name fail.
Still, the calendar does not lie. The teens are nearly here. And what most IT managers want from the next decade is a big cut in the project failure rate, and a redefinition of what constitutes failure.
My CNET colleague Matt Asay was shocked, shocked to learn recently that half of all projects are considered successes if the software merely works. Forget about doing something useful, or transforming business operations, or (god forbid) making money for you.
My wife is now finishing up such a project. It's been going on for ages. I probably know more than I'm supposed to about it, but all I really know is that it's really, really hard and really, really complicated. It reminds me of the IBM OS/360 disaster, the project that taught its manager, Fred Brooks, the lesson of The Mythical Man-Month: the more people you throw at a late project, the slower it goes.
Her company staged a big celebration a few weeks ago when the software first ran, which she found silly because full implementation is still months away, and there are miles to go before she sleeps. But I can understand the enthusiasm. Past studies have shown enormous rates of IT project failure, ranging (depending on how you measure it) from one-third to nearly two-thirds of all projects.
Failure can be spectacular. ZDNet's Michael Krigsman wrote last year that the largest maker of fire engines in the world went bankrupt over an IT failure. The city of Atlanta, where I live, is going through an election right now where one of the big issues is the failure of the city's computers to deliver a spreadsheet the council can use to analyze its own budget.
Big vendors use the risk of failure to justify their big-money contracts, yet both of the failures noted in the paragraph above were blamed on big vendors. One source of resistance to health reform may well be fear of IT failure. It happens to hospitals, and it's painful.
At his blog, System Architectures for Complex Enterprises, Roger Sessions writes that IT accounts for 2.75% of GDP, and that failed IT projects cost the global economy over $6 trillion last year.
Asay says there's a solution to all this. Open source. Which companies top the annual satisfaction survey of CIO Magazine? Red Hat and Google, the largest supplier and user of open source software, respectively.
Open source has spent this entire decade replicating what closed source can do, but it is now poised to go beyond it. What makes it valuable is not low cost. It's visibility. Because you, your vendor, and even your competitors can help fix problems and develop capabilities, open source wins on usability.
With open source you can literally try before you buy. You can know the stuff works, that the first hurdle to success is cleared, before you write a check to anyone.
What open source does in the enterprise is not so much lower costs as transfer them, from the front of the project to the back and beyond. It's adaptation, training, and support that really make the difference between a project that barely succeeds and one that really transforms.
The teens, or 2010s, whatever we decide to call them, will be the decade of open source.
This post was originally published on Smartplanet.com