Research tells us that between 30 and 70 percent of IT projects fail in some important way, leading to global economic waste that perhaps totals trillions of dollars. These statistics highlight the industry's lack of an effective early warning system to prevent failure.
CIO hired gun and project turnaround specialist Peter Kretzman tackled failure prediction in an important article, republished here as a guest post. CIO Magazine describes Peter as "a 25-year IT and online veteran [who] shares thoughts on focusing product and application development as well as enhancing and maintaining world-class operations. He also points out that many departments survive by hiding inefficiencies, oversights and missed opportunities."
In part 1, Peter laments the difficulty associated with predicting which projects will succeed or become challenged.
Part 2 (reprinted below) describes how to detect early warning signs of trouble on IT projects.
In this guest post, Peter presents a practical solution to the IT project failure dilemma.
Wouldn’t it be great if there were some kind of codified, external measurement/evaluation tool that could methodically identify the kinds of disconnects that even well-led projects can fall prey to?
One that could pinpoint where the true risk areas are as the project evolves, and help people take targeted action ahead of time to address those problem spots?
That’s why I got so excited in a recent conversation with well-known IT failure expert Michael Krigsman, CEO of Asuret, a company that sells “technology-backed services.” He gave me a look at their forthcoming product, an impressively slick, well-engineered tool that in my view promises to provide exactly that kind of benefit: identifying where and why a project might fail along those people and best-practice dimensions, before it actually does.
In a nutshell, Asuret facilitates a cross-sectional analysis of project participants and stakeholders as the project proceeds. By aggregating the answers to its carefully crafted questions and constructing a number of easy-to-scan summary charts, the tool displays astonishingly insightful visual breakdowns that let you pinpoint major disconnects, such as between stakeholder groups and IT, or between the project’s actual practices and industry best practices.
Let’s look at an example of what it shows you.
By mapping aggregated analysis results onto charted dimensions of importance and vulnerability, and slicing these charts by department, you can spot disconnects at a glance: for example, that executives view the project’s business case as highly vulnerable, while the IT participants view it as having low vulnerability. Early warning sign! And certainly better (more methodical, more aggregated) than relying solely on what you’ve heard Joe grumbling about in the lunchroom.
In the example, the disconnect looms large: look at the darker circle (representing the participants’ responses to questions regarding the project’s business case) and its different location on the two grids shown below:
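Asuret's actual implementation is proprietary, but the core idea described above — aggregating per-department survey scores on importance and vulnerability, then flagging questions where two groups' vulnerability ratings diverge — can be sketched in a few lines. Everything here (the record layout, the department names, the 1–5 scale, the divergence threshold) is an illustrative assumption of mine, not Asuret's data model or API:

```python
from collections import defaultdict

# Hypothetical survey responses: (department, question, importance, vulnerability),
# each scored 1-5. These records are invented for illustration.
responses = [
    ("Executive", "business_case", 5, 5),
    ("Executive", "business_case", 4, 4),
    ("IT",        "business_case", 5, 1),
    ("IT",        "business_case", 4, 2),
    ("Executive", "scope_control", 3, 2),
    ("IT",        "scope_control", 3, 2),
]

def aggregate(responses):
    """Average importance and vulnerability per (department, question)."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])  # imp total, vuln total, count
    for dept, question, imp, vuln in responses:
        s = sums[(dept, question)]
        s[0] += imp
        s[1] += vuln
        s[2] += 1
    return {k: (s[0] / s[2], s[1] / s[2]) for k, s in sums.items()}

def disconnects(agg, threshold=1.5):
    """Flag questions where two departments' mean vulnerability scores diverge."""
    flagged = []
    keys = list(agg)
    for i, (d1, q1) in enumerate(keys):
        for d2, q2 in keys[i + 1:]:
            if q1 == q2 and d1 != d2:
                gap = abs(agg[(d1, q1)][1] - agg[(d2, q2)][1])
                if gap >= threshold:
                    flagged.append((q1, d1, d2, gap))
    return flagged

agg = aggregate(responses)
for question, d1, d2, gap in disconnects(agg):
    print(f"Disconnect on {question!r}: {d1} vs {d2} differ by {gap:.1f}")
```

With the sample data above, only `business_case` is flagged (executives average 4.5 on vulnerability, IT averages 1.5), while `scope_control` passes quietly — exactly the kind of cross-group gap the charts make visible.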
This all may sound simple in so brief a description, but taken as a whole, Asuret’s methodical implementation and targeted, useful results are nothing short of groundbreaking. Perhaps other companies provide a similar product, but I don’t know of any. And frankly, I can’t imagine a product better designed or more perfectly suited than Asuret’s to address the issues raised in this post.
I’m really looking forward to hearing more as they deploy and hone their product, because I can think of any number of large projects I’ve been on where this approach would have been revealing and useful.
It may not be the ever-hoped-for holy grail, but it promises to be a piece of one: an extension of our ability to see things before they happen. If Will Rogers had been an IT guy, I think he would have been excited too.