Fellow ZDNet blogger Mike Krigsman has an apparent job for life. Project failures are such a fixture of the IT landscape that I am surprised people have not become bored with the topic. Maybe they have, and simply see it as an integral part of the rich fabric that is the fashionista-driven world of IT. Given the popularity of Mike's blog and his recent elevated status, I am likely wrong. At least in IT circles.
While I thoroughly enjoy reading Mike's analysis of what goes wrong, I keep having this uneasy feeling. Why do we, as an industry and as IT consumers, keep doing this? Why do projects keep getting muffed? Why is there so little celebration of success?
At a media level, nothing works better than blood, spit and gore. Antennagate is a great example. SAP's Business ByDesign hiccups are another. The list is endless for those interested in sensationalism, a weak twist on past facts as a precursor to the future, and anything else that will pump a few more page views. It is the world we live in. Or at least the world of those who don't implement systems, go through the pain and come out the other side with fresh wisdom.
The fundamental problem with Mike's analysis is there for all to see. In providing an excellent analysis of the IBM/State of Texas meltdown he poses the question at the end:
What human tendencies drive us into situations of inevitable failure? Please add your voice in the comments!
It's a great question and one that should get us all thinking about causes rather than the symptoms Mike describes so well.
As a social scientist by education and past practice, I've come to understand a few things about the so-called human condition:
- We're all great at finger pointing - check this Tweet for a fabulous example.
- We're all well schooled in the art of ducking and diving.
- We live in a blame culture where the notion of responsibility has been diminishing rapidly for many a year.
Seasoned readers will know that I regularly rail at 'social anything' for reasons that will be well understood at the corporate level among many CXOs. What I have perhaps not explained is the why: the underlying reasons that contribute to my thinking and, in turn, why I believe that Mike's analyses, while good training fodder, will never wholly contribute to solving the problem.
Here's a cautionary tale to illustrate what I mean. Many years ago I worked as a voluntary prison visitor to some of the UK's most dangerous criminals - all of them convicted of murder, some of multiple murders. Long story short (I did this for 8 years), it was only when they understood that, in (most of) their twisted lives, everyone was responsible but nobody was to blame that they could start on the road to accepting their part in events, and on to rehabilitation. The success rate in helping people in that position was far higher once they 'got the message.' Now let's apply this to project failures.
There is a slew of academic research on corporate organization, culture and the psychology of business, and plenty that addresses the dynamics of failure. Over at BNet, I found a very useful, if academically worded, paper entitled Exploring the Failure To Learn: Crises and the Barriers to Learning that says:
The tendency to search for scapegoats provides yet another barrier to learning. As with trust, projecting blame is indicative of an organization's ethical stance. Where an organization has a socially responsible approach, it will seek to explore the underlying causes of events rather than simply attempt to blame specific individuals. Yet examples of scapegoating abound. Union Carbide's technical account of the Bhopal accident suggested that it was caused by employee sabotage, a view described as technologically improbable by independent observers. Principal agents involved in managing the Hillsborough Stadium soccer event (police, soccer club, government) blamed spectators, seeking to absolve themselves from any culpability. In the case of the Challenger accident, NASA attempted to deflect blame away from itself; yet a subsequent inquiry into the accident showed NASA's culture was a key contributory factor that led warnings to be ignored.
In all of these cases, the process of scapegoating inhibited a holistic approach to crisis-related learning and may have resulted in a failure to change the core beliefs, values and assumptions of key organizational decision-makers. By projecting blame for the event elsewhere, organizations can perceive that they have isolated the supposed cause and deal with it directly. They have not, however, focused attention on management controls and the potential for latent errors that may have resulted from both management action and inaction.
At a psychological level, no-one wants to hear a curmudgeon. That is one of the great reasons for having an accountant or other finance professional on the team when evaluating projects: they are attuned to thinking about what could go wrong. I should know - I am an accountant by training who has made a career of asking the occasional dumb-ass question that winkles out hidden weaknesses.
Moving on...in more of the same 13-page article (seriously - read it) the author proposes that:
In an extensive review of crisis management literature, Pauchant & Douville identified seven main themes. From the findings of this review, it becomes clear that research concerned with cultural and psychological aspects of crisis management, including organizational learning, has been the most neglected. Perhaps the reason for this neglect lies in the tendency, until recently, to prioritize work undertaken within a formal, rational planning process and the subsequent development of technical solutions to complex, ill-defined problems. Clearly, learning from crisis events should move beyond narrowly defined technical solutions toward fundamental shifts in the areas of culture, cognitive representations and communications -- the human-centered, supposedly "softer" aspects of organizations.
[Emphasis added.] To me, this gets to the crux of what failure analyses miss, and which also lies at the heart of what social computing fails to understand. If you agree that the extensive research indicates there are important psychological and cultural factors in play, then, taking that together with what I have said above, several conclusions can be drawn:
- Confirmation of Mike's conclusion that the IBM/Texas debacle was inevitable appears eminently reasonable
- IBM's standoff position of having complied - again - eminently reasonable and predictable
- The assumption that systematizing the problem resolution process can in some way bypass human response mechanisms is a fallacy
The real question is whether any of this was preventable. One of the biggest psychological problems we see is that mistakes get repeated. It cannot be because people are unaware. Given the levels of due diligence performed these days, you'd have to be blind to miss some of the warning signs. And yet that is exactly what seems to happen. The fact is that software buyers are prone to making irrational decisions. I see this almost every day. Typically it goes like this: "I want the best but the cheapest," which is almost certainly a recipe for disappointment. In the IBM/Texas case I'd argue there was another factor in play. It goes something like this: "It's IBM, they're the world's biggest so they've got to be the best...ergo, never mind what we know, we'll just trust them."
Expressed at that guttural level, and given Mike's causal analysis, the whole reason for Texas contracting with IBM would appear slightly short of insane. Yet that's what happens time and again. Is there a solution? I think there is.
- Contracts should require genuine risk assessment by people who understand risk, not those asked to sign off on a particular exec's fave project against an IT tick list.
- Contract negotiations should be mandated as requiring an independent - and I mean requiring an independent - third-party arbiter who has little or no connection to either party.
- That arbiter should have the power to recommend nixing a deal.
- That same arbiter should have the power to ask buyer management probing questions about the rationale behind buying decisions and have the answers put under scrutiny.
- Social profilers should be introduced into the equation so that the buyer understands the psychological risks it is taking on.
That's a start. Can you think of more?