INTERVIEW: Matthew Leitch on Denial and Risk

Summary: Big projects typically generate hundreds of pages of status reports—yet somehow it’s always a surprise when the project team ‘discovers’ that failure is right around the corner. In fact, says risk management expert Matthew Leitch, denial and information hiding can have deep roots early in the life of a project.


Big projects typically generate hundreds of pages of status reports—yet somehow it’s always a surprise when the project team ‘discovers’ that failure is right around the corner.

In fact, says risk management expert Matthew Leitch, denial and information hiding can have deep roots early in the life of a project. According to Leitch, "By the time a problem has become real, people have often been hiding it for months as an undisclosed risk or uncertainty, and by that time they are already guilty of withholding information they really should have shared."

Leitch, who holds a degree in psychology and is a Chartered Accountant in the UK, spoke with Deck Chairs recently about the importance of dealing with risk-hiding and denial:

How common is this denial phenomenon and is it really something we need to talk about?

Nobody really knows how often projects get into difficulty as a result of people not communicating their uncertainties openly and honestly. Having said that, the bias towards over-confident forecasts has been studied in some detail. What we really need to know more about is how pressure from other people, and often from management systems, can make people much more likely to keep worries to themselves.

From case studies and my personal experience I’m convinced that what I call "uncertainty suppression" is one of the top reasons for project failures. The more you study the mechanisms the more obvious it is that this is happening all the time. Denial doesn’t just happen suddenly. It usually starts before a risk has crystallised into a live issue. That’s a crucial point.

Okay, but how does this tendency to play down risk translate into real-world project failure?

Our bias to avoid reporting risk is like a set of mental blinders, giving us a rather narrow view of future possibilities. We see only a few of the possible outcomes of a project. The pressure of other people, and management systems, tends to make this worse and we often hide even the doubts we have. If we don’t talk about risks and uncertainties then we tend not to manage them, and the result is missed opportunities, struggling projects, and stress for everyone.

This all starts very early in the project. People who want the project to go ahead sell it to their colleagues or customers. In our culture it is common to forecast benefits from something and overstate the confidence we have in these forecasts - look around and you see this all the time.

We tend to see false displays of certainty as a communication skill, not as lying, which of course it is. Salesmen will state as fact, and with carefully rehearsed confident body language, forecasts that are highly uncertain. In this situation, we don’t like to talk about specific risk-mitigating actions because this suggests that there are some risks to worry about!

Aren’t you being a bit harsh here? After all, many people are optimists and absolutely plan and expect things to turn out well.

The benefits people claim for a proposed project tend to be at the level needed to sell the project, but not much higher. This is because people fear being set difficult targets. Consequently, it is upside as well as downside risks that get hidden and then ignored.

In the interests of creating a happy and motivated project team we then tend to encourage people to talk only of success and adherence to plans and budgets. This again discourages discussion of alternative futures and their management.

And this leads to a situation where information hiding seems to become a reasonable way to avoid passing on bad news?

Let me give an example. You are working on a project, and as it proceeds a worrying situation develops. Do you report this upwards and risk losing face, or keep quiet in the hope that you can sort it out, or will otherwise get lucky and never have to say anything (the Nick Leeson syndrome)? [Ed. note: Leeson was a rogue financial trader who brought down Barings Bank in the UK.] People tend to keep quiet, and this may be because they see it as taking a chance to avoid a loss (of face), something humans are particularly prone to.

In summary, the problems start with over-confident forecasts used to sell the idea, develop thanks to our misguided pursuit of a happy team, and eventually come to a head after months of secrecy when we must finally confess that the whole project is a serious mess. Most experienced project people will have their own examples of all this.

What about the field of risk management, which is oriented toward situations such as you describe?

Of course, there are traditional risk management approaches for handling these issues, such as relying on a risk register and its related risk management process. Unfortunately, on their own these are inadequate for really addressing the issue, but there are many possible actions that may help.
It’s obvious that uncertainty suppression takes hold very early, so the sooner we intervene to encourage and enforce open and honest discussion of risk and uncertainty the better. The messages and example set by senior managers are probably crucial. It helps to talk about the project’s outcomes in terms of ranges and distributions rather than single points. One company recently told me that they have split responsibility for advocating business plans from responsibility for generating and executing them so that advocates do not have a personal motivation to oversell their ideas. 
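
[Ed. note: The idea of discussing a project's outcomes as ranges and distributions rather than single points can be made concrete with a small simulation. The sketch below is a hypothetical illustration, not taken from Leitch's work; the task names and estimates are invented. Each task's duration is drawn from a triangular distribution spanning an optimistic, most-likely, and pessimistic estimate, and the plan is reported as percentiles rather than one date.]

```python
import random

def simulate_duration(n_trials=10_000, seed=42):
    """Monte Carlo sketch: report a project's duration as a distribution.

    Each task's duration (in weeks) is drawn from a triangular
    distribution over (optimistic, most likely, pessimistic) estimates
    instead of being stated as a single confident number.
    """
    rng = random.Random(seed)
    # Hypothetical task estimates: (optimistic, most likely, pessimistic)
    tasks = [(2, 4, 10), (3, 5, 12), (1, 2, 6)]
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        for _ in range(n_trials)
    )
    # Summarise the distribution as percentiles, not a single point.
    return {
        "p10": totals[int(0.10 * n_trials)],
        "p50": totals[int(0.50 * n_trials)],
        "p90": totals[int(0.90 * n_trials)],
    }
```

Reporting the 10th, 50th, and 90th percentiles keeps the uncertainty visible instead of collapsing it into a single target that invites false confidence.
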

You’re recommending that companies encourage open discussion of project-related risks as they arise. What else can managers do to reduce denial and risk hiding?

Another tactic is to use management methods that are hard to fudge. A typical risk register is highly subjective, hard to review critically, and can easily be distorted. So, to provide a more reliable guide to risk, it is helpful to identify, evaluate, and track risk factors. By that I mean things about the project or team that are facts now, or anticipated with confidence, such as past project track record in the organisation, the degree of organisational change required, the amount of new problem solving needed in the computer systems, and so on. The more objectively this is done, in terms of facts rather than subjective ratings, the harder it is to fudge the evaluation or its conclusions.
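
[Ed. note: One way to read this risk-factor idea is as a checklist scored from verifiable facts rather than gut-feel ratings. The sketch below is a hypothetical illustration; the factor names, thresholds, and weights are invented, not taken from any published method.]

```python
def risk_factor_score(facts):
    """Score objective risk factors against fixed thresholds.

    'facts' holds verifiable quantities about the project, not
    opinions, so the evaluation is hard to fudge. Factor names and
    thresholds are illustrative assumptions only.
    """
    score = 0
    reasons = []
    if facts["similar_projects_delivered"] < 2:
        score += 2  # little organisational track record with this kind of work
        reasons.append("little organisational track record")
    if facts["teams_affected_by_change"] > 5:
        score += 2  # wide organisational change raises coordination risk
        reasons.append("large organisational change")
    if facts["novel_components"] > 3:
        score += 3  # lots of new problem solving in the computer systems
        reasons.append("substantial new problem solving")
    return score, reasons
```

For example, a project with one comparable predecessor, eight affected teams, and two novel components scores 4, with the two triggering facts listed as reasons; because the inputs are facts, a reviewer can check them directly.
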

Why not just set strict goals and demand that the project team adhere to them?

One of the most alarming implications of uncertainty suppression is that trying to control projects using a target cost and completion date - a very common method - may do more harm than good. One problem is that the targets focus attention on one possible future, at the expense of others both better and worse. However, I think there is more to it than this. Just recently I have done some research revealing that people who strongly believe target setting is a good thing are also more likely to approve of not disclosing risks and uncertainties. I don’t know why this is, but the implications are disturbing.

Matthew Leitch is a UK-based risk management consultant and researcher. For 7 years until 2002 he worked for PricewaterhouseCoopers as a specialist in internal controls and risk management. He is a qualified chartered accountant and holds a BSc in psychology from University College London. His website is: http://www.internalcontrolsdesign.co.uk.
