Just at a time when we are hearing more and more about self-managing software systems, automated lifecycle management, autonomic self-healing computing and pre-packaged tools and components, I spoke today with a company that asserts home-grown software is a major cause of IT downtime.
OK, so we know that reducing downtime is a perpetual challenge for IT departments, despite the advent of technology that helps companies monitor IT services and circumvent hardware issues. According to Managed Objects, part of the problem is inaccurate software configuration data, which can condemn software updates to failure. Application-dependency software can now map out standard, 'off-the-shelf' software, enabling an accurate packaged-software inventory that reduces the risk of change. But, so I'm told, home-grown software is the hardest to map and the most damaging to the business when it goes wrong.
If you believe the stats (from research conducted by Vanson Bourne across the retail, commercial and banking sectors), home-grown software accounts for up to 90% of a company's application mix. While off-the-shelf software products can be mapped out, they are rarely the systems that provide a company's competitive advantage. Meanwhile, home-grown software is both the most rapidly evolving and the most essential. Perhaps this is why IT departments often assign huge manpower to managing software configuration changes.
So what’s the answer? More configuration management software, more change management software, more business service management tools, or perhaps a refreshed approach built on more nimble systems and rapid application development? If you are a vendor, I suspect the answer is all of the above.