
Software debugging costs rise; SOA blamed

How many moths would it take to bring down a SOA?
Written by Joe McKendrick, Contributing Writer

Ostensibly, the first computer "bug" was a moth that got trapped in a relay inside Harvard's Mark II computer in 1947. It took technicians five and a half hours to track down the source of the problem. Assuming a university computer technician was making $1.50 an hour at the time, that bug cost roughly $8.25 to fix.
Things are a little more costly these days, of course, and IDC appears to be saying that SOA is really adding to the cost of fixing software bugs.

According to a "forthcoming" report from IDC, cited in SC Magazine, a typical organization now spends between $5 million and $22 million a year fixing software defects. IDC bases its estimate on a recent survey of 139 organizations.

SOA gets a share of the blame for the escalating costs. The report cites "increased software complexity from multicore, Web 2.0 and SOA" that not only make bugs more prevalent, but also more complicated to fix. As IDC put it: "The increased complexity of software development environments and the cost of fixing defects in the field (rather than early in the software cycle) combine in exorbitant ways to drain income and to hamstring businesses as a result of critical software downtime."

Hmmm. The purpose of SOA -- and Web 2.0 for that matter -- is to simplify things. Theoretically, problems get contained at the interfaces, rather than forcing developers into the guts of enterprise systems to rewrite or upgrade code. A bug should only need to be fixed once, by the entity managing the target system, rather than 100 times across 100 different instances in the enterprise. At least in theory.
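
To make that contrast concrete, here is a minimal sketch (hypothetical names and numbers, not anything drawn from the IDC report) of duplicated logic versus a single shared service:

    # Pre-SOA: each application keeps its own copy of the pricing logic.
    # A defect in the rule has to be found and patched in every copy.
    def order_total_app_a(price, quantity):
        return price * quantity * 1.05  # app A's copy of the rule

    def order_total_app_b(price, quantity):
        return price * quantity * 1.05  # app B's copy, and so on across the enterprise

    # SOA-style: every consumer calls one service interface, so the rule
    # lives in a single place and a fix is made exactly once.
    def pricing_service(price, quantity):
        return price * quantity * 1.05

    def order_total_any_app(price, quantity):
        return pricing_service(price, quantity)

Whether reality matches that tidy picture is, of course, exactly what the IDC numbers call into question.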

Of course, with multiple islands of SOA and Web services, additional complexity does arise. Just keep the moths away.