The general public doesn't think of the Department of Defense as a software developer, but to an extraordinary extent that's exactly what it is. Consider: Eighty percent of the F/A-22's development is software. DOD spends $12 billion a year, 40 percent of its budget, on software development, testing and evaluation.
Defense may be a software-intensive business, but does that mean DOD has to spend so much money on software? No way, say two GAO auditors who compared DOD with large corporations. According to Government Computer News, the auditors found that Motorola and other large companies spent only a few percent on reworking software.
This huge difference, according to Carol Mebane and Cheryl Andrew of the Government Accountability Office's weapons acquisition audits practice, comes down to a structured, repeatable approach to software development that emphasizes upfront requirements planning. Three years ago, Mebane and Andrew spent months studying how commercial best practices could be applied to DOD projects to control both cost growth and schedule slippage.
They spoke to an audience of software and systems engineers at the Software and Systems Technology Conference this week, revisiting the conclusions of their 2004 report, “Stronger Management Practices Are Needed to Improve DOD’s Software-intensive Weapon Acquisitions.”
Three factors make the difference, Andrew said: “A manageable environment, disciplined processes, and metrics, metrics, metrics.”
“In DOD, a project can be two years, three years, even four years long. It makes it hard for a program manager to get his arms around a project, [or to] get a handle on costs,” Andrew said.
The auditors emphasized the importance of setting requirements: the corporations they studied budgeted as much as 30 percent of total project hours to requirements. They also found that high-level management at those companies frequently reviewed software under development.
At DOD, on the other hand, major management reviews of software projects usually happened only once a year, or even two years apart. “We were shocked at that,” Andrew said. But when GAO “recommended that program offices should get involved more often instead of waiting for major reviews, there was resistance. ... The program offices didn’t have access to [software development status information], and didn’t look for it.”