
Case study: Western Power quality tool

Despite having a quality management product on the books at Western Power, almost no one was using it, causing the energy company problems with software development quality.
Written by Suzanne Tindal, Contributor

Correction: This story incorrectly stated that thousands of test cases had not been carried out in the final stages of development. It was updated on 9 January 2009, following statements from Western Power that provided further information about the group's testing journey.

Western Power senior test analyst Paula Smith knew overhauling software development quality in the electricity utility would be a tough job when she got involved with the task in December last year.

But she didn't expect to find that while Western Power had a quality management product in-house, it was being used on only a fraction of its projects.

Smith, a former test manager with Fujitsu, had been brought in by Western Power following an assessment that the group had conducted of its test approach, as part of its 'Chrysalis' process improvement program.

Following the review's findings that its test tool, Mercury Quality Centre, was under-utilised, Western Power CIO Leigh Sprlyan endorsed a business case to create a testing centre of excellence.

When Smith looked into the company's workflows, she told ZDNet.com.au, she found testing was being performed at the developer level, but it was not consistent or repeatable, due to the lack of a centralised centre of excellence. She said developers were "throwing apps over the fence" with faults still in them that static analysis and design should have picked up.

When she looked closer, she said, she found many test cases still not done in the final three weeks of a product's testing.

Smith knew testing had to be going on somewhere, or the whole company would have fallen in a heap, but she didn't know where, and it couldn't be repeated. Western Power later stated that many test cases were completed manually, but not using the Mercury Quality Centre tool.

Apart from the dearth of visible testing, the decision on whether a product was ready for production was being made arbitrarily by developers. "They were making decisions on the basis that it was getting late and someone was going to blame them," Smith said.

While Western Power had embraced concepts like user acceptance testing and was using change and release management techniques and the Information Technology Infrastructure Library (ITIL) set of processes, it needed a more effective system.

Western Power already had a quality tool to help with such issues, Mercury Quality Centre, which it bought to help it with its 2006 split into four separate companies: Horizon Power, the regional arm; Synergy, the retail business; Verve Energy, power generation; and Western Power, the network.

HP bought Mercury Interactive in mid-2006 and dropped the Mercury name from all of the acquired products in December of the same year.

However, use of the product hadn't extended past that project, Smith said, going instead into a "sleep phase". Of the 150 projects which Western Power had on the go, such as developing a new metering system or taking applications off the firm's mainframe, only 10 per cent made any use of the quality management tool.

"All of them had seen the product and they all knew it. They saw it as an IT tool," Smith said.

That wasn't the fault of the software, according to Smith. "The software itself has been able to drive the change ... the focus and culture of the organisation took away the ability of the centre to have an impact."

Western Power later stated that the software needed a champion to drive adoption and positive change, with the testing expertise to mentor staff.

Smith looked at the software and decided to carry out an AU$56,000 upgrade to Quality Centre 9.3, which she classed as more of an integrated management tool than just a testing tool. She instigated training for its users: business divisions, developers, project managers and testers.

The upgrade opened up integration possibilities with applications such as HP's service desk suite and Western Power's Oracle 10g database.

Once trained, employees could enter inputs such as requirements, a library of test cases, and faults, to track how ready a product was. The faults were given a priority based on business risk, which helped decide whether to fix them before the product went into production.
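As a rough illustration of that kind of risk-based triage, a prioritisation rule might look something like the sketch below. The field names, scoring and threshold are assumptions made for the example only; they are not Quality Centre's actual data model.

```python
# Minimal, hypothetical sketch of risk-based fault prioritisation.
# Field names, scales and the release threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Fault:
    summary: str
    business_impact: int   # 1 (cosmetic) .. 5 (revenue/safety critical)
    likelihood: int        # 1 (rarely hit) .. 5 (hit on every use)

def risk_score(fault: Fault) -> int:
    """Simple risk = impact x likelihood, giving a score from 1 to 25."""
    return fault.business_impact * fault.likelihood

def must_fix_before_release(fault: Fault, threshold: int = 12) -> bool:
    """Faults at or above the agreed risk threshold block the release."""
    return risk_score(fault) >= threshold

faults = [
    Fault("Misaligned label on usage report", business_impact=1, likelihood=4),
    Fault("Meter reading rounded incorrectly", business_impact=5, likelihood=3),
]
for f in sorted(faults, key=risk_score, reverse=True):
    verdict = "fix before release" if must_fix_before_release(f) else "defer"
    print(f"{f.summary}: risk {risk_score(f)} -> {verdict}")
```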

The tool used its collected evidence to give Smith objective metrics for product readiness. "It tells me that I'm giving business what they want," she said.
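A readiness figure of that sort could, for instance, be derived from the tracked evidence roughly as sketched below; the formula and inputs are illustrative assumptions, not the tool's actual metrics.

```python
# Hypothetical readiness metric built from tracked test evidence.
# The formula and field names are assumptions for illustration only.
def readiness(executed: int, planned: int, passed: int, open_high_risk: int) -> float:
    """Combine test coverage and pass rate; any open high-risk fault zeroes the score."""
    if planned == 0 or executed == 0:
        return 0.0
    coverage = executed / planned      # share of planned test cases actually run
    pass_rate = passed / executed      # share of executed test cases that passed
    return 0.0 if open_high_risk > 0 else round(coverage * pass_rate, 2)

print(readiness(executed=180, planned=200, passed=171, open_high_risk=0))
print(readiness(executed=180, planned=200, passed=171, open_high_risk=2))
```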

It has also been getting positive feedback from employees. Since carrying out the upgrade and training, Smith has seen use of the software rise to 80 per cent of projects. She hasn't been pushing the take-up, she said; instead, projects have come to her and asked to use it.

The only projects which haven't used it are infrastructure projects and business intelligence projects. Smith said she hadn't figured out yet how to use the quality centre for this type of work.

She believed there was room for improvement in the requirements module of the software, however, since the company's business analysts were still writing requirements in Microsoft Word and then importing them into the tool. Smith said that if the interface were improved to include pre-filled templates, it would reduce the entry workload.
