In competitive environments where business
owners are keen to roll out new products as quickly as possible,
many executives feel frustrated with the time it takes
application performance testers to confirm that new applications
are ready for prime time.
At financial services giant Suncorp Metway, however, a new
approach to risk management has sped up the process by using a
risk management technique to help business owners and application
testers jointly assess the real risk posed by new applications.
Performance testing -- which uses testing tools and simulated
user loads to pound a new application that has been confirmed to
have all the required functionality -- is normally a major
exercise because systems are generally tested from one end to the
other no matter how big a change they represent.
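In outline, a simulated user load of this kind can be sketched minimally. The example below stands in a dummy request handler for the real application and uses none of Mercury's actual tooling; the concurrency levels and latency figures are invented for illustration:

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request() -> float:
    """Stand-in for the application under test; returns response time in seconds."""
    latency = random.uniform(0.001, 0.005)  # simulated processing time
    time.sleep(latency)
    return latency

def run_load_test(virtual_users: int, requests_per_user: int) -> dict:
    """Fire concurrent simulated requests and summarise the observed latencies."""
    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        futures = [pool.submit(handle_request)
                   for _ in range(virtual_users * requests_per_user)]
        latencies = sorted(f.result() for f in futures)
    p95 = latencies[int(len(latencies) * 0.95)]  # 95th-percentile latency
    return {"requests": len(latencies), "p95_seconds": round(p95, 4)}

result = run_load_test(virtual_users=10, requests_per_user=20)
print(result)
```

Commercial tools do far more (ramp profiles, think times, transaction correlation), but the shape is the same: many virtual users, one system under test, and percentile latencies as the pass/fail signal.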
This approach, however, doesn't necessarily reflect the
requirements of the business, argues Philip Pooch, performance
test specialist within the Infrastructure Services division of Suncorp Metway.
"Saying that you always need to test or that you never need to
test are both immature processes," he explains. "If testers can
get a feel for what they're planning and establish a risk
profile, a lot of issues can be sorted out. This is about the
right-sizing of performance testing."
During a recent upgrade to Suncorp Metway's customer
relationship management (CRM) system, for example, a move from a
proprietary system to a standalone J2EE-developed platform needed
to be validated by application testing using tools from
performance management company Mercury.
Recognising that fully testing every aspect of the application
would take a huge amount of time and effort, the testing team
worked carefully with the business owners of the CRM system to
prioritise components. Their mutual decision was to focus most of
the testing effort on the high-risk J2EE components; the other
components of the solution would be tested less, but monitored
along with potential network bottlenecks to ensure adequate
performance.
In another project - enhancements to a loan document tracking
system - the risk was evaluated and found to be low enough that
performance testing was not judged to be necessary. Instead, the
team decided to focus on ongoing application monitoring and staff
training to avoid problems.
1 - Ostrich (lowest):
Performance testing is not considered.
Only interested in functional testing.
2 - Chicken:
Performance testing is considered, but an ad-hoc or inconsistent approach is taken.
Limited manual concurrency testing tagged onto functional testing.
No real load.
Possibly some manual production monitoring: "hope and pray" testing.
3 - Crow:
'Indicative' performance testing done for some apps.
Possibly use an automated test tool.
Fitted in at the end of functional testing.
Tested in a functional test environment that often only remotely resembles production-like capacity.
Work from vague performance requirements.
Some processes, but inconsistently applied.
4 - Owl:
Performance testing done consistently and well.
Use a near production-like test environment.
Create and maintain reusable automated scripts under version control.
Establish clear performance requirements.
Use repeatable processes.
Test a narrow range of applications well, but are restricted by tool, environment and training costs.
5 - Eagle (highest):
Application performance testing seen as a risk reduction exercise.
View application performance as a lifecycle that commences with application design and runs in parallel with a project or release.
Establish an Application Performance Risk Profile.
Provide performance testing solutions based on risk.
Get involved, even when they can't test.
Are trusted advisors.
Since it began running risk assessments 18 months ago, Suncorp
Metway's testing team has conducted more than 20 such
assessments, giving business leaders a far clearer sense of
exactly how complex and burdensome performance
testing can be. By working collaboratively, business owners can
set their expectations more clearly and ensure that higher-risk
elements get a higher priority from testers who are often
working to tight deadlines and budgets.
"The driver for this change is that performance testing
environments are really hard to set up," says Pooch. "We're all
IT people and IT people tend to go straight into solution mode,
but when it comes to testing that's not actually the best place
to start. We often hear talk about business-IT alignment, but
this is really business-testing alignment."
To support his work with the business divisions, Pooch has
established a formal methodology - which he has called PPTMM
(Phil's Performance Test Maturity Model) - that ranks the
maturity of application performance testing on a five-bird scale.
Ranked in terms of increasing sophistication, the birds -
ostrich, chicken, crow, owl and eagle - reflect the maturity of
the risk-based performance testing philosophy.
Pooch outlines a simple approach to doing a performance risk
assessment. Working together, business and testing leads work
through all supporting documentation such as business and
functional requirements, conceptual solutions and diagrams.
Working in a face-to-face meeting, all stakeholders question
the structure of an application, reusability of components, the
project's importance to the company, and other risk-related factors.
Using the bird system, risk scores are plotted on a matrix
that guides the decision-making process in a much more
consistent, repeatable and meaningful way than was possible in
the past. Scores are weighted, with each element of the solution
analysed tier by tier.
By using a "disciplined, repeatable process" to evaluate the
full aggregate risk within the environment, the teams work
together to determine a risk reduction strategy that can be
followed more smoothly than was possible in the past.
"Sometimes you can burst some bubbles [by pointing out the
real risk of a project] but by taking a risk approach, we've got
the business owners on board all the way," says Pooch.
"They used to say 'we need performance testing' and we'd say
'how much?' Now we're being more proactive than we used to be.
We're now seen as advisers rather than testers, so the business
will engage us really early and do a risk assessment before we
even give them an estimate [for performance testing]. The company
has very much pushed into the user experience, and is being
proactive about making sure it's good enough."