Chris Benedetto, VP of marketing at Solstice Software, provided some thoughtful insights in response to my recent post on the SOA implications of HP's pending $4.5-billion acquisition of Mercury Interactive.
Benedetto observes that the HP-Mercury acquisition story "has mostly spun around SOA management, registries, and governance, tied, of course, to HP’s core competencies in centralized IT operations and service management." However, he continues, "testing was only briefly mentioned in the publicly released information by HP, although more than 60 percent of Mercury’s revenue comes from application testing tools. Moreover, this acquisition doesn’t truly provide end-to-end testing of SOAs."
While traditional testing looks at application performance, SOA testing needs to examine the whole business process, and whether the pieces of that process interact properly. End-to-end SOA testing, Benedetto says, involves "testing an entire business process path to assure that the integration has resulted in the intended execution of transactions, interactions, and data transformations," as well as "testing across multiple platforms, transport protocols, ESBs, language interfaces, and messages," and "validating the linkages and integrations between business services and operational systems to meet target defect rates and SLAs."
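To make the distinction concrete, here is a minimal sketch of what an end-to-end process-path test might look like. The three service functions are hypothetical stand-ins for real calls across an ESB (order entry, an integration-layer transformation, and fulfillment); a real harness would invoke live endpoints, but the shape of the assertions is the point.

```python
# Hypothetical stand-ins for real service calls along one business process path.
def submit_order(order):
    # e.g. a SOAP/HTTP call to an order-entry service
    return {**order, "status": "accepted"}

def transform_for_fulfillment(accepted):
    # e.g. a data transformation performed by the integration layer
    return {"sku": accepted["sku"], "qty": accepted["qty"]}

def fulfill(request):
    # e.g. the warehouse service at the end of the process path
    return {"shipped": request["qty"]}

def test_order_to_fulfillment_path():
    order = {"sku": "A-100", "qty": 2}
    accepted = submit_order(order)
    assert accepted["status"] == "accepted"        # transaction executed
    request = transform_for_fulfillment(accepted)  # transformation correct
    result = fulfill(request)
    assert result["shipped"] == order["qty"]       # intended end-to-end outcome

test_order_to_fulfillment_path()
```

Note that the test asserts on the outcome of the whole path, not on any single application in isolation, which is exactly what distinguishes this from traditional application testing.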
There’s more to testing than reviewing and debugging code and interfaces for particular applications. Web services are not designed as islands; they exist to interact and eventually form the fine-grained building blocks of larger service-oriented architectures. The interaction and process flow need to be tested as well.
A while back, I spoke with Narendra Patil of Optimyz about this issue; he observed that the task of testing grows more complex as Web services infrastructures grow. “If you have a large number of Web services end points, which you are developing with a reasonable level of complexity, then testing becomes an extremely manual process,” he explains. “For example, if you have 10 WSDLs, and each has 10 Web services operations, then you're talking 100 combinations. And if each operation just has 10 test data points, then you're talking 1,000 combinations. With the IDE and open-source tools, sending 1,000 requests, and getting responses and manually verifying them is a one-by-one, very sequential process.”
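Patil's arithmetic is a straightforward Cartesian product, and it suggests why tooling matters: the combinations can be enumerated and dispatched mechanically rather than one by one. A rough sketch, using hypothetical service and operation names and a stubbed `invoke` in place of a real SOAP client:

```python
import itertools

# Inventory sizes from Patil's example: 10 WSDLs, 10 operations each,
# 10 test data points per operation. Names here are illustrative only.
wsdls = [f"service_{i}.wsdl" for i in range(10)]
operations = [f"op_{j}" for j in range(10)]
data_points = [f"case_{k}" for k in range(10)]

def invoke(wsdl, operation, payload):
    """Stand-in for a real Web services call (e.g. via a generated stub).
    Here it just echoes its inputs so the harness is runnable."""
    return {"wsdl": wsdl, "operation": operation, "payload": payload}

# Enumerate every combination instead of sending requests manually.
results = [invoke(w, o, d)
           for w, o, d in itertools.product(wsdls, operations, data_points)]

print(len(results))  # 10 x 10 x 10 = 1,000 request/response pairs to verify
```

In a real harness the verification step would compare each response against expected values, but even this skeleton shows that the sequential, manual process Patil describes is what automated SOA testing tools are meant to replace.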
As SOA begins to pick up the heavy lifting of mission-critical applications and interactions, the heat is on to test for conformance and orchestration. Developers need better ways to effectively test the way these Web services work together. An SOA is a conglomeration of fine-grained components, and smooth functioning of such an application relies on dependencies and interactions between these components.
Benedetto observes that "without testing at the middleware, ESB, protocol, and message level, how can HP be sure that a SOA and supporting composite applications work as designed?"
"The question remains: will HP provide solutions for testing and validation of everything that is happening underneath the SOA or 'behind the screens'? It will be interesting to see if HP takes the next step in service management and delivery by combining network monitoring, registries, governance, and application testing tools, with true SOA and integration testing."