HP burnishes vision on how products support both applications and data center lifecycles

HP opened the second day of its Software Universe event in Las Vegas with "product day," but the presentations seemed more about process -- the processes that usher application definitions and development into real world use.

I've heard of applications lifecycle, sure, but the last few days I've heard more about data center lifecycle. So how do they come together? HP's vision is about finally allowing the operations and development stages of a full application lifecycle to more than co-exist -- to actually reinforce and refine each other.

Ben Horowitz, vice president and general manager of HP's BTO software unit, pointed out on stage at the Sands Expo Center that HP is number one in the global market for applications testing and requirements management for software development. And, of course, HP is strong in operations and systems management.

The desired synergies between these strengths need to begin very early, he said, in the requirements gathering and triage phases. Horowitz, the former CEO of HP's 2007 Opsware acquisition, also explained the fuller roles that business, security, operations and QA people will play in the design time into runtime progression. [Disclosure: HP is a sponsor of BriefingsDirect podcasts.]

I guess we need to call this the lifecycle of IT, because HP is increasingly allowing applications requirements and efficient, automated data center requirements to actually relate to each other. You can't build the best data center without knowing what the applications need and how they will behave. And you can't create the best applications that will perform and be adaptable over time without knowing a lot about the data centers that will support them. Yet building them in isolation is just what IT has been doing.

[UPDATE: Tony Baer of Ovum and OnStrategies has some good thoughts on these issues.]

Next on stage, Jonathan Rende, vice president of products for SOA, application security and quality management at HP, painted the picture of how HP's products and acquisitions over the past few years come together to support the IT lifecycle.

Application owners, project managers, business analysts, QA team, performance team, and security teams -- all need to have input into applications requirements, design, test and deployment, said Rende. The HP products have been integrated and aligned to allow these teams to, in effect, do multi-level and simultaneous change management.

Remember the 3D chess on the original Star Trek? That's what such multi-dimensional requirements input and visibility reminds me of. Social networking tools like wikis and microblogging also come to mind.

Rende then described how change management and process standardization in the requirements, design, develop, test and refinement processes -- in waterfall or agile methods settings -- broadens applications lifecycle management into the business and operations domains.

By allowing lots of changes to come from many parties and interests, the IT lifecycle begins in the requirements phase but extends into ongoing refinements for concerns such as security and performance testing. The business people can also come in and request (and get!) changes and refinements later, and perhaps (someday) right on through the IT lifecycle.

I really like this vision. It takes what we used to think of as simultaneous design, code and test -- while building advanced test beds -- and extends the concurrency benefits broadly to include more teams, more impacts, more governance and risk reduction. Without the automation from the products, managing all these inputs early and often would break down under its own complexity.

HP's products and processes are allowing more business inputs from more business interests into more of the IT lifecycle. The operations folks also get to take a look and offer input on best approaches on how the applications/services will behave in runtime, and throughout the real IT lifecycle.

Because there are also portfolio management benefits applied early in the process, the decision on when to launch an application boils down to a "contract" between the parties affected by the application in production, said Rende. This establishes an acceptance of risk and responsibility, and pushes teams to view development and deployment as integrated, not sequential.

Horowitz further explained how HP's announcements this week around advanced change management and a tighter alignment with such virtualization environments as VMware will allow better and deeper feedback, refinement and efficiency across the IT lifecycle.

This "IT lifecycle" story is not yet complete, but it's come a long way quite quickly. HP is definitely raising the bar and defining the right vision for how IT in toto needs to mature and advance, allowing enterprises to do more, better, for less.