
Cisco's Giancarlo and IBM's Mills: Actually, a certain amount of complexity can be 'worth it'

Written by David Berlind

Gartner Symposium/ITxpo - San Francisco, CA - The main attraction on Day 2 of Gartner's semi-annual confab was what Gartner calls a Mastermind Panel. These sessions are where one or more of Gartner's rock star researchers go on the big stage to put vendor executives on the hot seat in front of all of the conference attendees. Today's Mastermind Panel pitted Gartner distinguished analysts Donna Scott and John Pescatore against IBM's software chief Steve Mills [below, at right] and Cisco's CTO Charlie Giancarlo. In line with the event's overall theme (see Gartner dusts off old themes in Symposium kick-off) as well as the titles of the two interviewees, the panel was entitled "Conquering Complexity in Software and Networking."

[Photo: IBM's Steve Mills and Cisco's Charlie Giancarlo]
While both men agreed that the existing complexities in information technology, and the headaches that result, represent a huge business opportunity for the vendors that can help overcome them, what Giancarlo said at the conclusion of the panel was perhaps not in line with the theme. In his final word, Giancarlo advised attendees that the goal isn't necessarily to drive complexity out of IT, but rather to drive it out of the pockets of IT that are insignificant to the organization's competitive advantage. According to Giancarlo, if the return on investment in something complex is significant competitive advantage, then that complexity could very well be worth it. The complexity Giancarlo was referring to -- and that was largely the main subject of the 45-minute panel discussion -- is the layers upon layers of technologies and components that turn IT professionals into magicians in the course of getting everything to work together reliably.

Mills actually did a good job of describing how we got to where we are today -- where things tend to be so complex. Although he didn't mention Google, the problem is that not everyone has the luxury of being a Google, where the entire operation was a green field within the last few years. As Mills described it, no company has the luxury of doing a wholesale swap of its entire infrastructure every ten years. Instead, for many enterprises, today's IT is an amalgam of 40 years' worth of investment dating back to the mainframes that were installed in the 1960s, which were augmented (but not replaced) by minicomputers in the 1970s (perhaps the first democratizing force in IT), with the addition of PCs in the 1980s, client/server computing in the 1990s, followed by the Internet and service-oriented architectures (SOAs) in the current decade. With each of the major technology iterations, it wasn't as simple as out with the old and in with the new. Rather, it was stay with the old and add some new, and after four decades you end up with a lot of complexity.

Both men seemed to agree that it's very difficult, if not impossible, to drive complexity out. Instead, about the best you can do is mask it. Much the same way the avionics in a cockpit can mask the complexity of the commercial airliner that today's pilots fly (a metaphor used by Mills), wrapping some mainframe application in a SOA layer masks the complexities underneath. Using the extraordinarily complex process of creating security keys and assigning them to Wi-Fi-based workstations as an example, Giancarlo talked about how it took Cisco's Linksys division two years (and a lot of work with other vendors) to hide a significant amount of complexity behind a single button (on Linksys' access points).
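To make the masking idea concrete, here is a minimal, hypothetical Java sketch of the kind of facade the panelists describe: one simple call exposed to the rest of the enterprise, with all of the legacy plumbing hidden behind it. Every class and method name here is invented for illustration; none of this is IBM's, Cisco's, or anyone else's actual API.

    // The simple result the rest of the enterprise sees.
    record OrderStatus(String orderId, String state) {}

    // Stand-ins for decades-old plumbing (terminal transactions,
    // fixed-width record parsing). In real life these are the complex parts.
    class MainframeGateway {
        String execute(String transaction, String arg) {
            // Pretend this drove a legacy green-screen transaction.
            return arg + "|SHIPPED";
        }
    }

    class LegacyRecordParser {
        OrderStatus toOrderStatus(String raw) {
            String[] fields = raw.split("\\|");
            return new OrderStatus(fields[0], fields[1]);
        }
    }

    // The facade: a single simple entry point that masks everything above,
    // much as a SOA layer masks a mainframe application.
    public class OrderLookupFacade {
        private final MainframeGateway gateway = new MainframeGateway();
        private final LegacyRecordParser parser = new LegacyRecordParser();

        public OrderStatus getOrderStatus(String orderId) {
            String raw = gateway.execute("ORDINQ", orderId); // legacy call
            return parser.toOrderStatus(raw);                // normalize result
        }

        public static void main(String[] args) {
            System.out.println(new OrderLookupFacade().getOrderStatus("A-1001"));
        }
    }

Note that the complexity hasn't gone anywhere; it has simply been moved behind an interface that callers never have to see, which is exactly the point both executives were making.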

One trick, then, advises Mills, is to figure out how to reuse as many of those masking layers or components as possible. Mills claims that IBM is reusing 50 million lines of code across its products. Echoing Mills' sentiments, Giancarlo said "the greatest contributor to simplicity or reduction of complexity is to make use of what's been done."

Another conclusion of the session: With each new technology that promises to abstract a certain amount of complexity comes the opportunity to build more complexity. In response to a question from Gartner's Scott as to whether the component proliferation encouraged by SOAs and Web services is the sort of thing that can cause rampant complexity, Mills again invoked the reuse sermon. Long term, if those responsible for IT look to reuse as much as possible across their entire enterprise, the total number of components in play may actually turn out to be smaller than if IT continues with more of an island-of-technology approach (with each island having its own litany of isolated, departmentally proprietary components).
Giancarlo offered IP and HTTP as other examples of where the world of IT eventually managed to simplify the fiesta of protocols that used to travel across local area networks by narrowing them down to a handful. But almost as soon as enterprises were able to boil things down to IP and HTTP, complexity started to pile up on top of them (Giancarlo identified XML as an example). That, of course, comes back full circle to the question of when a certain amount of complexity is worth it. If IP and XML lead to greater interoperability of systems -- let's say across a supply chain -- and the number of systems involved in those interactions and transactions is by itself a new and significant complexity, that highly complex interoperation could still mean the difference between winning and losing. In other words, it could be worth it.
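As a rough illustration of the layering Giancarlo describes, here is a hypothetical Java sketch that sends an invented XML purchase order over HTTP, which itself rides on TCP/IP. The endpoint URL and payload format are made up for illustration; real supply-chain integrations define their own schemas.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class SupplyChainPing {
        public static void main(String[] args) throws Exception {
            // An invented XML payload: new complexity layered on old simplicity.
            String xml = """
                <purchaseOrder>
                  <sku>12-3456</sku>
                  <quantity>40</quantity>
                </purchaseOrder>
                """;

            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://partner.example.com/orders")) // placeholder endpoint
                    .header("Content-Type", "application/xml")
                    .POST(HttpRequest.BodyPublishers.ofString(xml))
                    .build();

            // One line of application code; beneath it sit XML, HTTP, TLS, TCP, and IP.
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode());
        }
    }

From the application's point of view the whole exchange is a single call, which is the simplification; the stack of protocols underneath it is the complexity that may, as Giancarlo argues, be worth carrying.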
