A couple of days ago, I posed the question, "Can good design overcome the compulsion to simply throw new hardware and systems at a problem?" as it relates to the potential bloat of XML.
In this new post, Sachin Garg picks up on this theme with a thoughtful discussion of the options for XML compression. Garg agrees with the line of thinking that "problems which can and should be solved by better design should not get patched by using compression or binary formats." He adds that even though he is a "data compression enthusiast," other options should be considered for SOA:
"XML compression can be really great to 'patch up' the inefficiencies faced by SOA applications in handling large XML payloads, and fortunately there are solutions available. But nothing beats good design which can eliminate the problem at source."
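The tradeoff Garg describes is easy to see with a general-purpose compressor: XML's verbose, repetitive tags compress very well, but the CPU cost of compressing and decompressing every message remains. A minimal Python sketch (the payload below is invented for illustration):

```python
import gzip

# A hypothetical order payload: repetitive element names are
# typical of SOA message traffic and highly compressible.
payload = (
    "<order>"
    + "".join(
        f"<item><sku>SKU-{i:05d}</sku><qty>1</qty><price>9.99</price></item>"
        for i in range(100)
    )
    + "</order>"
)
raw = payload.encode("utf-8")
compressed = gzip.compress(raw)

# Compression "patches" the bandwidth cost of XML bloat,
# but every message now pays compress/decompress CPU time --
# it does not eliminate the verbosity at the source.
print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```

On payloads like this, gzip typically cuts the size by a large factor, which is why compression is an attractive patch even when, as Garg argues, better design would remove the bloat entirely.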
In response to Garg's post, a reader surfaced this online paper, authored by Software AG's Michael Champion, that discusses the XML bloat problem and proposes remedies, including code optimization, simplification, standardized alternative Infoset serialization(s), hardware acceleration, and hybrid solutions that maintain the human-readability of XML messages.
Champion keeps things in perspective: "Is all this handwringing about the pain of XML just 'premature optimization?' Looking at the XML application space as a whole, one would have to answer 'yes' -- in the vast majority of cases where XML is applied, parsing overhead and network bandwidth is not a critical bottleneck."
Earlier this year, ZDNet blogging colleague Phil Wainewright also took a hard look at various XML acceleration strategies, including XML appliances, binary XML, caching and event stream processing. How will they fare in real-world SOA? What works? The answer is a big "it depends."