Stream Computing - not just a load of babble

Written by Adrian Bridgwater, Contributor

It was 8am on a bank holiday Monday morning when I was watching the IT news feeds and started seeing mentions of Stream Computing coming to the fore. Being a bank holiday, I thought: yes, very good – cloud computing, waterfall methods, scrum practices (they all sound like creative labels), so let's present "stream" as the next paradigm, make sure we milk it and talk about the Motorway Computing multi-lane collaborative approach as soon as possible, right?

But of course it's no joke or silly label. In fact, ZDNet.co.uk reported on this topic as recently as this time last month and it is very real. The technology proposition here is that spiralling amounts of unstructured data such as emails, blogs, video, voice and other web-generated content require a huge amount of processing. Breaking off chunks of this data to process in batches (in the traditional way) produces a sort of staccato set of results that do not lend themselves well to real-time processing scenarios such as financial market analysis, healthcare systems or weather monitoring.
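To make that batch-versus-stream distinction concrete, here is a minimal Python sketch of my own – purely illustrative, nothing to do with the internals of IBM's System S – showing a running average computed the "staccato" batch way (one late answer per chunk) and the streaming way (a fresh answer as each value arrives). The data, chunk size and function names are all made up for the example.

```python
# Toy illustration only: a running average computed two ways.
# Batch mode waits for a full chunk before producing a result;
# the streaming version updates its answer as each event arrives.

def batch_average(readings, chunk_size=1000):
    """Process data in fixed chunks -- results arrive in bursts."""
    for i in range(0, len(readings), chunk_size):
        chunk = readings[i:i + chunk_size]
        yield sum(chunk) / len(chunk)          # one result per chunk

def streaming_average(readings):
    """Update the result continuously as each value streams in."""
    total, count = 0.0, 0
    for value in readings:                     # e.g. market ticks, sensor readings
        total += value
        count += 1
        yield total / count                    # a fresh result on every event

if __name__ == "__main__":
    prices = [101.2, 101.5, 100.9, 102.3, 101.8]
    print(list(batch_average(prices, chunk_size=5)))   # one answer, after the fact
    print(list(streaming_average(prices)))             # five answers, as they happen
```

The point of the sketch is simply that the streaming version never sits on its hands waiting for a batch boundary, which is what makes the approach attractive for markets, healthcare and weather monitoring.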

This development is actually IBM's baby and has, according to the New York Times, been over half a decade in the making. Big Blue's product based on this approach is called System S and it will no doubt go down a storm with particle physicists everywhere. With the company's Rational Software Development Conference just a few days away (which I will sadly miss this year due to travel restrictions), you might have thought they would have saved this news up for June – but it ties in nicely with IBM's Smarter Planet campaign, so I guess they couldn't wait.

Streaming in processing terms is not a new concept, and even stream computing itself is a term that IBM has been using since June 2007. Added to that, ATI Technologies (before it became the AMD Graphics Product Group) had also been working on similar projects in the high-performance, low-latency CPU space for some time. That said, this may be the point at which a previously somewhat familiar term gets a little more commercially branded and driven by the corporate IBM publicity machine. There's an IBM white paper here in PDF form if you need some bedtime reading on this topic. Beware the cheesy corporate spin, though: the first subhead in this paper reads, "Harvesting pearls from the torrents" – you get my drift, right? No pun intended.

Anyway, back to my Motorway Computing concept. This model sees software application development teams take a three- or four-lane approach to module construction. While the core foundational structure is kept in the leftmost "slow" lane, certain agile performance streams may "overtake" and accelerate in faster-moving lanes. (Reverse to a right-lane format for the USA and our continental European cousins.) Bug tracking is sidelined to the "hard shoulder" breakdown zone, and periodic rest stops, or "service areas", are intermittently placed for version control management. Project goal destination tracking is constantly monitored using a navigation box, which "speaks" instructions to team members in their podules throughout release iterations. Finally, nightly builds are completed with the team safely offline in the "motel zone", where they can bond, recuperate and refuel for another day's coding. It's sure to work – just avoid the roadworks delays; that would be the customers, of course.
