
Software AG: Apama is crunching complex events outside finance

Software AG's Apama real-time analytics platform may have its roots in capital markets and trading but it's starting to establish new roles outside its familiar fiefdoms.
Written by Toby Wolpe, Contributor

The Apama complex-event processing platform bought by Software AG last year is starting to find unexpected new uses, now that it has been integrated with the enterprise software firm's other technologies.

One of the aims of the acquisition from Progress Software last June was to combine Apama's ability to correlate and analyse data streams in real time with Software AG's Nirvana low-latency messaging and Terracotta in-memory technology for sectors that need sub-second response times.

But interest in the technology and new prototype applications have not been confined to Apama's traditional heartland of financial trading, risk and market surveillance, according to Software AG CTO Matt Smith.

"Here we are more than six months later and we're starting to see the applications coming through," he said.

"I thought they would be just another way of the banks doing something, another way of the retailers doing something. But there's also been a lot of health and public sector interest."

Smith said organisations in sectors such as high-performance manufacturing, telecoms and health are interested in the Apama technology's potential to pick out trends from data drawn from large numbers of devices.

One Software AG project currently in prototype phase is the use of the Apama high-frequency trading engine — something normally employed in algorithmic trading in capital markets — by a UK health trust to process data from the smartphones of type 1 diabetes sufferers.

Individuals can use a low-cost kit consisting of a blood-testing strip and miniature blood analyser that plugs into an iPhone or Android smartphone loaded with software to analyse the data.

"So rather than doing it the old-fashioned way, I'm uploading the blood data to my phone and there's an app on the phone. What currently happens — without what we're doing at the moment — is the app will monitor things and with the more advanced ones you can upload the data to a website," Smith said.

"What we've done is we've taken the trading engine — the algorithmic engine — and we've taken feeds of this data and instead of having it on the phone, we're putting it into an event correlation engine and we're now cross-correlating your blood glucose level with other telemetry."

That telemetry can, for example, be the movement of the phone, showing how active an individual is.

However, the key feature of the project is the software's ability to compare one individual's measurements against those from large numbers of other diabetes sufferers, looking for patterns and trying to forestall trouble.
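To make the idea concrete, the pattern Smith describes, joining a stream of glucose readings with a stream of phone telemetry and reacting when the two combine in a worrying way, might look something like the Python sketch below. Apama developers would write this in the platform's own event processing language; the event types, window size and thresholds here are illustrative assumptions, not details from the health trust project.

# Illustrative only: a minimal event correlator joining two per-patient
# streams over a sliding window. The event names and the 1.5 activity
# threshold are assumptions made for the sake of the example.
from collections import deque
from dataclasses import dataclass

@dataclass
class GlucoseReading:
    patient_id: str
    mmol_per_l: float   # blood glucose level
    timestamp: float

@dataclass
class ActivitySample:
    patient_id: str
    movement: float     # e.g. accelerometer magnitude from the phone
    timestamp: float

class Correlator:
    """Cross-correlates glucose readings with activity telemetry."""

    def __init__(self, window_seconds: float = 3600.0):
        self.window = window_seconds
        self.glucose = {}   # patient_id -> deque of GlucoseReading
        self.activity = {}  # patient_id -> deque of ActivitySample

    def on_activity(self, e: ActivitySample) -> None:
        q = self.activity.setdefault(e.patient_id, deque())
        q.append(e)
        self._evict(q, e.timestamp)

    def on_glucose(self, e: GlucoseReading) -> None:
        q = self.glucose.setdefault(e.patient_id, deque())
        q.append(e)
        self._evict(q, e.timestamp)
        self._check(e.patient_id)

    def _evict(self, q: deque, now: float) -> None:
        # Drop events older than the sliding window.
        while q and now - q[0].timestamp > self.window:
            q.popleft()

    def _check(self, patient_id: str) -> None:
        # Fire when glucose is trending down while the patient is active.
        g = self.glucose.get(patient_id, deque())
        a = self.activity.get(patient_id, deque())
        if len(g) >= 2 and a:
            falling = g[-1].mmol_per_l < g[0].mmol_per_l
            active = sum(s.movement for s in a) / len(a) > 1.5
            if falling and active:
                print(f"alert: {patient_id} glucose falling while active")

c = Correlator()
c.on_activity(ActivitySample("p1", movement=2.0, timestamp=100.0))
c.on_glucose(GlucoseReading("p1", mmol_per_l=5.5, timestamp=110.0))
c.on_glucose(GlucoseReading("p1", mmol_per_l=4.6, timestamp=500.0))  # prints an alert

In a real deployment the alert would go back out through a messaging layer rather than being printed, and the trigger would come from the pooled patient data described below rather than a fixed threshold.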

"Other patients with diabetes - what did they experience, when did they have a problem? We can pool their information and work out a trend that shows when you might have a problem," Smith said.

"Then we'll push to the phone, 'Could you just check your glucose level, please' when you might not have been planning to do that. 'We think you should, because trending-wise, algorithmically, you should check now'.

Under normal circumstances that individual might have carried out a test two hours later, but the pooled diabetes data suggests it should be conducted now because of, say, increased physical activity or changes in the weather.
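In the same illustrative spirit, the crowdsourced trend check Smith outlines could be as simple as comparing a patient's recent readings with pooled trajectories recorded before other patients ran into trouble, and prompting an early test when they match. The distance measure, threshold and sample figures below are assumptions, not data from the project.

# Illustrative only: prompt an early glucose test when a patient's
# recent readings resemble trajectories that preceded problems in
# other, pooled patients.
def resembles_pre_incident_trend(recent: list[float],
                                 pooled: list[list[float]],
                                 threshold: float = 0.5) -> bool:
    def distance(a: list[float], b: list[float]) -> float:
        n = min(len(a), len(b))
        # Mean absolute difference over the overlapping tail.
        return sum(abs(x - y) for x, y in zip(a[-n:], b[-n:])) / n

    return any(distance(recent, t) < threshold for t in pooled)

# Trajectories (mmol/L) that other patients showed before an incident.
pooled = [[6.1, 5.4, 4.8, 4.2], [5.9, 5.2, 4.6, 4.1]]
if resembles_pre_incident_trend([6.0, 5.3, 4.7, 4.3], pooled):
    print("Could you just check your glucose level, please?")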

"The trend of data is different because of the environment you're in — and we're crowdsourcing the trends," Smith said.

"You might have to call a paramedic, or somebody around you, so you're going to use a health service to try and help you reduce the risk to your life, when straightforward movement of data into a complex event processing engine means you can be proactive about it."

Smith said it is fun to take something that's cheap — the diabetes testing kit used in the project costs about £12 ($19) — and combine it with one of the most advanced algorithmic processing engines in the world to produce entirely new applications.

Those low-cost components contrast with projects in expensive high-performance manufacturing, where firms have been using sensors to generate crowdsourced data in the fabrication and maintenance of aircraft engines.

"They're using event processing as well. Two engines manufactured on the same production line: one goes in one aeroplane one goes in another. Those engines won't get the same experience. So how do you maintain them?" Smith said.

"What really smart companies are doing now is using the complex event stuff to take real-time telemetry off those engines so the moment a plane lands you know exactly what has happened for the past 10 hours and I know what to look for in terms of anomalies so I can proactively maintain those engines.

"They're thinking about it in terms of crowdsourcing, not just for that engine. If I've got thousands of those engines, I want to get that information for all the engines. I want to look for trends. So you're now doing broad maintenance rather than individual maintenance on one unit."

That approach can improve safety and customer satisfaction, and reduce costs, since engines are maintained according to actual measurements and trend analysis rather than a blanket schedule applied to every engine.
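A rough sketch of that fleet-wide approach, again in Python and with invented telemetry: summarise each engine's post-flight data, compare it with the rest of the fleet, and flag outliers for inspection rather than waiting for a fixed service interval. Exhaust-gas temperature and the two-sigma cutoff are stand-ins chosen for illustration.

# Illustrative only: flag an engine whose post-flight telemetry summary
# deviates from the rest of the fleet, instead of servicing every
# engine on a fixed schedule. EGT values and the cutoff are invented.
import statistics

def flag_for_inspection(fleet_egt: dict[str, float],
                        engine_id: str,
                        sigmas: float = 2.0) -> bool:
    # Compare one engine against the others (leave-one-out), so a
    # single outlier does not inflate the fleet statistics.
    others = [v for k, v in fleet_egt.items() if k != engine_id]
    mean = statistics.mean(others)
    stdev = statistics.stdev(others)
    return abs(fleet_egt[engine_id] - mean) > sigmas * stdev

# Post-flight average exhaust-gas temperatures (degrees C) per engine.
fleet = {"eng-001": 612.0, "eng-002": 608.5, "eng-003": 655.0, "eng-004": 610.2}
for eng in fleet:
    if flag_for_inspection(fleet, eng):
        print(f"{eng}: anomalous EGT, schedule inspection")  # flags eng-003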

"I now know that if an engine has goes through this environment, I know that other engines that are in that environment are likely to experience the same thing," Smith said.

"You'll see the commercialisation of that outside big units like jet engines. Eventually you'll start to see that in commercial vehicles, in cars, in all sorts of places. We're seeing that trend already."
