In-memory technology: more than just speed, it's transformational

Leading enterprise vendors, including Software AG, shift their stacks to in-memory, promising greater capabilities for cross-enterprise data analysis.
Written by Joe McKendrick, Contributing Writer

In-memory technology moves entire sets of data off disks and into random access memory. Solutions have been around for years, but in the big data era, major vendors are increasingly embracing the approach and building it into their offerings. A technologist's delight, of course, but should the business care?

Speed is the most immediate benefit, of course. Making an entire data set available without the latency of disk reads, input/output devices, and network interfaces makes analytics applications screaming fast. With this in mind, Software AG announced, at last week's CeBIT conference in Hannover, Germany, its strategy for the next generation of big data management, which it says will achieve speeds 1,000 times faster than current technologies.

The move builds on Software AG's BigMemory technology, the in-memory data store for Java applications obtained through its acquisition of Terracotta, and its integration with WebMethods Business Events, a complex event processing tool. (I was invited to CeBIT as a guest of Software AG.)
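
BigMemory grew out of Terracotta's open-source Ehcache caching library for Java. As a rough sketch of the idea, not Software AG's exact product API (the cache name, keys, and values below are invented for illustration), a Java application can keep a working set resident in RAM along these lines:

    import net.sf.ehcache.Cache;
    import net.sf.ehcache.CacheManager;
    import net.sf.ehcache.Element;
    import net.sf.ehcache.config.CacheConfiguration;

    public class InMemorySketch {
        public static void main(String[] args) {
            CacheManager manager = CacheManager.create();

            // A heap-resident cache holding up to 100,000 entries.
            // BigMemory's commercial tier extends stores like this into
            // large pools of off-heap RAM, outside the garbage collector.
            Cache prices = new Cache(new CacheConfiguration("prices", 100000));
            manager.addCache(prices);

            // Reads and writes hit RAM directly: no disk, no I/O round trip.
            prices.put(new Element("order-42", 199.95));
            Element hit = prices.get("order-42");
            if (hit != null) {
                System.out.println("from RAM: " + hit.getObjectValue());
            }

            manager.shutdown();
        }
    }

The design point is that memory, not the disk, serves the hot data; in this tiered model the disk, if used at all, sits behind the cache rather than on every query path.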

Karl-Heinz Streibich, CEO, Software AG

In-memory technology has become the latest must-have piece for major technology stack providers. For a great explanation of what it's all about and why the business should care, listen to a recent webcast led by Nathaniel Rowe, research analyst with Aberdeen Group. He pointed out that the benefits of in-memory technology go well beyond speed. A new survey conducted by the consultancy finds that organizations that adopted in-memory computing were not only able to analyze larger amounts of data in less time than their competitors; they were orders of magnitude faster.

“There are a lot of solutions out there, a lot of different vendors, and there are some significant differences in the solutions they provide,” Rowe says. “But the core philosophy behind in-memory is moving the data as close to the processors as possible, to eliminate as many bottlenecks as you can. If you have data sitting in a data warehouse, or in a data center in another part of your organization, off site, even in the cloud, you’re restricted by how fast the disk is that the data is stored on. You’re restricted by the input/output devices, even the application you’re using to process and analyze this data.”

Nathaniel Rowe, Aberdeen

“What in-memory does is take the data you’re looking at and put it directly into the random access memory of a server,” says Rowe. “This eliminates all those bottlenecks. Because it's right there in the RAM, it allows you to take advantage of some of the really amazing processors out there today. It allows you to put amazingly large amounts of data directly on the machine and access it practically instantly.”
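
To make those bottlenecks concrete, here is a toy comparison in plain Java (synthetic data; the file and key names are invented): one lookup scans a file on disk, while the same records loaded once into a HashMap are answered straight from RAM. It is an illustration of the read path, not a rigorous benchmark:

    import java.io.BufferedReader;
    import java.io.BufferedWriter;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.HashMap;
    import java.util.Map;

    public class RamVsDisk {
        public static void main(String[] args) throws IOException {
            // Write 100,000 synthetic key,value rows to a temp file.
            Path file = Files.createTempFile("records", ".csv");
            try (BufferedWriter w = Files.newBufferedWriter(file)) {
                for (int i = 0; i < 100000; i++) {
                    w.write("key" + i + "," + i + "\n");
                }
            }

            // Disk-bound path: scan the file for a single lookup.
            long t0 = System.nanoTime();
            String fromDisk = null;
            try (BufferedReader r = Files.newBufferedReader(file)) {
                String line;
                while ((line = r.readLine()) != null) {
                    if (line.startsWith("key99999,")) { fromDisk = line; break; }
                }
            }
            long diskNanos = System.nanoTime() - t0;

            // In-memory path: load once, then answer lookups from RAM.
            Map<String, String> ram = new HashMap<>();
            try (BufferedReader r = Files.newBufferedReader(file)) {
                String line;
                while ((line = r.readLine()) != null) {
                    String[] parts = line.split(",", 2);
                    ram.put(parts[0], parts[1]);
                }
            }
            long t1 = System.nanoTime();
            String fromRam = ram.get("key99999");
            long ramNanos = System.nanoTime() - t1;

            System.out.printf("disk scan: %,d ns; RAM lookup: %,d ns%n",
                    diskNanos, ramNanos);
            Files.deleteIfExists(file);
        }
    }

On most machines the RAM lookup comes back orders of magnitude faster than the file scan, which is the gap Rowe is describing; real in-memory platforms apply the same principle to data sets measured in terabytes rather than to a single HashMap.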

Vendors such as Software AG aren't just looking to speed up customers' applications. In-memory technology may also help enterprises unravel the complexity that has built up across their infrastructures, Karl-Heinz Streibich, CEO of Software AG, said at a press conference held at CeBIT. “Complexity in business models is an issue,” he pointed out, noting that many organizations are still encumbered by “heterogeneous, rigid IT architectures that haven't changed for a number of years. This is a significant factor increasing complexity. Business requires flexibility and agility in IT, which is not possible if classical IT structure is used.” The ability to quickly access and analyze big data means businesses can more quickly act on opportunities and engage new markets -- without expensive attempts to rip and replace existing technology.

Rowe echoed this sentiment. “Speed isn’t the only story,” he says. In-memory technology “also has a transformational aspect on the business, on the line of business and the end users. End users in organizations that had this technology trusted the data more, and they were more satisfied with their ability to access the information they needed, when they needed it. To make those self-service queries, trust those systems, trust the quality and relevance of the data – that's really important to build the foundation of a data-centric organization.”

Aberdeen's survey compared the status of data delivery among respondents with in-memory technology versus those without. Companies with in-memory technology were more likely to report satisfaction with self-service queries (69% versus 48%), trust in their data (63% versus 50%), data quality and relevance (59% versus 42%), and anytime, anywhere access (41% versus 24%).

A number of vendors are now beefing up their offerings with the in-memory paradigm. Software AG says its BigMemory solution supports processing up and down its solution stack, from business process design to cloud and software as a service. At the Software AG press conference, Wolfram Jost, CTO of Software AG, indicated the integration of BigMemory and Business Events will be ready by the end of the year. The goal is not to release specific products, but rather a framework on which customers can build their big data capabilities. “Platform beats products,” Jost says. “The difference between a platform and a product is that on the platform the processes are not preprogrammed, but can be programmed by the customer. We have a process platform, BPM platform, application platform, management platform, but not a product. Process and collaboration go together.” Hadoop clusters will also be supported in-memory, he added.
