Big data blowout: Microgen posts 7 billion transactions per hour

Summary: Finance, telecom, utilities and media companies -- those with structured transactional data -- will benefit the most.

British enterprise software firm Microgen likes to say that it makes "impossible things possible."

A good way to do that is to break performance records.

The company announced yesterday that it managed to process 7 billion transactions per hour using in-memory processing on IBM System x and its Aptitude enterprise application platform.

With Oracle database-to-database processing running on a combination of IBM System x and IBM Power Systems, the company's platform processed more than 800 million transactions per hour.

The tests were conducted over three weeks at IBM's Innovation Centre at Hursley in Hampshire, England.

Big numbers like these naturally make for industry boasting. But the feat also promises improved computing for the finance, telecom, utilities and media sectors -- things like intra-day P&Ls, dynamic pricing treatments and daily product and customer profitability analyses -- which translates to a better way to handle ballooning data volumes and, with luck, a competitive edge, too.

If BYOD is the story of 2012, "big data" is definitely the one that's immediately behind it. The hype is huge, the potential seemingly limitless and the promises far-reaching. (Lightning-fast insights! Rapid application deployment! Cheaper maintenance! Legacy system integration!)

In fact, I'm hoping it will do my laundry, too -- then give me insight as to how efficient my wash was compared to the rest of the world's.

In all seriousness, though, the key to big data's success in handling ever-growing data volumes is staying one step ahead of them in performance: exponential growth in input must be met with exponentially more powerful computing.

For now, Microgen's performance speeds are best suited for structured data such as transactions. But it's only a matter of time before that muscle works its way into other parts of the enterprise.

For the speeds 'n' feeds fans, I've included test details from Microgen below.

The tests were carried out using Microgen Aptitude Version 3.11 and Microgen Accounting Hub Version 3.05, running on the following configuration of hardware, operating system and database:

Application Server

System: IBM System x3850 X5 / x3950 X5 [7143RHX]
OS: SUSE Linux Enterprise Server 10 (x86_64), Release 10
CPU: 4 x Intel Xeon E7-8870 @ 2.40GHz (10 cores per socket; 20 threads per socket with Hyper-Threading on)
Memory: 132,060,340 kB

Database Server

System: IBM Power 750 Express (8233-E8B)
OS: IBM AIX 7.1 (64-bit)
CPU: 24 x IBM PowerPC POWER7 @ 3550 MHz
Memory: 65,636 MB

Software Installation

Database: Oracle Enterprise Server 11.2.0.1, upgraded to 11.2.0.3
Client: Oracle admin client version 11.2.0.1, installed on the application server

About the tests:

Three main tests were undertaken using the following scenarios:

1. Using in-memory data

The source data was generated in-memory and the targets were in-memory buffers. The results indicated good scaling up to 7 billion transactions per hour. This test shows the raw processing speed of Microgen Aptitude unconstrained by slower input/output devices; it measures the upper limit of the processing speed of the current version of Microgen Aptitude.
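
For the curious, here is a rough idea of what an in-memory throughput test like this measures. This is a minimal Python sketch, not Microgen's actual harness; the record layout and the transformation rule are invented stand-ins.

    import time

    def transform(record):
        # Stand-in business rule: derive a fee from the transaction amount.
        return {"id": record["id"], "fee": record["amount"] * 0.015}

    def run_in_memory_benchmark(n_transactions=1_000_000):
        # Source data is generated in memory and the target is an in-memory
        # buffer, so the result reflects raw processing speed, not I/O.
        source = [{"id": i, "amount": 100.0 + (i % 500)}
                  for i in range(n_transactions)]
        sink = []
        start = time.perf_counter()
        for record in source:
            sink.append(transform(record))
        elapsed = time.perf_counter() - start
        tph = n_transactions / elapsed * 3600  # transactions per hour
        print(f"{n_transactions:,} transactions in {elapsed:.2f}s "
              f"= {tph:,.0f} per hour")

    if __name__ == "__main__":
        run_in_memory_benchmark()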

2. Using data from file on a SAN

In this case Microgen Aptitude read source data from a file, performed some processing and then wrote the output back to a file. The files were located on the SAN. Each input/output record contained 50 columns and was about 370 bytes long. Even with file I/O in the loop, absolute performance remained very high -- over one billion transactions per hour.
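
To put that file-based figure in perspective: at the record size quoted above, one billion transactions per hour implies a hefty sustained I/O rate. A quick back-of-the-envelope calculation, assuming exactly one billion 370-byte records per hour:

    # Implied sustained I/O for the file-on-SAN test, per direction
    # (the same volume is read from and written back to the SAN).
    records_per_hour = 1_000_000_000
    record_bytes = 370

    gb_per_hour = records_per_hour * record_bytes / 1e9   # ~370 GB/hour
    mb_per_second = gb_per_hour * 1e3 / 3600              # ~103 MB/s
    print(f"~{gb_per_hour:.0f} GB/hour, ~{mb_per_second:.0f} MB/s sustained")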

3. Using an Oracle 11g database

When the Oracle database was used for source data and target data, Microgen Aptitude was still able to deliver a very impressive result of around 800 million transactions per hour. This was achieved using the same data and business processes as for the previous tests.
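
The database-to-database pattern generally means reading source rows in batches, transforming them, and bulk-inserting into the target, since round-tripping row by row would crater throughput. The sketch below is an illustration of that pattern only, using Python's built-in sqlite3 module as a stand-in for Oracle so the example stays self-contained; the table names, batch size and transform are invented.

    import sqlite3
    import time

    def db_to_db(batch_size=50_000, n_rows=500_000):
        # Stand-in databases; in the actual test both ends were Oracle 11g.
        src = sqlite3.connect(":memory:")
        dst = sqlite3.connect(":memory:")
        src.execute("CREATE TABLE txn (id INTEGER, amount REAL)")
        dst.execute("CREATE TABLE posted (id INTEGER, fee REAL)")
        src.executemany("INSERT INTO txn VALUES (?, ?)",
                        ((i, 100.0 + i % 500) for i in range(n_rows)))
        src.commit()

        start = time.perf_counter()
        cur = src.execute("SELECT id, amount FROM txn")
        while True:
            batch = cur.fetchmany(batch_size)  # fetch in batches, not row by row
            if not batch:
                break
            dst.executemany("INSERT INTO posted VALUES (?, ?)",
                            ((tid, amount * 0.015) for tid, amount in batch))
        dst.commit()
        elapsed = time.perf_counter() - start
        print(f"{n_rows:,} rows in {elapsed:.2f}s "
              f"= {n_rows / elapsed * 3600:,.0f} per hour")

    if __name__ == "__main__":
        db_to_db()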


About Andrew Nusca

Andrew Nusca is a former writer-editor for ZDNet and contributor to CNET. During his tenure, he was the editor of SmartPlanet, ZDNet's sister site about innovation.


Talkback (2 comments)
  • Cloud pundits do not wrap their heads around these types of transactions

    Great article. Cloud pundits who state that everything can be done in the cloud do not consider the Microgens of the world and their need for minimal latency and high throughput.
    Your Non Advocate
    • Great point.

      There are very, very real technical/physical limitations to these things. We must remain cognizant of them.
      andrew.nusca