Quick question: What is more than 60 years old, but still spry enough to beat the best mid-range clusters at big data?
If your answer was the mainframe, then you're correct.
By a virtual show of hands, who loves mainframes? No one? Really? I get it. It's not really something that people love or hate; it just is. By another virtual show of hands, have you worked on or do you currently work with mainframes? OK, so just the five old dudes over there drinking prune juice and doing crossword puzzles? Fine. So, these days there's not much love for the mainframe. It has about as much appeal to today's cyber kiddies as brick-and-mortar stores do. In fact, a lot of us, myself included, have forgotten these churning data center behemoths of yesteryear. But did you know that for virtualization and big data, they're still the ultimate weapon of mass construction? They are, and here are some recent survey numbers to prove it.
Based on the survey results, the mainframe may be the key to closing the Big Data processing gap and supporting the latest data delivery technologies. Over 89 percent of respondents indicated that the mainframe's key benefit in the next year is its Big Data processing horsepower through CICS, DB2 and WebSphere, and more than a third see it as an opportunity to provide data to Big Data platforms such as IDAA, Netezza, Splunk, Spark, Oracle, Teradata and Hadoop.
The survey uncovered several other key findings about the mainframe's use in large businesses, where it still houses the lion's share of corporate data, remains the system of record, and now sits at the heart of their Big Data strategies, including:
- More than two-thirds of respondents (69 percent) ranked the use of the mainframe for performing large-scale transaction processing as very important
- More than two-thirds (67.4 percent) of respondents also pointed to integration with other standalone computing platforms such as Linux, UNIX, or Windows as a key strength of mainframe
- While the majority (79 percent) analyze real-time transactional data from the mainframe with a tool that resides directly on the mainframe, respondents are also turning to platforms such as Splunk (11.8 percent), Hadoop (8.6 percent), and Spark (1.6 percent) to supplement their real-time data analysis
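The supplemental analysis those respondents describe is, at its core, windowed aggregation over a stream of transaction records. As a purely illustrative sketch (plain Python standing in for what Spark or Splunk would do at scale, with made-up transaction records and a hypothetical `rolling_totals` helper):

```python
from collections import deque
from datetime import datetime, timedelta

# Hypothetical transaction records of the sort a mainframe (e.g. CICS/DB2)
# might feed to an off-platform analytics tool such as Spark or Splunk.
transactions = [
    {"ts": datetime(2015, 6, 1, 12, 0, 0),  "amount": 120.00, "type": "debit"},
    {"ts": datetime(2015, 6, 1, 12, 0, 30), "amount": 75.50,  "type": "debit"},
    {"ts": datetime(2015, 6, 1, 12, 2, 45), "amount": 900.00, "type": "credit"},
]

def rolling_totals(records, window=timedelta(minutes=1)):
    """Sum transaction amounts over a sliding time window -- a toy
    version of the real-time aggregation described above."""
    live = deque()   # records still inside the window
    totals = []
    for rec in sorted(records, key=lambda r: r["ts"]):
        live.append(rec)
        # Evict records that have fallen out of the window.
        while live and rec["ts"] - live[0]["ts"] > window:
            live.popleft()
        totals.append((rec["ts"], sum(r["amount"] for r in live)))
    return totals

for ts, total in rolling_totals(transactions):
    print(ts, total)
```

At production scale the eviction-and-sum loop is what a streaming engine's window operator handles for you; the point here is only the shape of the computation, not a real integration with any of the platforms named in the survey.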
Earlier this year, IBM launched a Hadoop-capable mainframe it describes as "one of the most sophisticated computer systems ever built." Further, the IBM InfoSphere System z Connector for Hadoop will let mainframe users harness the power and cost-efficiencies of Hadoop. But big data and rapid analytics aren't the only reasons users stick with the tried-and-true mainframe platform. Security and availability are also major reasons for the mainframe's longevity.
- 82.9 percent and 83.4 percent of respondents cited security and availability as key strengths of the mainframe, respectively
- In a weighted calculation, respondents ranked security and compliance as their top areas to improve over the next 12 months, followed by CPU usage and related costs and meeting Service Level Agreements (SLAs)
- A separate weighted calculation showed that respondents felt their CIOs would rank all of the same areas in their top three to improve
So if you thought the old mainframe had breathed its last blast of chilled air, you'd be sadly mistaken. Companies still use it and its use is on the rise, due in part to big data and analytics.
A CA Technologies study found that more than 75 percent of U.S. respondents and more than 80 percent of global respondents believe the mainframe is a strategic or highly strategic part of their current and future IT plans.
If you believe that you don't use mainframe computers, think again. If you put a debit card into an ATM, then you're using a mainframe. No, the ATM isn't a mainframe, but it's connected to one. Big companies still rely on the mainframe, and they still rely on the data, the security, and the scalability that remain unmatched by any other platform or cluster of platforms in computing. As proof that it's worthy of new development and new research, IBM's new z13 mainframe took five years and $1 billion to develop. And no one but T. Boone Pickens or Bill Gates can afford to sink a billion bucks into a hobby.
For big data, nothing comes close to the processing power of the mainframe computer. If you don't believe me, you can download a white paper on the topic and check the facts for yourself, or check out a video by Syncsort (the folks who supplied the survey data for this post). You can also grab a copy of a Forrester report on big data analytics for IT.
I think by now you get the point that the mainframe is still a thing--a big thing--a big data thing. From now on when you think mainframe, don't think COBOL and Y2K bug, think Hadoop, think big data, and think big money for IBM.
What do you think? Does your company use mainframe computers for big data analytics? Let us know, and tell me if your company would like to be featured in a future post about mainframe big data analytics.