MapR CEO talks Hadoop, IPO possibilities for 2015

MapR CEO John Schroeder offered a bit of Hadoop prognosticating for the coming year, while also signaling that an initial public offering is in the cards.

For open-source Hadoop distribution vendor MapR, 2014 was a pretty solid year.

The five-year-old, San Jose, Calif.-based company dotted the 2014 timeline with partnerships, integrations, a fresh round of funding, and a 700-customer milestone, signaling that despite its relative shortage of tech news headlines, its approach to Hadoop is a thriving big data business.

Much like its Hadoop counterparts, namely Hortonworks and Cloudera, MapR's portfolio focuses on the enterprise. What sets MapR apart, however, is its range of proprietary features tailored to specific industries such as financial services, healthcare and telecommunications.

In all, MapR focuses on customers in eight vertical markets, but according to CEO John Schroeder, financial services, telecom and Web 2.0 take the top three spots in terms of sales for the company.

Investors are seemingly confident in MapR's industry-specific strategy, the strength of its portfolio, and its growth rate of upwards of 200 percent. Its most recent funding round, closed in June and led by Google Capital, Qualcomm Ventures and Lightspeed Venture Partners, brought in $110 million.

Looking ahead to 2015, Schroeder said an initial public offering is definitely in the cards. He offered no firm date, but said the company is on track for an IPO by the end of next year. His confidence is no doubt bolstered by the successful IPO of fellow Hadoop purveyor Hortonworks, which pushed that company's market capitalization to $1.1 billion.

Here are a few other outlooks for the New Year that Schroeder and I discussed in a recent phone call:

Hadoop will go from batch to real time. Schroeder said that, overall, enterprises using Hadoop are broadening their use cases and no longer want to design around the limitations of batch-oriented base Hadoop. This will be a significant trend over the next year, he said, and one he expects to be especially beneficial for MapR.

"More and more use cases are real time, and that really drives customers in our direction."

Data agility and self-service come into focus. Legacy databases and data warehouses often impose a rigid structure that's difficult to alter over time. That rigidity has carried over somewhat to Hadoop, where customers traditionally had to build an application before they could work with their data. But with the advance of tools such as Apache Drill, analysts can start dashboarding without having to build an app first. Schroeder said agility shortens the time to value from data, and going forward, that value focus will lead to further innovations in the space.

"Initial big data projects focused on the storage of target data sources. Rather than focus on how much data is being managed, organizations will move their focus toward measuring data agility." As for self-service, Schroeder said businesses will gravitate in its direction because it "speeds organizations in their ability to leverage new data sources and respond to opportunities and threats."

Fading hype. In 2014, Hadoop was celebrated with a proliferation of applications, tools and components, Schroeder said. But come 2015, the market will concentrate on the differences across platforms, the hype will give way to reality, and customers will figure out which products actually deliver business results.

"Hadoop is going into the trough of experimentation. The maturity is high enough now that customers are less prone to get confused by vendor claims and are focusing on the really vital requirements. It could be viewed as a negative trend, but it's good where you are seeing early majority adoption of new technology."

Hadoop vendors consolidate. Consolidation in the tech industry is inevitable, and given the fluctuations in legacy Hadoop investments, it's logical to expect the same in big data. Underpinning the consolidation, Schroeder said, will be the realization that despite Hadoop's open-source foundations, it requires a significant engineering investment on the part of the provider.

"Some incorrectly viewed Hadoop as a commoditized product, but they didn't realize it requires a tremendous amount of innovation. Hadoop adoption globally and at scale is far beyond any other data platform just 10 years after initial concept. Hadoop is in the innovation phase so vendors mistakenly adopting 'Red Hat for Hadoop' strategies are already exiting the market, most notably Intel, and soon EMC Pivotal."