Tableau CEO sizes up database market

In-memory databases are gaining traction even though relational databases are still the norm and growing in corporations.

Tableau is quickly becoming the front end for analytics projects at multiple enterprises as the company delivered its first $100 million revenue quarter and upped its outlook. The company also has a unique perspective on the moving parts involved with databases, analytics and big data.

Speaking on Tableau's third quarter earnings conference call, CEO Christian Chabot sized up what its customer base is doing with databases. Tableau's ability to pull information from multiple stores gives it a clear view of what's transpiring in the database market.


Previously: Tableau delivers strong Q3, hits $100 million quarterly revenue mark

Among the key takeaways:

  • Relational databases are doing well and most Tableau implementations ride on top of them. 
  • In-memory databases are deployed more in Fortune 500 companies and gaining momentum. 
  • Unstructured stores such as NoSQL databases and Hadoop are in many pilots and increasingly reaching production workloads.

Here's Chabot's full take after being asked by an analyst about databases:

We do have some sense of what our customers are working with in terms of data format and location through our support engagement, professional services and product input. So although our products are mostly on-prem, we have a pretty good sense of it.

Category one is all the traditional sources you'd guess. Don't call the death of the relational database yet. It's alive and well. So we do a lot of deployments against Oracle and DB2 and SQL Server and MySQL and Postgres. Those are alive and well and flourishing, and probably still growing exponentially at some low percentage basis.

Category two is more of the special trend, which is this new generation of fast databases that are often massively parallel. They are often highly distributed. They're often in memory. This new database revolution has just taken hold in the last seven or eight years. It is working. I'm talking about Vertica, and even Netezza is deployed quite a bit, particularly within the Fortune 500. SAP HANA we're seeing more and more. In fact, we just had a great victory against a massive HANA deployment. That one I would probably call out as the most important new trend in data format and location, to your question.

Now, what you actually mentioned were more unstructured stores, and NoSQL and Hadoop. And we are investing aggressively there. We are doing business against those stores every week of the year. It's an up-and-coming trend. Customers are exploring those, they're playing with them, and they're prototyping. Some are even in advanced production with them. We're investing in technology partnerships with those providers with the same spirit and ambition with which we embraced that previous generation I just mentioned.

We are working with the engineering teams at DataStax and Cloudera and so on down the line. We have the same hybrid data strategy as we have with the other stores, in the sense that where possible we try to let customers query those stores live. And if not, people extract from those sources, often many of them at a time, into Tableau's own memory space. We mash that data up and make it useful very quickly. That is the state of the union on data.