
Tableau extends its footprint

With a new back end server, more connectivity, and a new data preparation capability, Tableau is growing its footprint across the enterprise. Is the company -- and its customers -- ready for the scrutiny that such ambitions invite?
Written by Tony Baer (dbInsight), Contributor

Call us jaded, but we were caught a bit by surprise at the outpouring of enthusiasm among 15,000 Tableau conference attendees earlier this week. It's been a while since we hit an event with such sustained energy levels. The enthusiasm was in marked contrast to what we saw at Strata several weeks back. Held at Mandalay Bay casino barely a week after the Highway 91 Harvest Festival tragedy in Las Vegas, the event was an affirmation not to cave in to domestic terrorists.

In a way, Tableau's trajectory from desktop to the enterprise mirrors that of Microsoft a generation ago. Tableau's claim to fame was being the first to deliver on BI's original, long-unfulfilled promise of putting a dashboard on everyone's desktop, just as Microsoft was at the center of the revolution that snuck PCs in through the back door until they became part of standard enterprise IT infrastructure.

A presentation by JP Morgan provided a good case in point of the obstacles to self-service in a big enterprise: The company's BI reporting practices were stuck in a waterfall cycle, where new reports often took months for IT to produce. The notion of users having control over their own data was deemed subversive for an organization in a highly regulated industry. Like many large organizations, the bank had just about one of everything when it came to software, so it wasn't surprising that one of the business units had a copy of Tableau sitting as shelfware.

It was a long slog from the time that unit got Tableau up and running to the point where the rest of the organization bought into self-service. And when adoption did begin, deployment initially veered out of control, with several hundred desktop licenses proliferating before someone convinced IT to pick up Tableau's server product.

But just as the notion of a shared-service environment, bringing IT into closer alignment with the business, began taking off, disruptions hit: the London Whale episode and the onslaught of regulation that followed the housing crisis. It grew clear that the bank could not allow self-service to jeopardize compliance with segregation-of-duties and similar mandates. A key strategy for surmounting this gap was giving each business unit its own self-service walled garden, so users from student loans wouldn't see mortgage data by mistake. Then came a shared-cost chargeback model that metered use, emulating practices that have become routine in the cloud.

Capping it off, JP Morgan is rolling out a Tableau center of excellence to promote best practices that keep the chaos in check, as IT formally embraces Tableau as officially endorsed technology. Meet the new boss, hopefully not the same as the old boss.

For large IT organizations, Tableau is often judged guilty until proven innocent. At first blush, self-service appears at odds with governance. Tableau's challenge is that while the content it delivers sits high in the value chain, Tableau itself typically occupies the last mile when it comes to delivering that information. When it comes to directives like GDPR, where do you impose controls: at the point of consumption, or further upstream where the information is stored? Addressing that question is a work in progress at Tableau, which has yet to formulate its strategy. In all fairness to Tableau, this is not exactly tops on the wish list for its clients, who themselves may not (yet) be fully aware of what they'll be facing.

Tableau's expansion of its mission continues, as it morphs from desktop tool to end-to-end platform provider. The highlights this year include a generational shift in its back end as the emerging Hyper technology replaces the old Tableau Data Extract (TDE) engine; a rounding out of its connectivity options with the Extensions API; and the impending expansion of its self-service footprint to go upstream with Project Maestro data preparation.

Hyper, which debuts in the 10.5 beta release, promises to sharply increase throughput and scale for Tableau data extracts; extracts are used when relying on direct connections to source databases becomes a bottleneck. Admittedly, Hyper would probably not be utilized if Tableau were working against in-memory platforms like SAP HANA or Oracle Database In-Memory, which are already pretty fast.

Hyper came from an acquisition just over a year ago of a startup spawned by a research team at the Technical University of Munich (TUM). Originally designed as a high-performance in-memory database supporting both transactional and analytic processing, Hyper adapts an age-old computing idea and embraces a current practice. It compiles, rather than interprets, query code, meaning that it optimizes execution at the machine-code layer for sharply higher performance. And it makes performance more consistent through extreme parallelization, ensuring that all cores are kept occupied while processing a query.
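
To make the compile-versus-interpret distinction concrete, here is a minimal TypeScript sketch -- not Hyper's code, and nothing Tableau has published, just an illustration of the general idea -- contrasting a filter evaluated by walking a generic expression tree row by row with one that is specialized into a single function up front.

```typescript
// Illustration only: interpreted vs. compiled evaluation of a simple filter.
// Hyper compiles query plans down to machine code; this sketch shows the same
// principle one level up, by generating a specialized JavaScript function.

type Row = { price: number; qty: number };

// Interpreted path: a generic expression tree evaluated node by node, per row.
type Expr =
  | { kind: "col"; name: keyof Row }
  | { kind: "const"; value: number }
  | { kind: "gt"; left: Expr; right: Expr };

function evalExpr(e: Expr, row: Row): number | boolean {
  switch (e.kind) {
    case "col": return row[e.name];
    case "const": return e.value;
    case "gt": return (evalExpr(e.left, row) as number) > (evalExpr(e.right, row) as number);
  }
}

// Compiled path: translate the tree once into source text, then build a single
// specialized function that the runtime can optimize and reuse for every row.
function compileExpr(e: Expr): (row: Row) => boolean {
  const emit = (x: Expr): string => {
    switch (x.kind) {
      case "col": return `row.${x.name}`;
      case "const": return String(x.value);
      case "gt": return `(${emit(x.left)} > ${emit(x.right)})`;
    }
  };
  return new Function("row", `return ${emit(e)};`) as (row: Row) => boolean;
}

const filter: Expr = {
  kind: "gt",
  left: { kind: "col", name: "price" },
  right: { kind: "const", value: 100 },
};

const rows: Row[] = [{ price: 150, qty: 2 }, { price: 80, qty: 5 }];
const compiled = compileExpr(filter);

console.log(rows.filter(r => evalExpr(filter, r) as boolean)); // interpreted
console.log(rows.filter(compiled));                            // compiled
```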

The announcement of the new Extensions API largely rounds out the set of APIs Tableau has been building to make itself a more developer-friendly platform. While existing APIs already allow Tableau visualizations to populate third-party applications, Extensions does just the opposite: it allows Tableau developers to pull external functionality into a dashboard. So, for instance, you could add a textual narrative generated by Automated Insights to populate a Tableau screen, meaning you don't have to jump to a separate application to get the big picture on a query. Or, if you are exploring data, you could pull up metadata from an Alation catalog.
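
As a rough illustration of what such an extension might look like, here is a hedged TypeScript sketch. The Extensions API has only just been announced, so the call names below (initializeAsync, getSummaryDataAsync, and so on) follow the pattern Tableau has previewed and should be treated as assumptions, as should the narrative-service endpoint.

```typescript
// Hypothetical sketch of a dashboard extension that reads summary data from a
// worksheet and hands it to an external narrative service. The call names
// follow Tableau's Extensions API preview and may differ in the shipping
// release; the narrative-service URL is invented for illustration.

declare const tableau: any; // provided by the extensions library loaded in the dashboard page

async function describeActiveWorksheet(): Promise<void> {
  // Register the extension with the hosting dashboard.
  await tableau.extensions.initializeAsync();

  // Grab the first worksheet in the dashboard and read its summary data.
  const worksheet = tableau.extensions.dashboardContent.dashboard.worksheets[0];
  const summary = await worksheet.getSummaryDataAsync();

  // Ship the rows to an external service (e.g., a narrative generator) and
  // drop the returned text into the extension's zone on the dashboard.
  const response = await fetch("https://narrative.example.com/summarize", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ rows: summary.data }),
  });
  const { narrative } = await response.json();
  document.getElementById("narrative")!.textContent = narrative;
}

describeActiveWorksheet().catch(err => console.error("Extension failed:", err));
```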

Then there is the announcement that Tableau will soon extend self-service upstream to data preparation with Project Maestro. It's an area where Tableau overlaps with the domains of partners like Trifacta and, especially, Alteryx. Tableau is aiming to extend the visual, code-free experience of its data visualization to preparing data, providing similar drag-and-drop capabilities.

Admittedly, the lines of demarcation are more clear-cut with Trifacta, which employs a specialized data-wrangling language aimed at data engineers, as opposed to the line-of-business user that fits the Tableau profile. With Alteryx, it's more of an open question, as it caters to a similar user target, not to mention the fact that Tableau is its most active partner (based on shared customers). Granted, Alteryx has deeper analytic capabilities, such as the ability to give business users access to predictive analytics written in R, but it has also heavily promoted itself as a data preparation tool. We believe that some repositioning for Alteryx, highlighting its ability to embed Spark- and R-based advanced analytics, should be in the offing.

We've always believed that business users must take more responsibility for preparing and curating their own data sets if the promise of self-service is to be realized. We've also believed that users taking such responsibility is essential if data lakes are to fulfill their promise. But for Tableau, adding data prep posed a bit of a dilemma. It's not the kind of Extensions API-style app that could readily shoehorn itself onto a normal Tableau visualization screen. Yet, to keep the workflow seamless, you don't want to force users to boot up a separate application. And the last thing the world needs is another standalone data preparation tool. Ideally, visualizing and preparing data should be an iterative, rather than a waterfall or sequential, process: as you visualize data, you may need to go deeper into the data set and whip that data into shape on the spot.

Tableau has yet to determine how it will package the Project Maestro capability, but if it's going to take advice from the peanut gallery, we'd urge the company to make this a deluxe-edition add-on rather than a discrete tool.
