This week at IBM Think, it will be difficult if not impossible to avoid the glare of artificial intelligence. With Watson, IBM has bet its reboot on AI. And with attention has come market pressure to make AI more accessible. We posted several months back about how cloud services are emerging to make machine learning accessible to non-experts. IBM has just joined the fray with Cloud Private for Data. We'll be collecting our thoughts from IBM Think later this week.
The high readership we garnered from last week's post on the urgency of paying attention to the "soft" people-and-process side of AI reflects the fact that enterprises view it as the new imperative - they believe they need the benefits of AI, not just for competitive differentiation, but for survival.
The good news is that, in many cases, enterprises are already profiting from "smart" applications or tools that incorporate machine learning under the hood. They can benefit from AI without having to create, train, and run the models themselves. AI is becoming as much a fact of life in enterprise software as it is in mass-market consumer services, where we take next-best offers and automated recommendations for granted.
Exhibit A includes the applications that already run the heartbeat of the enterprise. AI is a core part of Salesforce.com's future, where it positions Einstein as "AI for everyone." AI is also core to Oracle's and SAP's next generation applications. The results are intended to adapt how an organization operates or interacts with customers.
For Salesforce customers, it will be about discovering insights about the customer to predict future behavior and tailor next-best actions. And that's just the tip of the iceberg.
SAP and Oracle are initially training their gazes on the back office. For SAP, it will be about optimizing tasks such as automatically pairing invoices with incoming payments or predicting the dynamics of contract renewals ahead of time. For Oracle, AI could help optimize supplier terms, rethink supply chain management by making it more adaptive and dynamic, or connect root cause analysis on the shop floor to preventing manufacturing disruptions. These examples just skim the surface of how AI could enhance the back office, as an active crop of start-ups indicates.
BI tools are getting similar smart assists. Offerings like Amazon QuickSight and IBM Watson Analytics are using machine learning to provide a guided analytics experience that helps the user discover notable patterns in the data, ask the right questions, and automate the building of narratives through a natural language interface. Tableau has similar aspirations; unlike Amazon and IBM, for now its AI plans are a work in progress. But the next generation of Tableau will guide the user with recommendations based on the user's pattern of queries (akin to suggested offers on e-commerce sites), provide a natural language interface, automate changes to the data model, and offer automated discovery of hidden insights.
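To make "automated discovery of hidden insights" concrete, here is a deliberately simple sketch of the idea: scan each segment of a metric and surface the ones that deviate notably from the norm. Real BI assistants use far more sophisticated statistics and learned models; the data, function name, and threshold below are invented for illustration.

```python
from statistics import mean, stdev

def surface_insights(segments, z_threshold=1.3):
    """Return (segment, value) pairs lying more than z_threshold
    sample standard deviations from the mean across segments.
    Toy example only -- not any vendor's actual algorithm."""
    values = list(segments.values())
    mu, sigma = mean(values), stdev(values)
    return [
        (name, value)
        for name, value in segments.items()
        if sigma and abs(value - mu) / sigma > z_threshold
    ]

# Hypothetical sales figures: North is the outlier worth surfacing.
sales_by_region = {"East": 102, "West": 98, "South": 101, "North": 240}
print(surface_insights(sales_by_region))  # → [('North', 240)]
```

A guided-analytics tool would wrap a routine like this in natural language ("Sales in North are unusually high") rather than returning raw tuples.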
Data lakes provide another attractive target for harnessing machine learning. The sheer scale of data lakes compared to traditional data warehouses practically mandates some form of smart assistance; humans simply cannot scale to the task of cleansing or organizing terabytes of data. We recall Turing Award winner Dr. Michael Stonebraker making the point at a Strata conference a few years back that, when you try managing more than a handful of data sets, manual approaches run out of gas and machine learning must come in to help.
Not surprisingly, machine learning has practically become table stakes for data preparation tools. For instance, Trifacta learns from the data and from how the user interacts with the data. Data catalogs have become popular jumping-off points for organizations to find data residing in their data lakes; for instance, Alation harnesses machine learning as a complement to crowdsourcing, harvesting metadata and detecting common patterns for forming SQL queries. Unifi incorporates machine learning for recommending joins based on detection of common keys across multiple data sets, as part of its remit for profiling, preparing, and cataloging data.
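One common way to detect candidate join keys across data sets is to measure value overlap between columns. The sketch below illustrates that idea; the table and column names are hypothetical, and this is not Unifi's (or any vendor's) actual algorithm.

```python
def containment(a, b):
    """Fraction of distinct values in column a also found in column b."""
    a, b = set(a), set(b)
    return len(a & b) / len(a) if a else 0.0

def recommend_joins(left, right, threshold=0.9):
    """Suggest (left_col, right_col, score) triples whose value overlap
    exceeds the threshold -- likely foreign-key relationships.
    Columns are represented as dicts of name -> list of values."""
    suggestions = []
    for lcol, lvals in left.items():
        for rcol, rvals in right.items():
            score = containment(lvals, rvals)
            if score >= threshold:
                suggestions.append((lcol, rcol, score))
    return sorted(suggestions, key=lambda s: -s[2])

# Hypothetical data sets sitting in a lake, with no documented keys.
orders = {"customer_id": [101, 102, 103, 101], "amount": [20, 35, 12, 50]}
customers = {"id": [101, 102, 103, 104], "region": ["E", "W", "E", "S"]}
print(recommend_joins(orders, customers))  # → [('customer_id', 'id', 1.0)]
```

Production tools work from samples and sketches rather than full column scans, but the underlying signal is the same: columns whose values are contained in another column's values are good join candidates.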
A new entrant, Io-Tahoe, is also letting machine learning loose on a data catalog. But this catalog is aimed more at regulatory compliance, as it's (literally) trained on the type of personally identifiable information (PII) data that is becoming subject to privacy mandates such as GDPR, PCI, and HIPAA. Its tooling, being announced for general release today, is essentially an AI-driven sleuth for sensitive data. It starts, but doesn't end, with crawling available metadata. That's the low-hanging fruit. Beyond that, Io-Tahoe's tool analyzes primary and foreign key relationships that may not be documented but are implied by data flows, in order to validate and link common (and redundant) entities -- that's where the machine learning comes in.
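The "low-hanging fruit" stage of PII sleuthing can be illustrated with a simple rule-based pass over sampled column values, before any ML-based entity linking enters the picture. The patterns and names below are examples made up for this sketch, not Io-Tahoe's actual implementation.

```python
import re

# Illustrative PII patterns only; real tools combine many more rules
# with trained classifiers and context from surrounding metadata.
PII_PATTERNS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "phone": re.compile(r"^\+?\d[\d\- ]{7,}\d$"),
}

def classify_column(values, min_match=0.8):
    """Label a column with a PII type if most sampled values match
    one of the patterns; return None if nothing matches well."""
    values = [v for v in values if v]
    for label, pattern in PII_PATTERNS.items():
        hits = sum(1 for v in values if pattern.match(v))
        if values and hits / len(values) >= min_match:
            return label
    return None

# Hypothetical sampled columns from a crawled data lake.
print(classify_column(["alice@example.com", "bob@example.org"]))  # → email
print(classify_column(["123-45-6789", "987-65-4321"]))            # → us_ssn
```

A catalog would attach these labels as metadata, then use inferred key relationships to propagate them: if a flagged column feeds another table's undocumented foreign key, that downstream column likely holds the same sensitive entity.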
Here's the common thread. As enterprises actively seek ways to ingrain AI into how they conduct business, the enterprise applications and tools that they already rely on are, or will soon be, incorporating it for them.