The annual PASS Summit is taking place in Seattle this week. Nominally a third-party SQL Server event (PASS once stood for Professional Association for SQL Server), it now focuses on the entire Microsoft Data Platform, in a semi-official capacity.
Yesterday, in his PASS Summit keynote, Microsoft Corporate Vice President Joseph Sirosh made a series of announcements. One of them, focused on Microsoft's new Cognitive Toolkit, was covered in depth yesterday by ZDNet's Mary Jo Foley. I'll cover the others today, along with a breaking announcement from Microsoft partner AtScale, which offers a SQL Server Analysis Services-compatible OLAP-on-Hadoop solution. All the news ties together nicely.
Also read: Is this the age of Big OLAP?
Also read: Kyvos Insights embraces Microsoft Azure, HDInsight
OLAP on Azure
If you're coming from the worlds of data warehousing and/or OLAP, both companies have news you can use. Microsoft, for example, is finally offering a cloud version of SQL Server Analysis Services (SSAS). The service is based on the same Tabular mode (column store) technology as Power BI's engine, but without the security, scale and other limitations of a self-service offering.
Azure Analysis Services (Azure AS), the public preview of which was launched yesterday, finally offers an Enterprise-grade OLAP cloud service. Azure AS includes technology to connect to on-premises data sources, without which it wouldn't really qualify as an Enterprise offering. And while SSAS' classic multidimensional technology is not on offer, Microsoft has not precluded adding it sometime in the future. For some Enterprise organizations, such an offering would make a big difference.
While we're in the realm of Enterprise BI/DW, let's acknowledge that while specialized products like Microsoft's Analytics Platform System (APS) are very capable, a great number of customers use SQL Server Enterprise as their data warehouse platform. Recognizing this, Microsoft has released its Data Warehouse Fast Track (DWFT) reference architectures, a revival of the similarly named reference architectures from the SQL Server 2008 timeframe that came to Microsoft with the DATAllegro acquisition.
By providing strict guidance on CPU, flash-based storage and network configurations, the DWFT recommendations now permit a symmetric multiprocessing (SMP)-based SQL Server 2016 implementation to support active data sets of up to 145TB and a maximum of 1.2 petabytes of data.
The prior sentence merits re-reading. We're not talking about a distributed, massively parallel processing (MPP) data warehouse here...this is a single-node architecture that nonetheless supports a petabyte-scale data warehouse. Your mind may not be blown, but mine is.
Migration and experimentation
Of course, to take advantage of DWFT, customers will need to be on the SQL Server 2016 platform, and many are not there yet. With that in mind, Microsoft is now offering a second version of its Data Migration Assistant (DMA), which supports migrations to SQL Server 2016 running on-premises or in a cloud-based virtual machine.
A companion tool, which Microsoft is calling the Data Experimentation Assistant (DEA), allows migration customers to compare the performance of a workload on SQL Server 2016 to the performance of that same workload on an earlier version (SQL Server 2005 or above).
SQL Data Warehouse on Azure
Meanwhile, as cool as it is that you can run pretty beefy data warehouse implementations on SQL Server Enterprise, SQL Server MPP technology still beckons. To sweeten the deal a bit, Microsoft is now offering a one-month free trial of Azure SQL Data Warehouse, the cloud-based version of that technology. Access to the trial is by explicit request, and is meant to support proof-of-concept implementations where customers bring their own data and have sufficient "tire-kicking" time before taking the plunge and migrating their data warehouse to the cloud in full.
OLAP on Hadoop
Speaking of free trials, OLAP and data warehousing, customers may wish to take a look at Microsoft partner AtScale. The company is headed by Dave Mariani, who led two of the biggest SSAS implementations in history. Mariani is a huge fan of the OLAP paradigm, but one who has lived the pain of OLAP cube reprocessing times and scaling limitations; his company offers an SSAS-compatible product that implements dynamic cubes on top of Apache Hadoop and Spark.
Queryable using either SQL or MDX, AtScale builds virtual (non-materialized) OLAP cubes over data stored in Hadoop, and uses a combination of SQL-on-Hadoop technologies to satisfy the query. These virtual cubes can be implemented on various Hadoop distributions, including Azure HDInsight, and are queryable from various Business Intelligence client tools, now including Power BI.
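To make the dual-interface point concrete, here's a minimal sketch of the two kinds of statement such an engine accepts — one MDX, one SQL — both asking the same question of the same virtual cube. The cube, table, dimension and measure names here are invented for illustration; they are not AtScale defaults.

```python
def mdx_sales_by_region(cube="Sales", measure="Sales Amount", dim="Region"):
    # MDX, the language SSAS-compatible clients (Excel, etc.) speak:
    # put the measure on columns and the dimension's members on rows.
    return (
        f"SELECT {{[Measures].[{measure}]}} ON COLUMNS, "
        f"[{dim}].Members ON ROWS "
        f"FROM [{cube}]"
    )

def sql_sales_by_region(table="sales", measure="sales_amount", dim="region"):
    # The same question in SQL; a virtual-cube engine pushes this down
    # to a SQL-on-Hadoop backend rather than a materialized cube.
    return f"SELECT {dim}, SUM({measure}) FROM {table} GROUP BY {dim}"

print(mdx_sales_by_region())
print(sql_sales_by_region())
```

Either statement yields sales totals broken down by region; the engine, not the client, decides how to satisfy the query against the data in Hadoop.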
AtScale says it's "the first and only provider to enable Power BI users to perform live, interactive queries against data in Hadoop without data imports, pre-processing or data movement." That would imply that it supports Power BI's DirectQuery technology, enabling queries to be dispatched to the AtScale virtual cube dynamically, eliminating the need for the data in AtScale to be imported and materialized in a Power BI model.
Of course, to try it all out, Microsoft customers will need access to AtScale Intelligence Platform on Microsoft HDInsight. And so AtScale is this morning announcing a trial offer for that very product; customers can sign up for it online.
One way or another, Microsoft wants to offer customers an analytics platform. And that would seem to be more than rhetorical as, together, yesterday's and today's announcements cover every permutation of OLAP and data warehousing, on-premises or in the IaaS and PaaS clouds, using SMP, MPP and Hadoop as the host platforms. Analytics now awaits customers quite flexibly, and just about any excuse not to implement is gone.