Ask Fair Work Commission data director Noel Hanssens what he thought of the commission's former business intelligence (BI) platform and he would likely have described it as "messy".
But these days, since implementing a new enterprise business intelligence platform (EBIP) and shifting it into the cloud, Hanssens says the commission now has a "robust and responsive" data platform.
"It's future needs-oriented. It's got much faster time to value, particularly in the real-time reporting stakes and the data model development stakes," he said, speaking at the Gartner Data & Analytics Summit this week.
"We've developed a data language, which to a degree future-proofs us for artificial intelligence (AI), machine learning, big data, and data science projects, and we've initiated a few of those as well. We've consolidated our BI toolsets, which has been a strength for the organisation, and centralised and cleaned up governance.
"So, we've got this new minty fresh enterprise BI platform that's not just improved our speed to developing out our data models, but it's also meant that some of the overhead we had -- just ordinary maintenance overhead -- in the old environment has decreased, and so it has increased our efficiency two-fold."
But getting to this point was not an easy feat.
Hanssens said the initial decision to overhaul the former BI platform was triggered by the commission's decision to develop an entirely new CRM platform, which was the source of 80% of all data that was fed into the commission's BI platform.
"What that did was give us a rare greenfields opportunity to build and consolidate our numerous BI tools. We were running about half a dozen BI tools at the time, which was really too many for such a small team. Getting us onto a single stack was a peripheral benefit," he said.
At the time, it also meant the EBIP had to be built in parallel with the new CRM system, which, Hanssens said, was a positive outcome.
"That meant we could have a single source CRM in our data environment that could simply ingest data and away we would go," he said, noting it would also be a single place to point at for all future machine learning, AI, and big data models.
However, within a fortnight of the two systems going live, Hanssens said there were "significant issues" with the CRM system that forced the teams to roll both the CRM and the EBIP environment back to the old systems.
As a result, the teams were forced to rethink the design and architecture of what they had built. Hanssens said they determined that what had to change was the architecture of the EBIP: from a traditional data warehouse extract, transform, load (ETL) approach to one closer to an extract, load, transform (ELT) data lake, with data marts built on the edge of that.
"The benefit of what this did was allow us to quickly ingest all of the data into our data lake without the need to transform it, and then push out the data modelling into our tabular models and quickly churn that over," he said.
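The pattern Hanssens describes can be sketched in a few lines. The example below is purely illustrative and assumes nothing about the commission's actual stack: raw records are loaded into a "lake" untransformed, and a small "data mart" is modelled on the edge, only when a report needs it.

```python
# Minimal ELT sketch: extract + load first, transform at the edge.
# All names (ingest, build_case_mart, case_type) are hypothetical.
import json
from collections import defaultdict

# The "lake": raw records land here as-is, with no schema enforced on ingest.
data_lake = []

def ingest(raw_record: dict) -> None:
    """Load the record untransformed into the lake."""
    data_lake.append(json.dumps(raw_record))

def build_case_mart() -> dict:
    """Transform on demand: derive a small mart (counts by case type)
    from the raw lake only when a report or model needs it."""
    counts = defaultdict(int)
    for line in data_lake:
        record = json.loads(line)
        counts[record.get("case_type", "unknown")] += 1
    return dict(counts)

ingest({"case_id": 1, "case_type": "unfair dismissal"})
ingest({"case_id": 2, "case_type": "enterprise agreement"})
ingest({"case_id": 3, "case_type": "unfair dismissal"})
print(build_case_mart())
```

The design choice mirrors the quote: because no transformation happens on ingest, new source data is available immediately, and the modelling work is deferred to the marts, which can be rebuilt and "churned over" quickly.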
He also attributed part of the success in the end to being able to communicate to key stakeholders the reason for the project and keeping them updated during the process.
"It was key for us to communicate strongly the benefits of the new architecture, which meant quicker time to value in the data modelling and in the delivery of dashboard reporting. That bought us time and goodwill, particularly from key stakeholders."