Commonwealth Bank of Australia (CBA) CTO Matt Pancino said that, having reflected on the slew of regulatory issues the bank has faced over the last two years -- from the findings of the Banking Royal Commission to those of the Australian Prudential Regulation Authority's (APRA) inquiry into the bank -- it has begun to look at how it can use data "the right way" to restore customer trust.
"Personally, and the way the bank views it -- and the way to think about our bank strategy -- is we will have to treat customers' data with the same care and diligence as we manage their money. That's what the future of trust actually looks like for us," he said.
As a first step, Pancino said the bank has rewritten its data policies based on global enterprise data management standards, which he described as being like an "IT blueprint of how you should architect your systems for good data practice".
He said while the tech team is now charged with implementing these standards by documenting critical data elements, there is still a long way to go.
"If you work in a complex environment it won't be a surprise that some of the critical data elements we have require multiple source systems, so you have manual controls to be able to make sure you can verify that critical data element," Pancino said.
"Manual control is a euphemism for a spreadsheet.
"Clearly, these are ineffective, unacceptable, and it's the job of technology and architecture to drive these manual controls out. We have to drive it from an architecture perspective."
Pancino admitted that part of the root cause of CBA's past data failings was that the company did not have a proper understanding of its data, despite spending "hundreds of millions of dollars" purchasing and building the infrastructure to host it and hiring data scientists to analyse it.
"We bought the storage, we bought the software, we enabled these capabilities to come about. But my question is did we take enough time to understand the data we were pumping into the [data] lakes and [data] warehouses [coming] from our core systems, our legacy platforms, our heterogeneous capabilities?
"Did we understand where the data was taken from? Did we understand who was collecting it? Was it accurate? Were only the people supposed to be looking at it as part of their jobs actually looking at it? Were we securing it? Were we encrypting it? When it was used, were we deleting it? Did we have the right control environments as we set down this path? You'd have to argue that we didn't.
"Because when the crisis hit, you're in this strange situation where data in the enterprise, in the IT systems, is still hard to wrangle, and yet we've spent millions of dollars on the right hand side."
Speaking at the Future of Financial Services event in Sydney on Thursday, Pancino also took the opportunity to highlight some of the ways the bank has started to use "data for good".
He said one example is how the bank analyses the 7.5 million logins to its mobile app that occur each day to help customers make better financial decisions.
Pancino also pointed to the bank's customer engagement engine as another.
"It drives 'next best conversations' that help nudge customers and alert them to particular events, such as when bills are due or how to avoid fees," he said.
"We built that so it interacts across every single one of our channels, call centre, branch network, ATMs, mobile app, and desktop. It analyses over 150 billion data points, where there are 200 million machine learning models in play and delivers 1.5 billion next best conversations per annum."
In addition, Pancino said the Benefits finder feature in its mobile app is another way the bank is using data "correctly".
"With Benefits finder, customers can proactively ask in our app what benefits they're entitled to, and depending on who you are and the data we know about you, we can match rebates to you," he said.
"Through the app, you can claim that benefit on the spot. That's adding real value to customers. We have 200 rebate types in product, and in the first 12 months we want to give $150m of benefits to our collective customer base -- and that's about using data correctly."
Westpac has taken similar steps, announcing in October last year that it had introduced a risk management system known as Juno to track compliance issues right across the bank, including logging all of its incident-related data.
Westpac CEO Brian Hartzer said at the time that issues of any sort across the bank go into that system and get reviewed regularly -- a very different picture from how things worked historically.
"Historically, the management of compliance incidents and the like was dispersed into different business units. While people would report them up and they would get aggregated and shown at various risk committees, it is true that they sat in different systems," he said.