M2M makes sense for DBS Bank

[CASE STUDY] Singapore bank is expanding its machine-to-machine infrastructure to capture customer interactions on mobile devices and social media in real time, which it hopes will improve service quality and customer experience.
Written by Kevin Kwang, Contributor

SINGAPORE--For DBS Bank, machine-to-machine (M2M) communication has long been a mainstay of its IT systems. A signal from an automated teller machine (ATM) to the bank's core system seeking permission to dispense cash from a customer's account is one basic example of how M2M underpins the local bank's daily operations.
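That authorization exchange can be pictured as a simple request/response flow. The sketch below is purely illustrative: the field names, data shapes, and balance check are assumptions for the example, not DBS's actual protocol or message format.

```python
# Illustrative sketch of an ATM-to-core authorization exchange.
# Field names and the balance check are assumptions, not DBS's real protocol.

def authorize_withdrawal(core_accounts, request):
    """Core system decides whether the ATM may dispense cash."""
    account = core_accounts.get(request["account_id"])
    if account is None:
        return {"approved": False, "reason": "unknown account"}
    if request["amount"] > account["balance"]:
        return {"approved": False, "reason": "insufficient funds"}
    account["balance"] -= request["amount"]  # debit the account on approval
    return {"approved": True, "new_balance": account["balance"]}

accounts = {"A-123": {"balance": 500}}
print(authorize_withdrawal(accounts, {"account_id": "A-123", "amount": 200}))
# -> {'approved': True, 'new_balance': 300}
print(authorize_withdrawal(accounts, {"account_id": "A-123", "amount": 400}))
# -> {'approved': False, 'reason': 'insufficient funds'}
```

In a real deployment this exchange would travel over a financial messaging standard such as ISO 8583 rather than Python dictionaries; the point is only that each transaction is a machine-to-machine round trip.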
These days, though, the bank is moving beyond merely communicating transactional data and expanding into collecting unstructured data from customers' mobile devices and social media accounts. This move will help the bank better understand and meet its customers' needs, said David Gledhill, managing director and head of group technology & operations at DBS Bank.

DBS Bank case study

What: Expanding machine-to-machine capabilities from transactional data to collecting and mining unstructured data from customers' mobile devices and social media accounts
How: Building a separate data repository for unstructured data and implementing analytics software from various vendors, including Teradata and Progress Software
Cost: "In terms of a percentage of our investment dollars, it's starting to become a significant amount," Gledhill said.
Results: Reduced costs, optimized its network, and improved customer experience

In an interview with ZDNet Asia, Gledhill said the number of mobile devices used to interact with the bank has "exploded". People used to come to the branch or visit an ATM for their banking needs, but with the Internet, they now use their mobile devices and social media accounts to engage with the bank too, he noted.

"The prize now is to understand what's going on across all of these touchpoints and, [using the collected data], help us improve our services, get to know the customer better, and allow us to sell our products in a timely way," he said.  
Building on existing systems
The executive said DBS' basic core transaction systems remain mostly the same, but the "bulk of its investment" has gone into creating a separate repository for the unstructured data it collects, and into working out how to store and analyze that information.
Gledhill revealed that the bank was already heavily invested in a data warehousing project using Teradata when he joined in 2008, which is why it chose to build on top of the existing infrastructure to expand its M2M capabilities rather than create something new.
Currently, Teradata is the core engine for its analytics library but the bank is also actively looking at other offerings from vendors such as IBM's Netezza and EMC's Greenplum appliances to improve its capabilities, he said. Progress Software provides the real-time, event-driven engine which plugs into DBS' existing TIBCO middleware layer, he added.
"We've got a lot of the building blocks in place today, and we're [now] trialing out a number of different things," the IT head said.
Asked how much the bank has spent to set these systems in place, Gledhill declined to give specific numbers since those are "proprietary". "What I will say is in terms of a percentage of our investment dollars, it's starting to become a significant amount. It's real, it's meaningful, and we believe there's value to be had."
Making sense of the data
The "tricky part", however, is figuring out what the bank should react to, and how, as it runs the risk of over-reacting and scaring customers with how much it knows about them.
"We're stepping somewhat cautiously into [analyzing customer behavior]. We don't want any of the Big Brother-type things happening when the customer says, 'Wow, how did they know that?'" Gledhill said.
He acknowledged that any customer transaction information is very sensitive, and the bank does restrict access to such data to only a certain group of people. To work around this, it looks at customer trends and other indicative sources of information such as location data.
"If a customer uses an ATM in a shopping center, and we know who they are--their sentiments and preferences--it's very easy to send them an offer to a merchant in that place and do so in real time," he said.
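A real-time, event-driven engine of the sort the bank describes could match an incoming ATM transaction event against stored customer preferences and nearby merchant offers. The rule below is a minimal sketch under assumed data shapes; the customer IDs, preference sets, and merchant catalog are invented for illustration and do not reflect DBS's Progress Software/TIBCO setup.

```python
# Minimal sketch of location-triggered offer matching on a transaction event.
# Customer preferences, merchant catalog, and event fields are assumed shapes.

PREFERENCES = {"cust-42": {"coffee", "books"}}

MERCHANTS_BY_MALL = {
    "Plaza Singapura": [
        {"name": "BeanHouse", "category": "coffee", "offer": "10% off"},
        {"name": "StyleCo", "category": "fashion", "offer": "2-for-1"},
    ],
}

def offers_for_event(event):
    """Return offers at the event's location that match the customer's preferences."""
    prefs = PREFERENCES.get(event["customer_id"], set())
    merchants = MERCHANTS_BY_MALL.get(event["location"], [])
    return [m for m in merchants if m["category"] in prefs]

event = {"customer_id": "cust-42", "location": "Plaza Singapura",
         "type": "atm_withdrawal"}
matches = offers_for_event(event)
print([m["name"] for m in matches])  # -> ['BeanHouse']
```

The open questions Gledhill raises next (how often to send, which offers, how to track response) would sit on top of a filter like this, for example as a per-customer rate limit before any offer is dispatched.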
However, the bank does wrestle with questions such as how often it should send such offers, what kind of offers it should send, and how it should track customer response. "A lot of those are still unanswered questions and we'll learn as we go about how much is too much in terms of intrusiveness," Gledhill stated.
One positive example of how the bank has used M2M communications to improve service quality is by "listening" to what all of its ATMs are saying, in order to reduce maintenance costs and ensure customers always have access to their funds with minimal inconvenience, he pointed out.
The IT chief said the bank keeps a schedule of planned downtime for its ATMs to refill them with cash and carry out maintenance, but such downtime is "very expensive" for the bank, and it is inconvenient for customers if an ATM runs out of cash.
Now, every single transaction from every ATM is sent to its Teradata warehouse, where advanced forecasting predicts when a machine will run out of cash seven days in advance. The bank also has a whole histogram of when to replenish these ATMs, he added.
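In its simplest form, a cash-out forecast like the one described projects days of cash remaining from recent withdrawal history. The moving-average approach below is a hedged illustration under a flat-demand assumption; DBS's actual Teradata model is not public, and the window size and figures here are invented.

```python
# Naive cash-out forecast: average the daily withdrawals over a recent window,
# then project how many days until the ATM's cash runs out.
# The 7-day window and flat-demand assumption are illustrative choices only.

def days_until_cashout(current_cash, daily_withdrawals, window=7):
    """Estimate days of cash remaining from recent daily withdrawal totals."""
    recent = daily_withdrawals[-window:]
    avg_daily = sum(recent) / len(recent)
    if avg_daily == 0:
        return float("inf")  # no recent demand; no predicted cash-out
    return current_cash / avg_daily

# Example: last 7 days of withdrawals (in dollars) and $91,000 left in the ATM.
history = [12000, 15000, 11000, 14000, 13000, 16000, 12000]
remaining = days_until_cashout(91000, history)
print(round(remaining, 1))  # -> 6.8, i.e. schedule a refill within the week
```

A production system would also account for weekly seasonality (paydays, weekends) and holidays, which is presumably where the "advanced forecasting" the article mentions goes beyond a simple average.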
"The project cost the bank a couple of million dollars to build, and payback was 18 months. We've reduced cashouts--that is, the number of times a machine is out of money--by 80 percent. We've reduced customer complaints dramatically. The whole experience for the customer is much better.
"So it has saved me cost. It has optimized our network. It has improved customer experience. It was an experiment when we started, but it has really, really proven to be very effective," stated Gledhill.