Singapore has repeatedly emphasised the need for trust so that the adoption of new technology can thrive, but its move to widen business access to user data -- amidst continuing security breaches and slips -- poses worrying risks ahead. There is an urgent need to ensure users have stronger control of their personal data, especially as the government itself will need to restore public trust following a major gaffe involving the country's COVID-19 contact tracing efforts.
Singapore in recent years has been opening up access to citizen data as part of efforts to facilitate business transactions and ease workflow. Just last November, the Personal Data Protection Act (PDPA) was updated to allow local organisations to use consumer data without prior consent for some purposes, such as business improvement and research.
Amongst the key changes is the "exceptions to consent" requirement, which allows businesses to collect, use, and disclose data for "legitimate interests", business improvement, and a wider scope of research and development. In addition to existing consent exceptions, which include investigations and responding to emergencies, these now cover efforts to combat fraud, enhance products and services, and carry out market research to understand potential customer segments.
Businesses also can use data without consent to facilitate research and development (R&D) that may not yet be marked for productisation.
Concerns were raised that the amendments, specifically with regards to exceptions and deemed consent, were too broad and might be abused by organisations. "Legitimate interests", for instance, can be viewed from an organisation's perspective, and its assessment can be subjective when weighing whether these interests outweigh potential adverse effects on an individual, which is a requirement outlined in the amendment.
And while individuals can still withdraw consent after the opt-out period, how can they do so when they're not even aware they've been opted in to begin with? Under the "exceptions to consent" rule, are businesses required to inform consumers that their data will be used and how it will be used?
Singapore's Communications and Information Minister S. Iswaran has explained that data is a key economic asset in the digital economy as it provides valuable insights that inform businesses and generate efficiencies. It also empowers innovation and enhances products, and will be a critical resource for emerging technologies such as artificial intelligence.
I totally get that; after all, access to data is what powers APIs (application programming interfaces) and fuels market competition.
However, consumers need to be given the ability to decide by whom and how their own data can be accessed, because for-profit businesses, when given a free buffet, will inevitably seek to grab as much as they can.
My bank, for instance, is planning to phase out use of its physical token as a two-factor authentication option and transition fully to digital tokens. This means customers like me will be forced to download the bank's mobile app, with which the digital token is integrated, just to authenticate our identity and access any of its online banking services.
The key frustration here is that the bank's app wants a whole host of permissions, including the ability to read my contact details as well as access to my phone's Bluetooth settings and location data.
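To illustrate, Android apps must declare the permissions they may request in their manifest, so the kinds of access described above would show up as entries along these lines (a hypothetical sketch with an invented package name, not the actual manifest of any real banking app):

```xml
<!-- Hypothetical AndroidManifest.xml excerpt showing the kinds of
     permission declarations described above. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
          package="com.example.bankapp"> <!-- hypothetical package name -->

    <!-- Grants read access to the user's entire address book -->
    <uses-permission android:name="android.permission.READ_CONTACTS" />

    <!-- Bluetooth access -->
    <uses-permission android:name="android.permission.BLUETOOTH" />

    <!-- Precise (GPS-level) location data -->
    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
</manifest>
```

On Android 6.0 and later, contacts and location are classed as "dangerous" permissions that must also be granted at runtime, which means users can deny or revoke each one individually in the system settings without uninstalling the app -- a practical way to keep access on a need-to-have basis.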
Any external access to my personal data should be restricted to a need-to-have-only basis. I deem this practice essential in mitigating my security risks, especially as cyber threats are increasingly sophisticated and data breaches seemingly inevitable.
If major companies such as Lazada's RedMart and Grab can overlook security loopholes that resulted in breaches and compromised customers' data, what else are smaller businesses with much more limited resources failing to plug, even as they collect more of consumers' personal information?
Ill-thought-out business decisions and security lapses can erode confidence, and when consumers no longer trust that their personal data will be protected and used responsibly, they will pull back on adopting new digital services and technologies. This can have adverse economic as well as social impact.
Singapore should know this best, since public trust took a severe hit when it was revealed the country's COVID-19 contact tracing data could, in fact, be accessed for various purposes other than for its original intent.
The government early this month admitted law enforcers could use the TraceTogether data to aid in their criminal investigations, contradicting previous assertions that contact tracing information would only be accessed if the user tested positive for the virus.
The revelation triggered much public outcry, with some threatening to circumvent the data collection by deactivating the TraceTogether app, turning off their phone's Bluetooth connection, or placing their device including the TraceTogether token into an RFID-blocking pouch.
Much already has been said about the whole saga so I won't comment on it further, but there are important lessons here for everyone, especially the government.
Topmost, it now must realise that large sections of the local population do care about their personal data and privacy, and will choose to defend it when they're able to. This should send a strong signal that serious, rather than token (pardon the pun), consideration is needed with regards to how citizens' data is treated before policies are rolled out.
There clearly needs to be a mindset change in how the government operates and works on nationwide projects. A multi-ministry taskforce had been set up to deal with the COVID-19 pandemic, with contact tracing efforts often taking centrestage. Yet, months passed after TraceTogether was launched without any of the ministries, or even the police, which presumably would be most familiar with the Criminal Procedure Code, raising the alarm that repeated public statements about the use of contact tracing data had failed to account for exceptions for criminal investigations.
At worst, this could be perceived -- even if wrongly -- as a deliberate attempt to deceive the public. At best, it would indicate gross carelessness and lack of communication between the different ministries and government agencies tasked to work on critical national initiatives, such as the COVID-19 pandemic.
The TraceTogether privacy saga further demonstrates the need for users to have stronger ownership of their own data, so they can continuously ask questions about how their personal information is collected, stored, and used, as well as take active steps to safeguard their own cyber hygiene.
Because if they don't, it's now clear that neither businesses nor governments can be expected to do so effectively on their behalf. What other loopholes and potential security gaps have been overlooked that can lead to serious data breaches down the road?
Such risks can be better mitigated if users were let in on efforts to manage their own data and empowered to decide for themselves whether businesses should, or should not, have access to all or some of their personal data.
In addition, every announcement about new policies that involve access to citizens' data should be accompanied by a security factsheet detailing exactly how access will be protected and how data will be stored and safeguarded. Declarations about the need to secure data should be more than lip service, going beyond brief one- or two-liners uttered merely as "business as usual" attempts to address security concerns.
"People, Process, Technology." Isn't that the basic framework oft cited by businesses and governments as critical to successful adoption? Establishing the processes and technology will mean nothing if users aren't properly equipped to help defend their own data.