Australian government's recklessness with medical data is symptom of deeper problems

The revelation that supposedly anonymous medical data can be re-identified tops off a year of data governance incompetence by the Australian government. But will there even be a response, let alone a fix?

If you work with data -- analysing it, I mean, not just shoving it along the pipes -- then it should come as no surprise that researchers at the University of Melbourne managed to re-identify supposedly anonymous medical data.

Nor should it be a surprise that the official government response was to downplay the risk.

"The Department of Health takes this matter very seriously," began the message that a departmental spokesperson sent ZDNet on Monday, echoing every corporate mea culpa ever.

The department had referred the problems with this health dataset to the privacy commissioner a year ago, and now says that it has taken unspecified "further steps to protect and manage data".

"The department has not been aware of anyone being identified," they finished, as if that somehow excuses them. After all, the dataset is out there in the wild, having presumably been downloaded at least once before being taken offline.

Imagine this: We take child safety very seriously. It is regrettable that there were no fire extinguishers in the kindergartens. We are not aware of anyone having been burnt thumbsup.gif.

This isn't the first time that the Australian government has glossed over data governance issues, of course.

The flagrant misuse of clearly tainted Centrelink data has caused distress to thousands of vulnerable citizens who were falsely told they owed the government money, while the responsible minister brushed off the huge false positive rate as acceptable collateral damage.

Meanwhile, the vacuum cleaner of surveillance for ill-defined "national security" purposes continues to grow unabated and with almost no public debate.

Matt Tait, who tweets as @pwnallthethings, once worked at Britain's Government Communications Headquarters (GCHQ), the UK's equivalent of the US National Security Agency (NSA). Earlier this month he pointed out the massive difference in attitude between private industry, running what some commentators are calling "surveillance capitalism", and the Five Eyes spookland.

"I started my career at GCHQ (& v close w/ NSA), then worked with US tech cos. Was a huge culture shock for me to go from privacy being at the center or backdrop of nearly every conversations to an industry that is cavalier -- reckless even -- in its casual disregard for user privacy," Tait tweeted.

"We used to have conversations about privacy *constantly* at GCHQ. In industry, devs would casually watch celebrities using their systems, push updates that turned on your laptop microphone without warnings, or intentionally overcollect 'just in case we need it later'," he added.

For mine, the Australian government's attitude seems much, much more like the surveillance capitalism model than the seemingly responsible attitudes of the spooks. I'll get pushback from privacy advocates for that comment, and yes, spookland isn't a honey-dripping wonderland of delight. But at least here in the Five Eyes nations there's a set of rules that spookland tries to follow, at least most of the time.

The Australian government seems to be forgetting who it's meant to be serving.

I've written previously about its inside-out digital health strategy, for example. There's talk of putting the patient at the centre, but it doesn't look like it. It looks more like it's designed to create a data product to sell, and a "working" system to deflect any charges that it's a multi-billion dollar boondoggle that'll do little to improve our health.

Why do we start building these systems without thinking through the implications?

Constellation Research vice president Steve Wilson recently brought attention to a line from an episode of ABC Radio's Future Tense. Lachlan McCalman, senior research engineer at CSIRO's Data61, was talking about building ethical decision-making into algorithms.

"I want to say that this is still a new area of research, there's a lot of unanswered questions. And ... even just writing down what it means to behave ethically is an unsolved problem that we've been dealing with for as long as we've had philosophers and thinkers," McCalman said.

"But at the end of the day we do have to build these systems, and so we came up with a practical set of guidelines to think about them."

No, mate. No we don't have to build these systems. We don't have to build anything.

"This is depressing," Wilson tweeted.

Yes, very depressing.

Such build-it-because-we-must zealotry is almost understandable in a commercial organisation. Almost. But in government, it's inexcusable.

Related Coverage

Clear-cut definition of de-identified data critical in legislation: Pilgrim

Australia's Privacy Commissioner has said the de-identification of data is an area requiring regulation, and that agreed industry standards could be useful in giving the public confidence.

Australian government unveils open data framework for cities

The government has launched a database with innovation, digital opportunities, governance, infrastructure, investment, sustainability, jobs, skills, and housing information on the nation's most populous cities and regions.

Brandis to criminalise re-identifying anonymous data under Privacy Act

The Australian government will introduce amendments to the Privacy Act to criminalise the re-identification of de-identified data, with the law to take effect from Wednesday.

Review asks for tighter Medicare card privacy controls from Human Services

Moving the authentication platform, educating citizens, and stricter privacy controls were among the steps recommended to the Department of Human Services by a review into health providers' access to the Health Professional Online Services system.

Senate committee to probe how personal Medicare details appeared on dark web

The committee is to report by October 16 on how Medicare details appeared on the dark web.