
Researchers label Australian data-sharing legislation a 'significant misalignment'

The proposed legislation has been called out for prioritising the perceived greater good instead of respecting the minimal rights of the individual.
Written by Asha Barbaschow, Contributor

A team of researchers from the University of Melbourne has called the government's proposed data-sharing legislation a "significant misalignment", concerned that by failing to acknowledge and protect the fundamental right to privacy of individuals, it leaves little chance of establishing trust in the use of public data.

The remarks come via a submission [PDF] to the Issues Paper discussing the Australian government's Data Sharing and Release Legislation, which opened for consultation in July.

They said the purposes currently laid out under the "Purpose Test" for data-sharing prioritise the perceived greater good instead of respecting the minimal rights of the individual.

"'Consent' does not make one appearance in the proposal, while being a central tenet to privacy best practice, is indicative of significant misalignment," they wrote.

The submission was penned by Dr Chris Culnane, acting Professor Benjamin Rubinstein, and Dr Vanessa Teague -- the same research team that re-identified the Medicare Benefits Schedule and Pharmaceutical Benefits Scheme data in September 2016 and reported in December that further information, such as the medical billing records of approximately 2.9 million Australians, was potentially re-identifiable in the same dataset.

Their submission explains that neither data releases nor legislation exist in isolation, and suggests the government consider improving privacy legislation by adopting provisions from the GDPR, as one example.

See also: How Europe's GDPR will affect Australian organisations

"Anything else is likely to be counterproductive by weakening already inadequate protections and rights of citizens and consumers," the submission says.

The researchers are concerned that, in shaping its legislation, the government's focus is on the entity that collected the data, suggesting the concern should instead be the type and nature of the data that is collected.

"Any entity that collects sensitive personal data, for example, medical data, should be excluded from the default sharing and release arrangements," they continued. "Such data should only be shared when those for whom the data relates have provided consent. Such consent should be revocable at any time, for example, by adopting a dynamic consent approach."

The federal government is hoping to reform the Australian data system, announcing in May that it would invest AU$65 million in initiatives such as the country's new Consumer Data Right (CDR), which will allow individuals to "own" their data by granting them open access to their banking, energy, phone, and internet transactions, as well as the right to control who can have it and who can use it.

In order to implement the changes, the government needs to overhaul its legislation, at the time labelling existing public service data-use arrangements as complex and a hindrance to the use of data.

The Issues Paper first asked whether the considerations it set out were sufficient to shape the legislation. The team of researchers from the university said no, noting that not all factors for guiding such important legislative development have been taken into account in the proposal.

"Privacy and data protection are given inadequate consideration by the proposal in its present form," they wrote.

Another question asked by the Commonwealth was whether existing secrecy provisions should prevail.

"To override existing secrecy provisions risks serious erosion of public trust, and would likely have a long term negative impact on trust in data use and the role of government more broadly," the researchers wrote.

"The current public discussion around My Health Record data does not indicate that it would be popular to override the (already limited) protection in that act. We highlight that the proposal seeks to honour commercial agreements -- stating that 'Existing contractual obligations, including around purchased data sets, will continue to apply' -- yet proposes breaking the legislative secrecy obligations that apply to the public."

Must read: Privacy advocates have failed to engage on My Health Record

The researchers also called out the proposed legislation for failing to be succinct. They said that not all data-sharing is beneficial, noting that sharing data carries significant risks of individual or group harm if the data is exposed.

The team also highlighted the importance of distinguishing genuine scientific research in the public interest from commercial research for financial gain.

Where data safeguards are concerned, the Commonwealth asked respondents whether the Five-Safes framework was an appropriate mechanism; however, the researchers highlighted that the framework itself provides no protection, noting it is solely a decision-making framework for planning and evaluating a prospective release.

"Since the Five-Safes framework does not provide any actual protection, but merely provides a framework to facilitate the thought process, it is difficult to think of any instances where it could not be applied," they wrote.

"What is more concerning is the lack of appreciation of what Five-Safes actually provides: It is the techniques applied to achieve the notional 'safety' that are critical, not the framework itself."

When it comes to accessing data, the researchers said it should only occur in environments that are both physically and digitally secure, adding that such facilities should be air-gapped and kept offline.

Discussing the barriers to data sharing and release, as the government called them in its Issues Paper, the submission notes that where such barriers do exist, they are often there for good reason -- to protect the privacy of personal data.

As a result, the team wants to see the establishment of a public register detailing which recipients have received which public datasets, and why.

The researchers also want to see the introduction of strict penalties for the misuse of data.

"The suggestion that there should be immunity for good faith indicates a failure to understand the serious consequences for an individual if their data is exposed," they wrote.

"There may be very serious harms including discrimination, exclusion from credit, or even exposure to family violence. It is inadequate to say that there shouldn't be a penalty as long as these harms were caused 'in good faith'.

"No such immunity should exist. A lack of severe consequences will breed complacency in the handling of data."
