Young adults born into families of low socioeconomic status can use an algorithm built on government-collected data to determine whether they should bother with tertiary education, a representative from the Victorian Tertiary Admissions Centre has revealed.
It was alleged at the Data + Privacy Asia Pacific conference in Sydney on Wednesday that one unnamed company is looking into using "very detailed information" about socioeconomic backgrounds, among other things, to create platforms where young people can plug in their data and receive a determination on how likely they are to succeed at university.
"This group has been working on programs that educators might be appalled at, where kids would type in and it will say to them, 'Well, based on your socioeconomic background and your parents' history, etc, you won't do very well at university' ... or your odds are low," the admissions centre representative explained.
Much of the data collected by tertiary admissions centres across Australia is mandated to be passed on to various government departments for a variety of reasons, but when that data is provided to other companies, a discussion on ethical stewardship needs to be had, according to Dr Simon Longstaff from the Ethics Centre.
"I suspect that government -- in this case as in many -- has acted without any thought of the ethical dimension," Longstaff said. "Most of the time things go wrong in ethics, not because people are wicked ... but because 'everybody is doing it'."
Longstaff said this kind of data use wreaks havoc in the world, and questioned whether the government entities responsible for the data are taking action and apologising for misunderstanding the company's purpose in collecting it.
Speaking with ZDNet, Anna Johnston, former New South Wales Deputy Privacy Commissioner and director at Salinger Privacy, said this type of data analytics has both a positive and a negative impact. She said questions need to be asked around why the data was collected in the first place, and then what universities -- or individuals -- are going to be doing with that information.
"The right thing to do is to use that information to develop early intervention strategies for the students who need it," Johnston said.
On the flipside, however, a commercially minded university might say it simply won't enrol the students with the highest risk of failure.
"That's obviously a discriminatory outcome -- an unethical and unlawful outcome -- but that's the potential end-point for this type of analytics," she explained. "You also need to think about the dangers of profiling and the pre-destination of students."
Johnston believes everyone needs the ability to prove the algorithm wrong.
She said that even if information gathered on an individual is intended to be used positively, basing it on characteristics fixed at birth puts someone in a box before they've had the chance to make their own life decisions.
"If you're trying to develop risk profiles or risk indicators ... you want to calculate their risk of failure based not just on the background they've come from but their actual behaviour once in the institution, so exam scores, attendance rates, then target intervention for students that need it most," she added.
"If a student is told on their very first day of university that they're at risk of failure because they came from the wrong postcode, that can become a devastating self-fulfilling prophecy for that student, and instead what you want to do is say, for anyone that is struggling at this, 'here's some resources we've got to help you'."
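The distinction Johnston draws -- scoring students on in-institution behaviour rather than background, and using the result to offer support rather than refuse enrolment -- can be sketched in code. This is a purely hypothetical illustration, not any system described at the conference; the function names, weights, and thresholds are all invented for the example.

```python
# Hypothetical sketch of behaviour-based risk flagging, in the spirit of
# Johnston's comments: score students only on signals from their time at
# the institution (exam results, attendance), never on background
# attributes like postcode or parental history, then target support to
# those who need it most. Weights and thresholds are illustrative only.

def risk_score(exam_avg: float, attendance_rate: float) -> float:
    """Return a 0-1 risk indicator from behavioural signals alone.

    exam_avg: average exam mark, 0-100.
    attendance_rate: fraction of classes attended, 0-1.
    """
    # Low marks and low attendance both raise risk; weights are arbitrary.
    exam_risk = 1.0 - exam_avg / 100.0
    attendance_risk = 1.0 - attendance_rate
    return 0.6 * exam_risk + 0.4 * attendance_risk

def students_needing_support(records: dict[str, tuple[float, float]],
                             threshold: float = 0.5) -> list[str]:
    """Flag students whose risk score exceeds the threshold, so they
    can be offered resources -- not refused enrolment."""
    return [sid for sid, (exam, att) in records.items()
            if risk_score(exam, att) > threshold]

records = {
    "s001": (75.0, 0.9),   # doing fine
    "s002": (40.0, 0.5),   # struggling -> flag for an offer of help
}
print(students_needing_support(records))  # -> ['s002']
```

Crucially, nothing about a student's origins appears in the inputs, and the flagged list feeds an intervention, which is the "right thing to do" with the data in Johnston's framing.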
Data Governance Australia (DGA), the independent body tasked with establishing industry standards around data, launched a draft Code of Practice for public consultation last month, as part of its effort to set industry standards and benchmarks for the responsible collection, use, management, and disclosure of data.
The draft code [PDF] places a heavy focus on doing "no harm" to the customer, highlighting this means organisations would be required to "use best endeavours" to ensure they do not cause harm to an individual as a result of the collection, use, or disclosure of their personal information.
However, as the DGA has no legal power over organisations, it would have to rely on their moral and ethical character to uphold the rules its code sets out.