Children's social media accounts would default to highest privacy settings under Australian social media probe's recommendations

The social media probe found that the ultimate burden of providing online safety for Australians should be placed on social media companies.
Written by Campbell Kwan, Contributor

Social media companies would be required to set the default privacy settings for children's accounts to the highest levels, and all digital devices sold in Australia would have to include optional parental control functionality, under new recommendations made by the federal parliament's social media and online safety committee. The committee published the findings of its social media probe in a report last night.

The findings follow months of public consultation where the committee heard from online abuse victims, social media companies, and Facebook whistleblower Frances Haugen among others.

The bipartisan committee provided these recommendations, along with 24 others, as it found social media companies could be doing more to protect the privacy of young people online.

In the report, committee chair and Liberal MP Lucy Wicks said the balance of responsibility should be shifted to place "the ultimate burden of providing safety" on social media companies.

"For too long social media platforms have been able to 'set the rules', enabling the proliferation of online abuse on their spaces," Wicks said.

As part of placing more responsibility on social media companies to reduce online harm, the committee also recommended that the eSafety commissioner both conduct a review of the use of algorithms on digital platforms and require these companies to report on their use of algorithms.

These reports would detail evidence of harm reduction tools and techniques to address online harm caused by algorithms, the committee wrote in the report.

"More transparency is required of social media companies to demonstrate that these concerns are being addressed," the committee wrote in the report.

"Developing an effective transparency framework for the social media platforms is a complex challenge for policy makers and regulators, but an important one to address. In many senses it is the policy intervention that the success of many other interventions rests upon."

More regulation focused on transparency could be on the way too, with the committee repeatedly labelling it a complex policy issue that needs to be addressed. In calling for further regulation, it recommended that various agencies examine "the need for potential regulation of end-to-end encryption technology in the context of harm prevention".

The committee did not lay blame solely on social media companies for toxic behaviour online, however, with its probe finding that government and society at large can also do more to prevent online harm.

It also said the federal government should "significantly increase funding" to support victims of technology-facilitated abuse through existing government-funded programs. Examples of this support include additional funding for specialised counselling and support services.

In addition to the aforementioned algorithm review, the committee recommended a slew of other reviews, including a digital review into the need for, and possible models of, a single regulatory framework under the Online Safety Act. During its probe, the committee heard from witnesses that the federal government's various proposed online laws, ranging from the anti-trolling Bill to the online privacy Bill, should fall under one regulatory framework.

The committee also called for the federal government to establish another parliamentary committee after the federal election, tasked with examining internet, online safety, and technology matters.

For future reviews of the recently passed Online Safety Act, the committee recommended that they consider the implementation of Safety by Design Principles on major digital platforms, including social media services and long-standing platforms that would require retrospective application of the principles.

The committee also wants the country's eSafety commissioner to receive more funding, with the extra resources to be put towards education and awareness campaigns, examining the extent to which social media companies actively prevent online harms, creating a national strategy for children's online safety education, and drafting new industry codes and implementing the Basic Online Safety Expectations.

The committee's recommendations will now go to the federal government, which is looking to push the passage of its proposed anti-trolling laws before the federal election.

Labor Senator Kim Carr noted last week, however, that the Bill is unlikely to pass before the federal election, as only three sitting days are expected before the election is called.

The Senate referred the anti-trolling law for inquiry last month, with a report of its findings expected next week.
