ACMA calls on Facebook, Google, Twitter to develop Australian misinformation code

The regulator wants all digital platforms to work together on the development of a code of conduct to counter misinformation and have it in place by Christmas.
Written by Asha Barbaschow, Contributor

The Australian Communications and Media Authority (ACMA) wants digital platforms to work together on the development of a single, industry-wide code of conduct to counter misinformation, centred on the consumer.

"The ACMA encourages platforms to consider a single, industry-wide code that provides appropriate protections and remedies for Australian users of digital platforms. It expects this code will be consumer centric, readily accessible to the public, and fit-for-purpose for Australia," it wrote in a position paper [PDF] on the creation of a voluntary code or codes of practice on misinformation and news quality on digital platforms.

The ACMA expects the code to address misinformation across all types of news and information, including advertising and sponsored content, that is of a public or semi-public nature, distributed via digital platforms, and has the potential to cause harm.

It also expects the code to cover platforms' considerations of what constitutes quality sources of news and information, and how that is communicated to users.


The ACMA wants all digital platforms, including online search engines, social media platforms, and other major digital content aggregation services with at least one million active monthly Australian users to be covered by the code.

At a minimum, this would cover Facebook, YouTube, Twitter, Google Search, Instagram, Snapchat, TikTok, LinkedIn, Google News, and Apple News, but the ACMA is considering opening that further to virtual assistants and smart home devices such as Amazon Alexa, online forums and other internet communities like Reddit, podcast aggregators such as Spotify, and closed group messaging services, which would include WhatsApp.

"Given the voluntary nature of this process, it will be a matter for individual platforms to decide on whether they participate in the development of the code or choose to be bound by the code," the ACMA wrote.

"The ACMA would, however, strongly encourage all digital platforms with a presence in Australia, regardless of their size, to sign up to an industry-wide code to demonstrate their commitment to addressing misinformation."

It has recommended an "outcomes-based code", which would require "robust" performance reporting from participants.

"This approach also recognises that platforms have a range of existing business models and different measures for addressing misinformation. It may not be appropriate to assess each platform's performance against a single or uniform set of industry-wide performance metrics," the position paper says.

The ACMA expects the code would commit signatories to facilitate research, share relevant data, and undertake associated activities to improve understanding of misinformation in Australia.

"Platforms should consider ongoing avenues of collaboration between signatories, government, academia and other experts, and other relevant industries," it added.


The ACMA wants the group of platforms to begin developing the code promptly, with the aim of having it in place by the end of the year.

The ACMA also posited that the code should comprise a "robust, effective, and accessible complaints handling regime", giving users of digital platforms free access to an alternative dispute resolution process.

It's hopeful a representative body will be established to oversee the administration of the code.

The code, according to the ACMA, should boast mechanisms to reduce the impact of potentially harmful misinformation, seeking to protect users of the platform and the broader community from harms caused by misinformation distributed via platforms; empower users to identify the quality of news and information; and strengthen the transparency of, and accountability for, measures to combat misinformation.

On reducing exposure to harmful misinformation, the ACMA said digital platforms could use detection algorithms, employ independent fact-checking services, add human monitoring, flag or remove offending content, notify users who have shared offending content, and remove malicious accounts.

It also recommended that all platforms give users the option to flag content they consider incorrect or harmful.

Clear labelling of advertisements and sponsored content, or distinct formatting to clearly distinguish ads from news articles and other information, would also be beneficial, the ACMA said.

"Digital platforms should not be the arbiters of truth for online information. But they do have a responsibility to tackle misinformation disseminated on their platforms and to assist people to make sound decisions about the credibility of news and information," ACMA chair Nerida O'Loughlin said.

"We know that major platforms have stepped up their processes during the COVID-19 pandemic due to the prevalence of information potentially harmful to health and property.

"It's now time for digital platforms to codify and commit to permanent actions that are systematic, transparent, certain, and accountable for their users in addressing such potentially harmful material."


Minister for Communications, Cyber Safety and the Arts Paul Fletcher expects the digital platforms will work constructively with the ACMA to "set up long-term, transparent and accountable practices to better protect their users".

"Importantly, the code will be developed and implemented to preserve freedom of speech," he said in a statement. 

"Digital platforms will not become general arbiters of truth in our everyday conversations -- however they do have a role to play in protecting Australians from genuinely harmful misinformation. Some platforms understand this responsibility and are already taking ad hoc action on such content. This code will provide the transparency and accountability needed to maintain Australians' confidence that the right balance is being struck."

The ACMA has been tasked with overseeing the code development process and preparing a report on the adequacy of digital platforms' measures and the impact of disinformation more generally.

It falls under the work conducted by the Australian Competition and Consumer Commission (ACCC) as part of its Digital Platforms Inquiry, which in July 2019 made a total of 23 recommendations that covered competition, consumer protection, privacy, and media regulatory reform.

One of the recommendations was the development of a code to counter disinformation. At the time, the ACCC said in the event that an acceptable code is not submitted to the regulator within nine months of an announced government decision on this issue, the regulator should introduce a mandatory industry standard.

The consumer watchdog is leading the charge on a mandatory code of conduct to address "bargaining power imbalances" between news media businesses and digital platforms.

The government in April announced it was heading down the path of developing a mandatory code, in lieu of coming to an arrangement with digital platforms. In May, the ACCC released a concepts paper seeking feedback on how best to address the imbalance in bargaining position between news media and particular digital platforms, posing 59 consultation questions.

