eSafety thinks online platforms have done well in removing abhorrent violent content so far

Major online platforms have so far been quick to respond to Australian notices requesting the removal of abhorrent violent material.
Written by Campbell Kwan, Contributor

Under Australia's online content laws, content providers have been required to remove abhorrent violent material from their platforms since mid-2019, otherwise they risk being fined up to 10% of their annual global turnover. The laws, colloquially known as the AVM Act, also expanded the eSafety commissioner's powers, allowing the agency to send notices to online platforms requiring them to remove identified abhorrent violent material.

On Wednesday afternoon, representatives for Australia's eSafety commissioner said online platforms have generally complied with the laws.

"We've seen fairly rapid responses, certainly on the major platforms. We've been able to get on the phone with our colleagues in the platforms, explain the situation to them, notify them of the contents, and, in many cases, the material is removed within a matter of hours because it's clearly AVM," said eSafety online content manager Alex Ash, who appeared before Australia's Parliamentary Joint Committee on Law Enforcement.

In relation to smaller online platforms, Ash said eSafety has sometimes received responses within an hour of sending out notices, with some content services cutting off access to the material overnight by preventing Australian IP addresses from reaching their services. In other instances, responses to notices have taken weeks, but he told the committee that this was due to these services having fewer resources.

"What we found is that there is a very wide degree of resourcing and sophistication differences between platforms that can have real impacts on their capacity to first notice they've been contacted by the Australian regulator, take steps to consider the notice internally, and then take further steps to act on the notice. I think providing some flexibility there, as the legislation does, is a wise feature because it does then turn on whether or not the court decides on its assessment of facts whether or not the expeditious element has been made out or not," Ash told the committee.

The updates were provided to the joint committee, which is currently conducting an inquiry into the effectiveness of the AVM Act.

When asked by committee deputy chair Anne Aly whether the laws applied to content displaying one-punch attacks, the eSafety online content manager also clarified that the AVM Act's scope is confined to content where malicious intent is explicitly clear.

"I think it'll be difficult for us to infer intent, and necessarily make up murder. We'll always be in the position of the viewer when attempting to reconstruct the events that led up to the creation of a particular item of content," Ash said.

"If you look at the matters that we deal with through the AVM notice scheme, they're very clear partakers of violent terrorism: Where people have been beheaded, the murder of a 17-year old female, the torture of a person involved that was being flayed alive. They're the kinds of matters that fall so clearly and squarely within the scope of the legislation beyond doubt, and that tends to be where we focus our attention."

Earlier on Wednesday, Digital Rights Watch defended social media platforms and their efforts to remove abhorrent violent material online, telling the committee that companies should not be expected to be aware of all of this content at all times.

"I'm not sure what we gain by doing the sort of pressure on companies to wholesale remove all this content all the time," Digital Rights Watch programme and partnership director Lucie Krahulcova said.

She warned that levying excessive penalties on companies for hosting violent material on their social media platforms risks several decades' worth of footage from activists and journalists being taken down to avoid regulatory backlash.
