
Australia still working with tech giants on what constitutes abhorrent violent material

Following a discussion with the Attorney-General's Department on Australia's abhorrent video streaming laws, Google was still unsure of its actual obligations.
Written by Asha Barbaschow, Contributor

Australia's abhorrent video streaming legislation was rushed through Parliament in April, requiring hosting and content service providers to notify the Australian Federal Police (AFP) if their platform could be used to access particular violent material depicting conduct occurring in Australia.

The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Bill 2019 came in response to the Christchurch terrorist attack, during which a video of the attack was viewed around 4,000 times on Facebook and was only reported after being live for 29 minutes.

The Attorney-General's Department (AGD) in May held a meeting with relevant parties to discuss the legislation, at the time distributing draft fact sheets.

According to a handful of documents released in response to a freedom of information request, Google followed the discussion with AGD by submitting nine questions via email, seeking clarification on information contained in the fact sheets.

The search giant asked for further clarification on what actually constitutes abhorrent violent material, and also sought confirmation on whether the Act requires companies to breach US or other foreign laws.

In response, the AGD said the classification of abhorrent violent material is limited to very specific categories of the most egregious, violent audio-visual material produced by a perpetrator or their accomplice.

"It must stream or record conduct where a person engages in a terrorist act (involving serious physical harm or death of another person), murders or attempts to murder another person, tortures another person, rapes another person, or kidnaps another person," the AGD defined.

See also: Why the tech industry is wrong about Australia's video streaming legislation    

Notification to the AFP of such behaviour must occur whether the provider is based in Australia or overseas. AGD also said the Act does not require foreign companies to act in contravention of foreign laws.

"The Australian Attorney-General's consent is required before commencing a prosecution. This will ensure that any obligations under foreign laws can be taken into account," it wrote.

A provider can be prosecuted for failing to notify the AFP if it was aware its service could be used to access particular material, had reasonable grounds to believe the material was abhorrent violent material, and knew the conduct was occurring in Australia. Prosecution can also occur for failing to remove such material.

The failure-to-notify offence can attract a fine of up to 800 penalty units -- currently AU$168,000 for individuals and up to AU$840,000 for corporations. If a corporation fails to remove material, it faces a fine of up to 50,000 penalty units -- currently AU$10.5 million -- or 10% of the company's annual turnover, whichever is greater.

Individuals face up to three years behind bars and a AU$2.1 million penalty if they fail to remove material.
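For context on how those dollar figures fall out of the penalty-unit counts, here is a minimal sketch of the arithmetic, assuming the 2019 Commonwealth penalty unit value of AU$210 and the standard five-fold corporate multiplier for the notification offence; the 10,000-unit figure for individuals who fail to remove material is inferred from the AU$2.1 million amount rather than stated in the documents.

# Minimal sketch (Python): mapping Commonwealth penalty units to the
# dollar figures quoted above. Illustrative only, not legal advice.

PENALTY_UNIT_AUD = 210  # assumed 2019 Commonwealth penalty unit value

def fine_aud(units: int, corporate_multiplier: int = 1) -> int:
    """Convert a number of penalty units into an Australian dollar amount."""
    return units * PENALTY_UNIT_AUD * corporate_multiplier

print(fine_aud(800))        # 168000   -- failure to notify, individuals
print(fine_aud(800, 5))     # 840000   -- failure to notify, corporations (5x multiplier)
print(fine_aud(50_000))     # 10500000 -- failure to remove, corporations (or 10% of turnover)
print(fine_aud(10_000))     # 2100000  -- failure to remove, individuals (units inferred)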

See also: Why Australia is quickly developing a technology-based human rights problem (TechRepublic)

Following the discussion held in May, the fact sheet [PDF] has since been re-issued, along with a flow chart [PDF] aimed at helping providers determine whether material meets the reporting requirements.

The Act, as it stands, does not require providers to take steps to make themselves aware of abhorrent violent material, such as monitoring all content on their platform. But what constitutes a "reasonable time" for AFP notification remains undefined.

In addition, the Australian eSafety Commissioner can issue a notice to formally advise a content service or hosting service provider that their platform is prone to being used to access specified abhorrent violent material. This notice does not mean any offence has been committed; rather, it puts the provider on notice that its service can be used to access abhorrent violent material.

A Google spokesperson told ZDNet it continues to work with the Australian government on the implementation of the laws.

"We have zero tolerance for terrorist content on our platforms," they said. "Over the last few years we have invested heavily in human review teams and smart technology that helps us quickly detect, review, and remove this type of content.

"We continue to work with and provide feedback to the government on the implementation of the Australian abhorrent violent material law."

Google will have two members of the team that reviews and removes violent extremist content in Australia this month, and will meet with the AGD, and separately with the Department of Home Affairs, to discuss the legislation further.
