Canberra 'underwhelmed' with Facebook's live-streaming defence

After social media platforms allowed videos of the Christchurch terrorist attack to be streamed, the Australian government has told the companies they are not above the law, drafting new legislation that would apply criminal penalties to platforms that allow such activity.
Written by Asha Barbaschow, Contributor

The federal government is drafting new laws that seek to apply criminal penalties to social media platforms that allow videos containing serious offences to be streamed.

The new legislation comes in the wake of the Christchurch terror attack, and responds specifically to how long it took Facebook to remove the live-stream of the terrorist activity.

Australian Attorney-General Christian Porter said this was really the first instance where a "terrorist used a social media platform as a specific tool of their terrorist event to spread hatred, violence, and terror, and do it in a way where the social media platforms seem to have so little control or interest in controlling their content".

The government met with social media companies that have a presence in Australia on Tuesday, giving the likes of Facebook an opportunity to prove that they can act in the best interests of their users without the need for legislation.

However, following the meeting, Porter said efforts to "dissuade or discourage" the government from the view that legislation is needed to deal with the problem of live-streaming serious criminal offences were not sufficient.

"It is impossible to understate the consequence and significance of that particular issue of a social media platform being used by a terrorist perpetrator to spread terror, to spread violence, spread their crazed and fanatical message, and this was an opportunity to dissuade the government from a view that legislation may be needed to deal with that emergent issue," he told reporters following the meeting.

"And I must say that as an effort to discourage us from that view it was thoroughly underwhelming."

While he conceded that different platforms may take different approaches to try to prevent the live-streaming of serious criminal offences, Porter said there was "unfortunately nothing in that room that would discourage the government from looking at a legislative solution to try and ensure that much quicker action is taken when a live-stream involves the relaying of serious criminal offending".

"The time that it took Facebook to act with respect to the Christchurch events was totally unreasonable," he added.

"The fact that a company is based in San Francisco or Russia or wherever it might be, and it allows in a totally unreasonable way the live-streaming of an incredibly serious criminal offence, the fact that it might be difficult to police shouldn't stop the government from trying to make that unlawful because it's totally unacceptable."

Prior to the meeting, Prime Minister Scott Morrison said he was looking for the social media giants to come to the table as "responsible corporate citizens", and to essentially give a guarantee that their products are safe in Australia and do not pose a risk to national security.

"We want the same rules to apply in the online social media world that exist in the physical world," he said, discussing how a vehicle manufacturer needs to follow strict guidelines before an Australian can get behind the wheel of a car.

"Building and making it safe means you can't let a terrorist atrocity be filmed and up and posted and streamed and be online for 69 minutes -- 69 minutes -- that's not acceptable, that has to change.

"They can get an ad to you in half a second; they should be able to pull down this sort of terrorist material and other types of very dangerous material in the same sort of time frame."

Facebook last week published figures on the views and shares of the live-streamed footage, with Facebook VP and Deputy General Counsel Chris Sonderby saying in a blog post that the original video uploaded by the shooter was viewed fewer than 200 times during the live broadcast.

However, before being removed, the original video was viewed roughly 4,000 times in total. It took 29 minutes before a viewer finally reported the video.

Facebook said that after deleting the footage, the company "hashed it so that other shares that are visually similar to that video are then detected and automatically removed from Facebook and Instagram". It also used audio detection technology to automatically find copies of the viral footage.
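Facebook has not published the details of its matching system, but the general idea behind hashing "visually similar" copies is perceptual fingerprinting: unlike a cryptographic hash, a perceptual hash changes only slightly when a frame is re-encoded, recompressed, or brightened, so near-duplicates can be caught by comparing fingerprints. The sketch below is purely illustrative, assuming a simple average-hash over a pre-decoded grayscale frame; none of these names or choices reflect Facebook's actual implementation.

```python
# Illustrative sketch of perceptual hashing (average-hash variant).
# This is NOT Facebook's system -- just the general technique of
# fingerprinting frames so that visually similar copies match.

def average_hash(frame, size=8):
    """Compute a 64-bit average hash of a grayscale frame.

    `frame` is a 2D list of pixel intensities (0-255). A real system
    would decode actual video frames; here the pixels are assumed
    pre-decoded for simplicity.
    """
    h, w = len(frame), len(frame[0])
    blocks = []
    # Downsample to a size x size grid by averaging each block of pixels.
    for by in range(size):
        for bx in range(size):
            ys = range(by * h // size, (by + 1) * h // size)
            xs = range(bx * w // size, (bx + 1) * w // size)
            vals = [frame[y][x] for y in ys for x in xs]
            blocks.append(sum(vals) / len(vals))
    mean = sum(blocks) / len(blocks)
    # Each bit records whether its block is brighter than the mean.
    return [1 if v > mean else 0 for v in blocks]

def hamming(a, b):
    """Number of differing bits; a small distance means visually similar."""
    return sum(x != y for x, y in zip(a, b))

# A 16x16 gradient "frame" and a uniformly brightened re-upload of it.
original = [[(x + y) * 8 for x in range(16)] for y in range(16)]
reupload = [[(x + y) * 8 + 5 for x in range(16)] for y in range(16)]

h1, h2 = average_hash(original), average_hash(reupload)
print(hamming(h1, h2))  # prints 0 -- identical fingerprint despite the shift
```

Because the hash depends on each block's brightness *relative to the frame's mean*, a uniform brightness change leaves the fingerprint untouched, which is exactly the robustness a cryptographic hash like SHA-256 lacks.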

Despite these efforts, roughly 1.5 million copies of the video sprang up on the network in the first 24 hours after the attack. However, only approximately 300,000 copies were published, as over 1.2 million videos were blocked at upload, Facebook said.
