
Why the tech industry is wrong about Australia's video streaming legislation

The tech industry's predictable knee-jerk reaction to the government's knee-jerk 'social media' legislation highlights its moral bankruptcy. It's time to be part of the solution, not the problem.
Written by Stilgherrian, Contributor

Let's get something out of the way up front. The way the Criminal Code Amendment (Sharing of Abhorrent Violent Material) Act 2019 [PDF], often described as the "social media" laws, was rammed through parliament on Wednesday night was a disgrace.

A dying government in its final days needed to be seen to be doing something about the Christchurch terrorist attack being live-streamed and replicated on Facebook and elsewhere. So it conjured up some legislation and made it law without any chance for debate, let alone public consultation.

That's appalling, and this unseemly haste probably means the new laws are riddled with holes.

It's worth noting, too, that the Labor opposition could have chosen to vote against these laws, but didn't -- just as it waved through the controversial Assistance and Access Act in December. Enough said.

But set that aside, and look at the intent of the legislation as outlined in the explanatory memorandum [PDF]. What it demands is really just an extension of what companies should already be doing for other kinds of material, with added time pressure, and criminal penalties for the tardy.

The new laws apply to persons -- so individuals and organisations -- that are a "content service" or "hosting service". Those terms have much the same meaning as in the Enhancing Online Safety Act 2015, with some carve-outs.

That Act covers things like bullying material and so-called "revenge porn", and there is already a take-down regime for that kind of thing.

Then there's the small matter of child abuse material. There are criminal laws in every Australian jurisdiction that provide heavy penalties for the dissemination and/or possession of child pornography.

Every organisation is already at risk of criminal prosecution if it fails to remove child abuse material from its systems.

Finally, Schedule 5 of the Broadcasting Services Act 1992 sets out how "online services" as defined in that Act must deal with "prohibited content or potential prohibited content".

In other words, online operators of various kinds are already required to deal with a wide range of content matters. They should already have processes and procedures in place for handling them.

The new Act extends that obligation to deal with "abhorrent violent material", but that's defined in quite a narrow fashion.

The material, whether live-streamed or posted after the fact, has to have been produced by the actual perpetrators of the "abhorrent violent conduct" or their accomplices (section 474.31(1)(c)), and that conduct is defined as a terrorist act, murder or attempted murder, torture, rape, or kidnapping (section 474.32).

The service provider has several obligations:

  • When they become aware that their service can be used to access what they have "reasonable grounds to believe is abhorrent violent material" relating to conduct in Australia, they have to notify the Australian Federal Police (AFP) in a "reasonable time" -- unless they "reasonably believe" that the AFP already knows about it.
  • If they're hosting the material, they must "expeditiously cease hosting the material", and they're at fault if their "recklessness" means they fail to do that.
  • If they're a hosting service and receive a notice from the eSafety Commissioner that they're hosting abhorrent violent material, they could have to show why they weren't being "reckless" in hosting this material.

In all of this, it doesn't matter whether the service provider is based in Australia or not. That's a common misconception. It's about whether the service is accessible in Australia.

Another common misconception is that the new laws somehow restrict journalism or the investigation of human rights abuses.

The Act (section 474.37) lists a whole range of defences. They include when "accessibility of the material is necessary for enforcing a law" in Australia or internationally, or for monitoring compliance with the law; for the purposes of proceedings in a court or tribunal; when it's reasonably needed "for, or of assistance in, conducting scientific, medical, academic or historical research"; for news or current affairs reporting in the public interest; for "the development, performance, exhibition or distribution, in good faith, of an artistic work"; and more.

It's not often that one gets to praise an Australian Attorney-General, but it must be said that the current office-holder, Christian Porter, performed admirably on Melbourne radio 3AW on Thursday.

After pointing out how the Act does not apply to a range of scenarios suggested by the presenter, Porter clearly stated the laws' intent.

The problem with Facebook and Twitter is that this material goes up; in the case of the Christchurch material, it live-streams for 17 minutes. At the 29-minute mark, there's a complaint, and they do precisely nothing to remove it from their hosting service until after the New Zealand Police formally called them to tell them that it's on their site.

Now, I can't tell you precisely at what point in time it was reasonable for Facebook to act -- a jury would have to make that decision -- or when they were reckless as to the point that the material was on their website. But what I can say is it's totally unreasonable that this goes on for well over an hour and the rest of the world knows about it.

We very much hope that this will change behaviour of the major social media platforms with respect to their content.

For mine, this is not an unreasonable objective. But some of the tech industry's reactions sound as if they themselves were the victims of abhorrent violent conduct.

I don't want to pick on Atlassian co-founder Scott Farquhar specifically, but his tweets have been widely quoted, they're classics of the genre, and he's seen as an industry elder. So I'll use them as an example.

Farquhar says no one wants abhorrent material on the internet, but he hasn't explained how that aim might be achieved. It's just more of the industry's standard shouting about job losses, which everyone has heard before.

"If the material in question is uploaded and you don't take it down 'expeditiously', you can go to jail. What is expeditiously? Not defined! 'Who' in a company? Not defined!" he tweeted.

Well, as the explanatory memorandum says:

"Expeditious" is not defined and would be determined by the trier of fact taking account of all of the circumstances in each case. A number of factors and circumstances could indicate whether a person had ensured the expeditious removal of the material. For example, the type and volume of the abhorrent violent material, or the capabilities of and resourcing available to the provider may be relevant factors.

Technology-neutral and context-neutral laws, with the specifics of "reasonable" and "expeditious" backed by case law that takes into account the individual circumstances of the case, are normal.

I imagine that a hard-coded time frame, or a specific process defined by the lawmakers, would be even more objectionable to some in the tech industry.

And "Who?" in the company will be responsible? Surely that's up to each company to organise its own chain of command.

To be clear, no one is suggesting that content platforms should have humans moderating all content live. Ignore that straw man. But if platforms can't respond expeditiously, it's because they're choosing not to.

If an online service can choose to build a process whereby an engineer can respond to a technical alert within minutes, then they can choose to build a process whereby a content moderator can respond to a content alert within minutes.
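If that parallel seems abstract, here's a minimal sketch in Python of what routing a content report through the same kind of on-call escalation used for engineering incidents might look like. Everything in it -- the ContentAlert class, notify_on_call_moderator, and the five-minute timeout -- is a hypothetical illustration of the pattern, not any platform's actual system.

```python
# Hypothetical sketch: treating a report of abhorrent violent material
# like a production incident, with an on-call page and an escalation
# path. All names and thresholds are illustrative assumptions.
import time
from dataclasses import dataclass, field

@dataclass
class ContentAlert:
    content_id: str
    reported_at: float = field(default_factory=time.time)

def notify_on_call_moderator(alert: ContentAlert) -> None:
    # In practice this would page a human through the same channels
    # (SMS, push, phone) used for production outages.
    print(f"PAGE moderator: review content {alert.content_id} now")

def escalate_if_unacknowledged(alert: ContentAlert,
                               acknowledged: bool,
                               timeout_s: float = 300) -> None:
    # If no moderator acknowledges within the timeout, quarantine the
    # content pending review and page the next person up the chain,
    # rather than leaving it live by default.
    if not acknowledged and time.time() - alert.reported_at >= timeout_s:
        print(f"ESCALATE: quarantine {alert.content_id}, page duty manager")

# A user report on a live stream triggers an immediate page, exactly
# as a failed health check would for an on-call engineer.
alert = ContentAlert(content_id="stream-1234")
notify_on_call_moderator(alert)
escalate_if_unacknowledged(alert, acknowledged=False, timeout_s=0)
```

The point of the pattern is the default: a report nobody acknowledges gets escalated and the content quarantined, the same way an unacknowledged page does, instead of the material staying live until the police formally call.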

It'd be lovely if all the clever people could think about how they might do that, and how they might generate those alerts, instead of complaining that it's all too hard.

Let me repeat two key points for the hard-of-thinking. The process by which this legislation came into being is appalling. Given the speed of its passage, it's bound to be poorly drafted.

But if the tech industry is going to react to every bit of regulation with the hyperbolic narrative that it will cost jobs and destroy the industry, its opinions will soon be ignored. And rightly so.

If you don't want this material on the internet, what are you going to do about it?

The industry needs to choose to be part of the solution, not part of the problem.
