TikTok touts vigilance to prevent further distressing videos from appearing on its app

The controversial video-sharing platform also said it would happily notify Australian authorities of any misinformation takedowns, if it knew who to go to.
Written by Asha Barbaschow, Contributor

In late August, a video of a man dying by suicide was posted on Facebook. The graphic video spread across other platforms such as Instagram, Twitter, and YouTube, but it continued to appear on TikTok weeks later as the app struggled to remove the horrific content.

TikTok recently faced the UK House of Commons to explain how this happened, blaming "bad actors". On Friday, Australia's Select Committee on Foreign Interference Through Social Media continued this line of questioning by talking with the controversial app's local general manager and global chief security officer.

According to TikTok Australia and New Zealand general manager Lee Hunter, the live-stream was taken down, but copies had popped up faster than they could be detected.

"When we had our technology look at that video, we immediately took it down. But when my colleague in the UK was discussing this idea of these bad actors, unfortunately over the course of a week, we saw some 10,000 variations of that video trying to be uploaded to the TikTok platform," he explained.

How exactly these videos circumvented the platform's checks, the GM said he'd prefer not to say, explaining that doing so would highlight some of the methods that were adopted to evade detection. He did say, however, that as soon as TikTok knew of the video and encountered it, the company began "acting swiftly and aggressively" to take it down.

"It wasn't just a case of copies of the video, it was a case that -- I won't go into too much detail here because I don't want to provide any fuel for people to follow -- but splicing the content within other content so it seems innocuous at first and then you encounter it."

Hunter said the platform had tasked its systems with better detecting such content, and that its moderation teams around the world were all focused on addressing what had happened.

"Where we stand now, to the best of my knowledge, is that the content is not up on the TikTok platform. That's not to say that the vigilance stops," he said.

TikTok recently wrote to the heads of some of its peers, including Google, Facebook, Twitter, Pinterest, and Reddit, proposing a memorandum of understanding to enable the group of social media companies to share information to better protect against such content being made available on their respective sites.

"We can be better armed to prevent it from happening across a variety of platforms," Hunter said. "That type of collaboration across our peers we see is key. We all have the same goals of protecting our users. We all have the same goals and making sure that this content isn't up on our platforms."

With the committee focused on how social media could potentially harm Australia's democracy, the TikTok representatives were also asked whether the platform could be "undermined and infiltrated by bots and bad actors". The committee also asked how TikTok could be certain it has political influence under control.

"If you can't even protect kids from seeing suicide videos how on Earth are you going to protect the Australian voters from political interference," Greens Senator Sarah Hanson-Young asked.

"Unfortunately, with user-generated content platforms, there are attempts by these bad actors to upload distressing content of this nature. Now, the key isn't to pretend it doesn't exist but to act swiftly and to invest in technologies and people and moderation policies to enable it not to appear on the platforms to protect our users," Hunter said in response.

He said in regards to foreign interference, TikTok employs technology and moderation teams to help it understand when it does encounter coordinated inauthentic behaviour, and "looks to act swiftly upon that".

"The idea of misinformation and disinformation runs counter to our community guidelines and it's something we don't tolerate on the platform. We view this vigilance as ongoing and evolving and something you never stop trying to get better at," he said.

The TikTok representatives took on notice how many Australian accounts saw the suicide video and what the equivalent remuneration for TikTok's local operations was during the period the video was viral.

Earlier in his testimony, Hunter said the company's Australian operations wanted to collaborate with government as much as possible to protect Australian users. But his company was not clear on who it would contact if it were to remove coordinated inauthentic behaviour from Australian users.

Hunter was also asked if there was a requirement for his company to report such behaviour.

"I'm not aware of any requirement," he said.

The company's local director of public policy, Brent Thomas, stepped in to say he expected TikTok would report to "some combination of DFAT, Department of Defence, and Department of Communications", but admitted no request had been made of the video-sharing platform, nor had it received any clear instruction about who to notify and under what circumstances.

In August, Prime Minister Scott Morrison said the government had taken a "good look" at TikTok and found no evidence to suggest the misuse of any person's data.

"We have had a look, a good look at this, and there is no evidence for us to suggest, having done that, that there is any misuse of any people's data that has occurred, at least from an Australian perspective, in relation to these applications," he told the Aspen Security Forum.

"You know, there's plenty of things that are on TikTok which are embarrassing enough in public. So that's sort of a social media device."

Hunter and Thomas were both unaware the Australian government had taken any steps to review TikTok's operations down under. They said they had not been contacted by the Department of Home Affairs to provide any information or verify any concerns.

"I think it is quite incredible that a government department would undertake a security review of an organisation not requesting any information or input from them at all," committee chair Jenny McAllister said.

Thomas said while TikTok was aware a review was being undertaken, the first time it was made aware of the outcome was when it saw the public comments from Morrison.

IF YOU OR ANYONE YOU KNOW IN AUSTRALIA NEEDS HELP CONTACT ONE OF THESE SERVICES:

  • Suicide Call Back Service on 1300 659 467
  • Lifeline on 13 11 14
  • Kids Helpline on 1800 551 800
  • MensLine Australia on 1300 789 978
  • Beyond Blue on 1300 22 46 36
  • Headspace on 1800 650 890
  • QLife on 1800 184 527
