
Meta removes Facebook content after notification to do so under Singapore online safety law

A Facebook page containing child sexual exploitation content has been removed after the social media platform was notified to do so under Singapore legislation that kicked in just this year.
Written by Eileen Yu, Senior Contributing Editor

Meta has removed a Facebook page containing content deemed egregious, after it was notified to do so under Singapore legislation that kicked in just this year.

It marks the first time a social media platform has done so since the Online Safety Act took effect on Feb. 1.

Also: US Surgeon General releases social media health advisory for American teens and tweens  

Singapore's Infocomm Media Development Authority (IMDA) said in a statement Friday that it had "notified" Meta to review and remove a Facebook page and group containing child sexual exploitation material. IMDA later clarified that the social media platform had already done so by the time the statement was released.

Had Meta not removed the content after receiving the notification, it could have been issued a formal directive from the authority and would have had to comply within 24 hours, as required under the Act.

Internet service providers in Singapore, though, were formally instructed to block a website linked to the Facebook page and group, according to IMDA. 

The industry regulator said the Singapore Police Force alerted it to the Facebook page, which was part of an online network facilitating the sharing of child sexual exploitation content. This led to the discovery of the Facebook group, which contained similar posts with hyperlinks directing viewers to a website carrying such content.

Also: The best parental control apps to keep your kids safe

Apart from child sexual exploitation material, other content deemed egregious under the Act includes content advocating or instructing on physical violence and terrorism, as well as content that poses public health risks in Singapore.

Under the online safety legislation, which was passed as amendments to the country's Broadcasting Act, IMDA has the authority to direct social media platforms to block or remove egregious content. The regulator said it would not hesitate to do so if these services failed to "swiftly" detect and remove such content on their platforms.

It said these operators should remain vigilant in identifying and preventing the dissemination of harmful online content through their services and platforms. 

"Tackling the threat of harmful online content is a global issue that requires a whole-of-society effort," IMDA said. "The Singapore government has strengthened our regulatory framework and will continue its efforts in ensuring regulatory and public education measures can address the growing range of harmful online content and protect Singapore users against online harms."

Also: Instagram feed fix: How to see more of what you want (and less of what you don't)

Under the law, failure to comply with directives may result in a fine of up to SG$1 million. The platform's service may also be blocked in Singapore.

In explaining the need for the Online Safety Act, the government previously said that while such platforms had made efforts to address the issue, online harms persisted and were compounded when amplified on social media. It also noted that other governments worldwide were reviewing ways to effectively regulate social media services.

Facebook in 2019 mandated that advertisers running campaigns on social issues, elections, and politics on its platform in Singapore confirm their identity and location, as well as disclose who was responsible for the ads. This was part of efforts to stem the spread of misinformation and help block foreign interference in local elections, it said. Singapore held its general election the following year.

Also: Former ByteDance exec says China can access user data, even when it's stored on US soil

In the lead-up to the 2020 election, the government issued several directives under the country's Protection from Online Falsehoods and Manipulation Act (POFMA), including an order for a Facebook page to be tagged as one with "a history of communicating falsehoods" and blocked for repeatedly refusing to comply with directives under the Act. Both Facebook and Twitter were also instructed to carry correction notices on posts accessed in Singapore that suggested a new variant of COVID-19 had originated in the country.
