Facebook will now warn you if you’ve interacted with fake, dangerous coronavirus posts

The fight against COVID-19 scams, misinformation, and fake cures continues.
Written by Charlie Osborne, Contributing Writer

Facebook has announced new measures to tackle the spread of COVID-19 misinformation by alerting users when they have interacted with fake or dangerous content. 

On Thursday, Facebook's vice president of integrity, Guy Rosen, said in a blog post that the social media giant will begin notifying users when they have liked, reacted to, or commented on debunked coronavirus-related content. 

The firm's team of moderators is constantly removing misinformation surrounding the pandemic, including promotions of fake 'cure-all' or preventative products, posts encouraging the use of dangerous substances such as bleach as a cure, false statistics, unfounded conspiracy theories, and more. 

If a user has previously interacted with content that has since been debunked by the World Health Organization (WHO) and removed, Facebook will send them a message containing a shareable link to the WHO's COVID-19 myth-busting and fact-checking resources. 

Facebook says the notifications will begin appearing in the coming weeks. 

"These messages will connect people to COVID-19 myths debunked by the WHO including ones we've removed from our platform for leading to imminent physical harm," Rosen said. "We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook."

During March alone, warnings were slapped on roughly 40 million COVID-19 misinformation posts, and "hundreds of thousands" of pieces of content that could cause harm were wiped off the platform. 

Stopping the spread of misinformation is not something Facebook moderators can manage alone, and as a result, the social network has partnered with over 60 external fact-checking organizations worldwide.  

Facebook says that over two billion users have already been redirected to WHO myth-busting resources. Across Facebook and Instagram, over 350 million people have clicked through to the company's official COVID-19 Information Center.

The changes to Facebook's platform come on the heels of a new report issued by Avaaz that deems the company's actions "commendable," but claims there are still "significant delays" in taking down coronavirus misinformation. 

The research suggests that it can currently take up to 22 days for warning labels to appear on dubious COVID-19 content. 

In related news this week, Facebook canceled all physical events hosting 50 people or more, such as technology conferences and company meetups, until June 2021 due to the coronavirus outbreak. The majority of Facebook employees will also continue to work from home in the coming months. 
