Social media platforms that use opaque algorithms to spread harmful content should be reined in, or they risk triggering a growing number of violent events such as the attack on the US Capitol Building last January, according to Facebook whistleblower Frances Haugen.
According to Haugen, events like the Capitol riots, and other social media-fuelled conflicts, are a foretaste of what's yet to come.
"I have no doubt that the events we're seeing around the world, things like Myanmar and Ethiopia, those are the opening chapters," said Haugen. "Because engagement-based ranking does two things. One, it prioritises and amplifies divisive, polarising, extreme content. And two, it concentrates it."
Haugen was speaking in London before the UK Parliament as part of an investigation into the draft Online Safety Bill, which the government put forward earlier this year. The bill proposes to force companies to protect their users from harmful content ranging from revenge porn and disinformation to hate speech and racist abuse.
Parliamentarians were taking evidence from Haugen because she recently came to the fore as the whistleblower behind a trove of leaked Facebook documents – internal files, draft presentations, research and staff communications – which she obtained while working as the lead product manager for Facebook's civic misinformation team.
Now known as the Facebook Files, the leaks were published by The Wall Street Journal and explore a variety of topics, including the use of different content moderation policies for high-profile users, the spread of misinformation and the impact of Instagram on teenagers' mental health. The disclosures have even become a catalyst for a Senate inquiry into Facebook's operations.
In this context, said Haugen, governments have to step up and implement stricter regulation. "I came forward now because now is the time to act," said Haugen. "The failures of Facebook are making it harder to act."
Haugen argued that the social media giant is "unquestionably" making hate worse, in particular because of its use of an engagement-based ranking algorithm, which pushes content that is likely to create more engagement further towards the top of users' timelines.
Because extreme content tends to be more viral, this can create an echo-chamber effect: users can be pushed down a rabbit hole and end up consuming content that is increasingly polarising and divisive.
For example, someone looking for healthy recipes could start seeing content related to anorexia; and someone reading right-wing content could be pushed towards extreme-right posts. The problem isn't limited to Facebook: similar allegations were previously made by ex-Google software engineer Guillaume Chaslot against YouTube's recommendation algorithm.
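The mechanism described above can be illustrated with a minimal sketch. This is a hypothetical example, not Facebook's actual system: the posts, the scoring weights and the idea of weighting comments and shares above likes are all assumptions made for illustration, but they show how ranking purely by predicted engagement can push a divisive post above more popular but less provocative content.

```python
# Illustrative sketch of engagement-based ranking. The data and the
# scoring weights are hypothetical, not Facebook's real algorithm.

def engagement_score(post):
    """Score a post by its interactions; comments and shares are
    weighted more heavily here because they signal stronger engagement."""
    return (post["likes"]
            + 3 * post["comments"]
            + 5 * post["shares"])

posts = [
    {"id": "measured-take", "likes": 120, "comments": 10, "shares": 5},
    {"id": "outrage-bait",  "likes": 80,  "comments": 60, "shares": 40},
    {"id": "cat-photo",     "likes": 200, "comments": 5,  "shares": 2},
]

# Build the feed: highest predicted engagement first.
feed = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in feed])
# → ['outrage-bait', 'cat-photo', 'measured-take']
```

Even though "outrage-bait" has the fewest likes, its high comment and share counts put it at the top of the feed – the amplification effect Haugen describes.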
"The danger with Facebook is not individuals saying bad things; it is about the systems of amplification that disproportionately give people saying extreme, polarising things the largest megaphone in the room," said Haugen.
The issue also expands to paid-for advertising, according to the whistleblower. With divisive ads more likely to create engagement, it is much cheaper to run "angry" advertising campaigns, which led Haugen to say that the current system is subsidising hate on social media platforms.
This is something that Facebook has disputed: in the week preceding Haugen's appearance in the UK Parliament, the social media giant published a report claiming that the prevalence of hate speech on the platform has dropped by almost 50% over the past three quarters and now accounts for only 0.05% of all content viewed.
A Facebook spokesperson said: "Contrary to what was discussed at the hearing, we've always had the commercial incentive to remove harmful content from our sites. People don't want to see it when they use our apps and advertisers don't want their ads next to it. That's why we've invested $13 billion and hired 40,000 people to do one job: keep people safe on our apps."
But Haugen argues that Facebook won't self-regulate to protect its users, because lower engagement rates run counter to the company's business model. "Facebook has been unwilling to accept even little slivers of profit being sacrificed for safety," she said, echoing her earlier claims that the company is "morally corrupt" and has chosen to grow profits at all costs.
Preventing further escalation into violence, therefore, will require government action. Haugen applauded the UK's efforts in drafting the Online Safety Bill, calling the proposed laws "world-leading" when it comes to regulating social platforms, and in particular highlighting the need to mandate a duty of care on companies like Facebook to protect users.
"I can't imagine Mark isn't paying attention to what you're doing," said Haugen when asked whether the bill might be keeping Facebook CEO Mark Zuckerberg awake at night.
Haugen recommended that the Online Safety Bill include mandatory risk assessments for engagement-based ranking systems, overseen by external regulators rather than internal Facebook boards, and that paid-for advertising be brought within the bill's scope.
She also suggested requiring Facebook to make data available to researchers outside of the company, to allow the investigation of potential issues from the outside, and recommended the mandatory moderation of Facebook groups when they exceed a certain number of users.
Finally, Haugen addressed the issue of end-to-end encryption, which has sparked controversy since the publication of the draft bill. Free-speech groups have voiced concern that the bill would require abolishing end-to-end encryption so that social media platforms could scan private messages for harmful content – at the cost of user privacy.
"I support access to end-to-end encryption and I use open-source end-to-end encryption every day," said Haugen. "My social support network is currently on an open-source end-to-end encryption service."
Facebook's plans for end-to-end encryption, continued Haugen, are problematic because the product is not open source, making it impossible to verify the degree to which users will actually be protected. This may result in some users sharing sensitive information online in the belief that their data is encrypted, she argued, when in fact it may be in danger of being read by third parties.
Facebook, for its part, has welcomed the UK's attempt to regulate social media platforms. "While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren't making these decisions on our own," said a Facebook spokesperson. "The UK is one of the countries leading the way and we're pleased the Online Safety Bill is moving forward."