Key takeaways from damning UK report on Facebook’s world of “digital gangsters”

The committee report on fake news and data misuse says Facebook maximizes revenue “at all costs” -- even when the cost is user privacy and trust.
Written by Charlie Osborne, Contributing Writer

The UK's Digital, Culture, Media and Sport select committee has concluded an 18-month investigation into fake news, data sharing, and disinformation, and its findings are heavily critical of Facebook's business values and data-sharing practices.

The 111-page report (PDF), produced by the parliamentary committee, accuses Facebook of considering profit "before anything else."

Facebook CEO Mark Zuckerberg was deemed a figure who continually "fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world's biggest companies."

The investigation will likely make uncomfortable reading for the social networking giant, as a large section of the report focuses on how Facebook handled the Cambridge Analytica scandal, as well as the dubious relationship between the company and app developers over the last decade.

The committee says that Facebook deliberately sought to "frustrate" its investigation by "giving incomplete, disingenuous and at times misleading answers to our questions," and the report goes so far as to say that Zuckerberg has shown "contempt" towards the governing body.

"Even if Mark Zuckerberg doesn't believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world," Damian Collins MP, Chair of the DCMS Committee said. "Evidence uncovered by my committee shows he still has questions to answer yet he's continued to duck them, refusing to respond to our invitations directly or sending representatives who don't have the right information."

Despite Facebook's apparent attempts to scupper the investigation, the report concludes that "Facebook intentionally and knowingly violated both data privacy and anti-competition laws" and also calls for an overhaul of regulation in the industry.

The key takeaways of the report are below:

  • Facebook is either "unwilling" or "unable" to prevent malicious content, including revenge porn, hate speech, and propaganda spread by sources including Russia.
  • Facebook should not be allowed to behave like 'digital gangsters' online, and company representatives should not be able to "consider themselves to be ahead of and beyond the law."
  • The company has "a fundamental weakness in managing its responsibilities to the people whose data is used for its own commercial interests," and is only moved to act when "serious breaches become public."
  • The Cambridge Analytica scandal was facilitated by Facebook's policies and a business model which made "data abuses easy."
  • Six4Three was among the app developers considered "too successful," which Facebook then "starved" of data, while others were forced to pay a high price for access to information.
  • Facebook was also willing to "override its users' privacy settings in order to transfer data to some app developers."
  • Facebook has taken "aggressive positions" against direct competitors, denying them data access or acquiring them outright.
  • Facebook gained a "huge financial advantage" by collecting user data from sources including Android handsets and Onavo, and also considered granting Tinder access to user data in return for using one of its trademarks, Moments.

The committee suggests establishing a compulsory Code of Ethics for technology companies, overseen by an independent regulator empowered to launch legal cases against organizations that fail to meet standards for user trust and data privacy, and for tackling misinformation and fake news.

In addition, the report says that social media networks should be "obliged to take down known sources of harmful content, including proven sources of disinformation."

Should companies fail to comply, the committee says they should face heavy fines, and a two percent tech "levy" should be introduced to pay for the extra workload of UK regulators.

"Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalized 'dark adverts' from unidentifiable sources, delivered through the major social media platforms we use every day," Collins says. "Much of this is directed from agencies working in foreign countries, including Russia. The big tech companies are failing in the duty of care they owe to their users to act against harmful content and to respect their data privacy rights."

Facebook said in a statement that the company was "pleased to have made a significant contribution" to the investigation, and added that it is "open to meaningful regulation" and supports "the committee's recommendation for electoral law reform."

"We have already made substantial changes so that every political ad on Facebook has to be authorized, state who is paying for it and then is stored in a searchable archive for seven years," Facebook said. "No other channel for political advertising is as transparent and offers the tools that we do."
