The UK's Digital, Culture, Media and Sport (DCMS) select committee has concluded an 18-month investigation into fake news, data sharing, and disinformation, and its findings are heavily critical of Facebook's business values and data-sharing practices.
The 111-page report (PDF), produced by the parliamentary committee, accuses Facebook of considering profit "before anything else."
Facebook CEO Mark Zuckerberg was deemed a figure who continually "fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world's biggest companies."
The investigation will likely make uncomfortable reading for the social networking giant, as a large section of the report focuses on how Facebook handled the Cambridge Analytica scandal, as well as the dubious relationship between the company and app developers over the last decade.
The committee says that Facebook deliberately sought to "frustrate" its work during the investigation by "giving incomplete, disingenuous and at times misleading answers to our questions," and the report goes so far as to say that Zuckerberg has shown "contempt" towards the governing body.
"Even if Mark Zuckerberg doesn't believe he is accountable to the UK Parliament, he is to the billions of Facebook users across the world," said Damian Collins MP, Chair of the DCMS Committee. "Evidence uncovered by my committee shows he still has questions to answer yet he's continued to duck them, refusing to respond to our invitations directly or sending representatives who don't have the right information."
Despite Facebook's apparent attempts to scupper the investigation, the report concludes that "Facebook intentionally and knowingly violated both data privacy and anti-competition laws" and also calls for an overhaul of regulation in the industry.
The key takeaways of the report are below:
The committee suggests that a "Compulsory Code of Ethics" should be established for technology companies. Overseen by an independent regulator, the code would give law enforcement the power to launch legal cases against organizations that fail to meet standards designed to ensure user trust and data privacy, as well as to tackle misinformation and fake news.
In addition, the report says that social media networks should be "obliged to take down known sources of harmful content, including proven sources of disinformation."
Should companies fail to comply, the committee says they should face heavy fines -- and a two percent tech 'levy' should be introduced to fund the extra workload of UK regulators.
"Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalized 'dark adverts' from unidentifiable sources, delivered through the major social media platforms we use every day," Collins says. "Much of this is directed from agencies working in foreign countries, including Russia. The big tech companies are failing in the duty of care they owe to their users to act against harmful content and to respect their data privacy rights."
Facebook said in a statement that the company was "pleased to have made a significant contribution" to the investigation, adding that the firm is "open to meaningful regulation" and supports the committee's recommendation for electoral law reform.
"We have already made substantial changes so that every political ad on Facebook has to be authorized, state who is paying for it and then is stored in a searchable archive for seven years," Facebook said. "No other channel for political advertising is as transparent and offers the tools that we do."