
Can regulating Facebook and Twitter stop the spread of fake news?

A report by UK MPs has rejected the idea that tech companies are merely platforms.
Written by Jack Schofield, Contributor

The British parliament's select committee for Digital, Culture, Media and Sport (DCMS) has released an 89-page interim report (PDF), which warns that social media companies need to take more responsibility for the content posted by their users.

The final report is expected later this year.

While the cross-party committee was initially following up the Cambridge Analytica scandal, its report ranges more widely, and it also hopes to influence other countries.

The committee rejects the idea that Facebook, Twitter, Google and the like are merely "platforms" that are not responsible for their content.

The report said that social media companies cannot hide behind the claim of being merely a 'platform', or behind the claim that, as tech companies, they have no role in regulating the content of their sites.

"That is not the case; they continually change what is and is not seen on their sites, based on algorithms and human intervention," it said. However, the report notes that social media companies are also significantly different from the traditional model of a publisher, which commissions, pays for, edits and takes responsibility for the content it disseminates.

The report said: "We recommend that a new category of tech company is formulated, which tightens tech companies' liabilities, and which is not necessarily either a 'platform' or a 'publisher'. We anticipate that the Government will put forward these proposals in its White Paper later this year and hope that sufficient time will be built in for our Committee to comment on new policies and possible legislation."

It goes on: "It is our recommendation that this process should establish clear legal liability for the tech companies to act against harmful and illegal content on their platforms. This should include both content that has been referred to them for takedown by their users, and other content that should have been easy for the tech companies to identify for themselves."

Under the Communications Act 2003, Ofcom is the regulatory body that enforces content standards for TV and radio in the UK, standards that include "rules relating to accuracy and impartiality".

Ofcom's chief executive Sharon White said that, in the autumn, she plans to release an outline of how such regulation could work.

The UK's Information Commissioner's Office (ICO) already intends to fine Facebook £500,000 "for lack of transparency and security issues relating to the harvesting of data constituting breaches of the first and seventh data protection principles" (i.e. the Cambridge Analytica affair). The report notes that under the new GDPR, the fine could have been $315 million.

So far, the various social media platforms have not really helped the committee. Facebook boss Mark Zuckerberg refused to appear before it, and the report says: "Facebook consistently responded to questions by giving the minimal amount of information possible, and routinely failed to offer information relevant to the inquiry, unless it had been expressly asked for. It provided witnesses who have been unwilling or unable to give full answers to the Committee's questions."

The committee's chair, Damian Collins MP, noted in his third letter to Twitter boss Jack Dorsey: "I'm afraid that the failure to obtain straight answers to these questions, whatever they might be, is simply increasing concerns about these issues, rather than reassuring people."

Regulating social networks would no doubt be expensive for both sides, though the government could cover the cost of regulation through a tax on tech companies and through fines.

And while the international nature of social networking could be a problem, Germany has already shown that it can be done. Its Network Enforcement Act (NetzDG), which came into force in January, "forces tech companies to remove hate speech from their sites within 24 hours, and fines them 20 million Euros if it is not removed."

The act has been criticized for potentially restricting free speech, but set against that, the future of Western democracy could be at stake.

