Facebook wants to be crystal clear: the social media platform is stepping up its game to fight harmful content on the internet, and in doing so it wants to work hand-in-hand with regulators. Unfortunately for CEO Mark Zuckerberg, EU Commissioners don't seem to be embracing the message.
The company has just released a white paper which is "charting a way forward on online content regulation", according to Facebook's vice president of content policy Monika Bickert. The paper raises various questions that online regulation should address, and makes recommendations to lawmakers for drafting new rules to fight harmful content.
The document's launch coincides with Zuckerberg's current visit to Munich and Brussels, during which the company's CEO met with high-profile policy-makers such as the EU Commission's executive vice president Margrethe Vestager. The trip is timely, coming just a few days before Brussels publishes new policies on regulating the global digital economy.
On content moderation, Bickert's white paper provides more detail on what Facebook believes such regulation should look like -- and it turns out to be pretty similar to what Facebook already does. In fact, the document repeatedly highlights what the social media giant is currently doing well, rather than drawing attention to the gaps in its content moderation practices.
Take the need to remove harmful content as soon as possible after it has been uploaded to a given platform: Bickert stressed that Facebook has developed technology that has, for a number of months, allowed the platform to remove content related to hate speech, child exploitation or terrorism before anyone had even reported it.
On the importance of making sure that even old content is removed, provided that it's harmful, Bickert pointed to Facebook's successful removal of terror propaganda that had been on the site for several years – the result of "considerable time and effort" from Facebook engineers to develop better detection tools.
The need for external oversight on content moderation policies? Facebook is already working on an independent oversight board, which it likened to a supreme court, to oversee some of the platform's moderation decisions. Publishing periodic reports on the enforcement of policies? Again, the social media giant has its own "Community Standards Enforcement Reports", which go out twice a year.
The white paper's very first suggestion for regulation describes "user-friendly channels for reporting content" – another option that any Facebook user will be well aware is already available on the website.
So Facebook is pitching that it's top of the class when it comes to managing content moderation – and now, it wants governments around the world to match the social media platform's efforts.
Concluding the white paper, Bickert explained that designing new frameworks, if done well, will contribute to the internet's success by sharing responsibilities between companies, governments and civil society. On the other hand, she argued, if "designed poorly", these efforts will have "unintended consequences" such as reducing online safety, stifling expression and slowing innovation.
Speaking at a security conference in Munich, Zuckerberg stressed the importance of tighter regulation to build trust around the internet. The social media giant's CEO then published an op-ed in the Financial Times entitled "Big tech needs more regulation".
"People need to feel that global technology platforms answer to someone," he wrote, "so regulation should hold companies accountable when they make mistakes."
This idea didn't go down well in Brussels. Reacting to Zuckerberg's speech, Europe's commissioner for the internal market Thierry Breton said: "It's not for us to adapt to those companies, but for them to adapt to us."
"When you have such a big position, you need to anticipate the role that you play in our societies and economies, and not wait for regulators or governments to tell you what you have to do," Breton said. "It's up to them to see the impact of their responsibility before we tell them so."
Breton was echoed by the European Commission's vice president Věra Jourová, who said that Facebook couldn't push its responsibilities onto regulators. "It will not be up to governments or regulators to ensure that Facebook wants to be a force for good or for bad," she said.
The social media platform might have to rethink its strategy for wooing policy-makers in Europe. For the moment, though, Facebook told ZDNet that it had nothing further to share on the matter, except that Zuckerberg had met with executive vice president Vestager today and that they had "a good exchange on current issues in the digital sector."