Amid outcry that Facebook continues to give a platform to vile and harmful content, the social network on Thursday laid out its broad-based framework for when it does and doesn't censor content.
"We do not... allow content that could physically or financially endanger people, that intimidates people through hateful language, or that aims to profit by tricking people using Facebook," Facebook's VP of Policy Richard Allan wrote in a blog post.
The post comes on the same day Facebook confirmed it will ban websites that share blueprints for 3D-printed guns.
It also follows Facebook's decision earlier in the week to remove pages belonging to Alex Jones, the notorious conspiracy theorist who has propagated the belief that the 2012 Sandy Hook school shooting was a hoax. Facebook said it came to the decision to remove the pages on its own, though the decision was announced shortly after Apple said it would remove Jones' podcasts from its platforms.
"Every policy we have is grounded in three core principles," Allan wrote, "giving people a voice, keeping people safe, and treating people equitably. The frustrations we hear about our policies -- outside and internally as well -- come from the inevitable tension between these three principles."
Allan added, "barring other factors... we lean toward free expression. It's core to both who we are and why we exist."
The basic framework leaves plenty of room for interpretation, as Allan notes.
First and foremost, Facebook censors content when it's necessary to prevent harm. The most obvious example, he said, is credible threats of violence. Hate speech may or may not fall into that category, Allan wrote: "It is perhaps one of the most challenging of our standards to enforce because determining whether something is hate speech is so dependent on the context in which it is shared."
Allan also defended Facebook's controversial decision to leave plainly false content on its platform.
"Human rights law extends the same right to expression to those who wish to claim that the world is flat as to those who state that it is round -- and so does Facebook," he wrote. "It may be the case that false content breaks our other rules -- but not always."
Rather than blocking content for being untrue, he explained, Facebook demotes such posts in its News Feed and also points people to other articles on the same subject.
The clarification follows a backlash to Facebook CEO Mark Zuckerberg's insistence that Facebook would not automatically remove content denying the Holocaust.
Chris Matyszczyk has documented for ZDNet how Facebook's treatment of Holocaust deniers has shifted in the past decade, becoming worse as the company grows more ubiquitous and more profitable.
Allan's blog post stressed that Facebook "is not a government" and noted, "we're not bound by international human rights laws that countries have signed on to." That said, he acknowledged that Facebook has a responsibility to curate the content on its social network as "a platform for voices around the world."