
Facebook's flip-flops over beheading videos may turn advertisers away

Facebook has shown a lack of direction over its decision to allow graphic images and videos to appear on your -- and your kids' -- news feeds next to ads from major brands.
Written by Eileen Brown, Contributor

Facebook has been flip-flopping over a decision it made in May this year to allow videos of real-life decapitations, promising at the time to review any complaints it received.

Image: Facebook headquarters, Menlo Park (Soerfm, Wikimedia Commons)

Last week a graphic video of a woman being beheaded started to appear on Facebook feeds.

Late on Tuesday, in response to a storm of protest about the video, Facebook said that it had “concluded that this content improperly and irresponsibly glorifies violence. For this reason, we have removed it.”

Facebook has removed this particular video, but its decision to allow graphic content to appear in users' feeds could have far-reaching financial consequences for the company.

In its S-1 filing to the Securities and Exchange Commission before its IPO in May 2012, Facebook stated that its financial results could be significantly harmed if its users decreased “their level of engagement with Facebook”.

It specifically called out that its business could be harmed if it adopted “policies or procedures related to areas such as sharing or user data that are perceived negatively by our users or the general public”.

Facebook wants its users to spend as long as possible on its site, reading and sharing posts and images. The longer users stay, the more likely they are to notice the ads in the sidebar, the sponsored links, and the promoted posts at the top of their feeds.

Advertising is critical to Facebook: the S-1 filing states that it generated 85 percent of its revenue from advertising in 2011. If advertisers turn away, Facebook's growth is at risk.

The risk factors section of the filing also states that Facebook “generate(s) a substantial majority of our revenue from advertising. The loss of advertisers, or reduction in spending by advertisers with Facebook, could seriously harm our business”.

Advertisers will not be happy if their promoted posts or sponsored stories appear in a user's feed next to an image of graphic violence or a decapitation video. Brands cannot control which posts their stories appear next to.

The potential for brand damage is huge.

And brands do not want to be associated with content beyond their control that could damage their image on Facebook. Ads relevant to your likes appear on your Facebook page, whatever else is in your feed.

Brands such as Nationwide, Nissan, and Zipcar have already responded to this and other graphic content by restricting or pulling their ads from the site.

In May this year, the Everyday Sexism Project posted an open letter to Facebook asking its users to make brands aware when their ads appeared next to images encouraging rape and violence towards women, or other harmful content.

Within a week, Marne Levine, VP of Global Public Policy at Facebook, posted a note to say that Facebook defined harmful content as “anything organizing real world violence, theft, or property destruction, or that directly inflicts emotional distress on a specific private individual (e.g. bullying)”.

Advertisers may turn away from Facebook and spend their marketing money elsewhere, but parents should also be very concerned about Facebook’s decision.

Last Friday, Facebook relaxed its rules to allow teens to share content more broadly. Facebook users between the ages of 13 and 17 can now share posts publicly; previously, teens could share posts only with friends and friends of friends.

Facebook has also “thought a lot about” opening up the site to children under the age of 13, according to its manager of privacy and safety.

There is the potential that your kids might now see graphic content in their Facebook feeds. Once these images are seen, the impact on your children might be huge.

Facebook’s community standards mention graphic content and why people share these types of experiences. They state:

“Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses or acts of terrorism. In many instances, when people share this type of content, it is to condemn it.

“However, graphic images shared for sadistic effect or to celebrate or glorify violence have no place on our site.

“When people share any content, we expect that they will share in a responsible manner. That includes choosing carefully the audience for the content.

“For graphic videos, people should warn their audience about the nature of the content in the video so that their audience can make an informed choice about whether to watch it.”

Facebook has added a warning to videos containing graphic content. Users have to click on the warning before seeing the content. A click is all that is needed.

Facebook really needs to get its act together over its changing decisions around allowing graphic content on its site.

Allowing free speech is laudable, but alienating the advertisers who provide the majority of your revenue is short-sighted and does not make good business sense for such a huge company.

Do you trust the friends on your Facebook feed to “choose carefully” the content for you or your kids? Or do they simply share whatever they choose, with whomever they choose to share it?
