EPIC files complaint over Facebook emotion experiment

The Electronic Privacy Information Center (EPIC) has called on the US Federal Trade Commission to investigate and take action against Facebook for experimenting on 700,000 users.
Written by Aimee Chanthadavong, Contributor

The Electronic Privacy Information Center (EPIC) in the US has filed a complaint with the Federal Trade Commission (FTC) over Facebook's psychological experiment on 700,000 users.

In its complaint (PDF), the public interest research centre calls on the FTC to investigate Facebook's actions, believing the company "purposefully messed with people's minds".

"Facebook conducted the psychological experiment with researchers at Cornell University and the University of California, San Francisco, who failed to follow standard ethical protocols for human subject research," EPIC wrote in its complaint.

The study by the Facebook team tested the emotions of users for a week in January 2012.

The researchers wanted to test whether "emotional contagion" occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the news feed.

Facebook tweaked the posts users were receiving on their feeds. Some received posts with positive sentiments, while others received posts with more negative emotions. Facebook wanted to see if this altered people's Facebook behaviour and emotions.

The findings of the experiment were published in a paper in the March issue of the Proceedings of the National Academy of Sciences.

"At the time of the experiment, Facebook did not state in the Data Use Policy that user data would be used for research purposes," EPIC continued.

"Facebook also failed to inform users that their personal information would be shared with researchers. Moreover, at the time of the experiment, Facebook was subject to a consent order with the Federal Trade Commission which required the company to obtain users' affirmative express consent prior to sharing user information with third parties.

"Facebook's conduct is both a deceptive trade practice under Section 5 of the FTC Act and a violation of the Commission’s 2012 Consent Order."

EPIC has urged the FTC to "impose sanctions, including a requirement that Facebook make public the algorithm by which it generates the news feed for all users".

Similar concerns have since been expressed by the scientific journal that published the findings of the Facebook study. It said it typically publishes experiments that have allowed subjects to opt out of research.

"Based on the information provided by the authors, PNAS editors deemed it appropriate to publish the paper," said a statement by editor-in-chief Inder Verma.

"It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out."

Facebook is currently under a 20-year settlement agreement with the FTC. In November 2011, the FTC settled charges that dated back to 2009, when the company made changes to its privacy settings that caused certain information previously set to private to be made public, without informing users about the changes or getting their approval in advance.

The FTC required Facebook to take several steps to ensure it lives up to its promises in the future, including giving users advance warning of any changes, as well as obtaining consumers' consent before their information is shared in ways that override their privacy settings.
