Federal hot water for Facebook over emotional manipulation experiment

ANALYSIS: An emotional manipulation experiment Facebook conducted on 689,003 unknowing users has drawn scrutiny from Washington in the form of a formal letter to the FTC.
Written by Violet Blue, Contributor
Facebook's "Fail Harder" sign, via CNET.

An emotional manipulation experiment Facebook conducted on 689,003 unknowing users in 2012, only recently brought to light, has now drawn scrutiny from Washington.

On Friday, Senator Mark Warner (D-Va.) sent a formal complaint letter (PDF) to the Federal Trade Commission calling the experiment's transparency, ethics and accountability into question. In it, Warner wrote:

According to reports, it is not clear whether Facebook users were adequately informed and given an opportunity to opt in or opt out.

I also have concerns about whether or not Facebook responsibly assessed the risks and benefits of conducting this behavioral experiment, as well as the ethical guidelines, if any, that were used to protect individuals.

Senator Warner cautioned the FTC, saying that experiments such as these are a "slippery slope" and expressed serious concerns about "future studies like this, without proper oversight or appropriate review," and their impact on consumers.

The Senator's letter regarding Facebook's "emotional contagion" experiment follows a media firestorm of public outrage that began June 26, a formal FTC complaint from the Electronic Privacy Information Center (EPIC) and, in the UK, an investigation by the Information Commissioner's Office.

In its "emotional contagion experiment" Facebook tampered with the emotional well-being of 689,003 users to see how their emotions could be controlled; Facebook's hypothesis amounted to "let's see if we can plant unhappiness and make it spread."

According to Facebook's researchers, the unknowing users automatically consented to the emotional manipulation experiment because they were Facebook users. Consent was indicated, they said, when any user clicked "agree" to Facebook's 9,000-word Terms of Use agreement when signing up, and by continuing to use the site after any Terms updates.

"Who knows what other research they're doing?"

Everyone except Facebook agrees that what the company did was alarming and unethical, and that it certainly carried some risk to its users' mental health.

When news of the study hit, Cornell University, previously supportive of its association with the experiment, issued a statement distancing itself from involvement.

Facebook was then shown to have added "research" [on its users] to its Terms only after the experiment had been conducted.

Respected academics have accused those outside Facebook who were involved with the experiment of approval-laundering.

Susan Fiske, a professor of psychology at Princeton University who edited the study for the Proceedings of the National Academy of Sciences (PNAS), said, "I'm still thinking about it and I'm a little creeped out, too."

Fiske told The Atlantic on June 28:

I had not seen before, personally, something in which the researchers had the cooperation of Facebook to manipulate people...

Who knows what other research they're doing?

PNAS issued a formal statement to the press on July 3 saying it had ethical concerns, "that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out."

The Linguistic Inquiry and Word Count (LIWC2007) tool, which Facebook used to monitor and manipulate user emotions in the experiment, has been shown to be wholly inaccurate and inappropriate for Facebook's intended use.
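
To see why, consider a minimal sketch of LIWC-style scoring in Python (the tiny word lists below are hypothetical illustrations, not LIWC's actual dictionaries): the method simply tallies dictionary hits as a fraction of all words, so a negated phrase like "not happy" still registers as positive emotion.

# Minimal sketch of LIWC-style word counting (illustrative only;
# these word lists are hypothetical stand-ins, not LIWC's dictionaries).
POSITIVE = {"happy", "great", "love", "good"}
NEGATIVE = {"sad", "terrible", "hate", "bad"}

def liwc_style_score(text: str) -> dict:
    """Tally dictionary hits per category as a fraction of all words."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    total = len(words) or 1
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return {"positive": pos / total, "negative": neg / total}

# The blind spot: negation flips the meaning but not the count.
print(liwc_style_score("I am not happy today"))      # scored as positive
print(liwc_style_score("I don't hate this at all"))  # scored as negative

A counter this crude cannot tell "happy" from "not happy," one reason researchers questioned whether it could reliably measure, let alone steer, the emotional tenor of short status updates.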

A disgraceful time to "lean in"

Reactions from Facebook's executives have eroded the company's reputation further.

Adam Kramer, the Facebook data scientist who was the experiment's primary author, responded June 29 by posting a non-apology to Facebook that utterly missed the point: he said the team was sorry about the way it had described the experiment, while attempting to reframe user consent as merely red tape.

In trademark Facebook spin, he blamed the public outrage on how the experiment had been described, as if news of emotional tampering in people's day-to-day lives were a trivial misunderstanding that only anxious people worried about.

On July 2, Facebook Chief Operating Officer Sheryl Sandberg sandbagged Kramer's broken PR strategy, again trying to recast the tsunami of public outrage as a miscommunication about the study's description.

Which would make sense if Facebook had actually communicated to the public, or its users, about the experiment in the first place.

The Washington Post reported:

On Wednesday, Facebook’s second-in-command, Sheryl Sandberg, expressed regret over how the company communicated its 2012 mood manipulation study of 700,000 unwitting users, but she did not apologize for conducting the controversial experiment.

It's just what companies do, she said.

On July 3, Facebook's Global Head of Policy Monika Bickert grabbed the shovel of callous irresponsibility from Sandberg and dug deeper, calling the nonconsensual mental health experiment an example of Facebook's "innovation."

Bickert — somewhat presciently in light of Senator Warner's letter — also told the audience at the Aspen Ideas Festival, "it's concerning when we see legislation that could possibly stifle that sort of creativity and that innovation."

Warner has a track record of deep involvement in consumer protection and cybercrime issues in the Senate. This past week, the Senate Select Committee on Intelligence included his amendment requiring a comprehensive report on digital security threats and cybercrime in its bipartisan cybersecurity package.

Earlier this year, Senator Warner chaired a Senate Banking subcommittee hearing on the recent massive credit and debit card security breaches impacting major retailers like Target and Neiman Marcus and millions of American consumers.

In a release accompanying his FTC letter, Warner said:

I don’t know if Facebook's manipulation of users’ news feeds was appropriate or not.

But I think many consumers were surprised to learn they had given permission by agreeing to Facebook’s terms of service.

And I think the industry could benefit from a conversation about what are the appropriate rules of the road going forward.

If the cavalier attitudes expressed in Facebook's public statements about the matter are any indication, we'll need more than a conversation to prevent Facebook, and its ilk, from railroading academics, deceiving the press and putting users at risk.

It's clear that Facebook is in a self-reinforcing bubble, much like Google: it believes human beings are products, and that users are only freaking out because they hate change or "don't get it." Its ways of relating to the world have been severed in ways that facilitate a blatant disregard for the sanctity of other people's lives.

Because what most portends individual suffering, ultimately, is not the surveillance, the lying, or the messing with our heads, but the indifference of those in control.
