Federal hot water for Facebook over emotional manipulation experiment

Summary: An emotional manipulation experiment Facebook conducted on 689,003 unknowing users has drawn scrutiny from Washington in the form of a formal letter to the FTC.

Facebook's "Fail Harder" sign, via CNET.

An emotional manipulation experiment Facebook conducted on 689,003 unknowing users in 2012, only recently brought to light, has now drawn scrutiny from Washington.

On Friday, Senator Mark Warner (D-Va.) filed a formal complaint letter (.PDF) addressed to the Federal Trade Commission calling the experiment's transparency, ethics and accountability into question.

Warner wrote:

According to reports, it is not clear whether Facebook users were adequately informed and given an opportunity to opt in or opt out.

I also have concerns about whether or not Facebook responsibly assessed the risks and benefits of conducting this behavioral experiment, as well as the ethical guidelines, if any, that were used to protect individuals.

Senator Warner cautioned the FTC, saying that experiments such as these are a "slippery slope" and expressed serious concerns about "future studies like this, without proper oversight or appropriate review," and their impact on consumers.

The Senator's letter regarding Facebook's "Emotional Contagion" experiment follows a media firestorm of public outrage that began June 26, a formal FTC complaint from the Electronic Privacy Information Center, and in the UK, an investigation by the Information Commissioner's Office.

In its "emotional contagion" experiment, Facebook tampered with the emotional well-being of 689,003 users to see how their emotions could be controlled; Facebook's hypothesis amounted to "let's see if we can plant unhappiness and make it spread."

According to Facebook's researchers, the unknowing users automatically consented to the emotional manipulation experiment because they were Facebook users. Consent was indicated, they said, when any user clicked "agree" to Facebook's 9,000-word Terms of Use agreement when signing up, and by continuing to use the site after any Terms updates.

"Who knows what other research they're doing?"

Everyone except Facebook agrees that what Facebook did was alarming, unethical and certainly carried some risk to its users' mental health.

When news of the study hit, Cornell University, formerly supportive in its association with the experiment, issued a statement distancing itself from involvement.

Facebook was then shown to have changed its Terms to include "research" [on its users] only after the experiment had been conducted.

Those outside of Facebook involved with the experiment have been accused of approval-laundering by respected academics.

Susan Fiske, a professor of psychology at Princeton University who edited the study for the Proceedings of the National Academy of Sciences, said "I'm still thinking about it and I'm a little creeped out, too."

Fiske told The Atlantic on June 28:

I had not seen before, personally, something in which the researchers had the cooperation of Facebook to manipulate people...

Who knows what other research they're doing?

The Proceedings of the National Academy of Sciences issued a formal statement to press on July 3 saying it had ethical concerns, "that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out."

The Linguistic Inquiry and Word Count (LIWC2007) tool, which Facebook used to monitor and manipulate user emotions in the experiment, has been criticized as inaccurate and inappropriate for Facebook's intended use.

A disgraceful time to "lean in"

Reactions from Facebook's executives have eroded the company's reputation further.

Facebook data scientist Adam Kramer, the experiment's primary author, responded June 29 by posting a non-apology to Facebook that utterly missed the point, saying the researchers were sorry about the way they had described the experiment while attempting to re-frame the concept of user consent as if it were merely red tape.

In trademark Facebook spin, he blamed the public outrage on bad representation, as if news of emotional tampering in people's day-to-day lives was a trivial misunderstanding that only anxious people worried about.

On July 2, Facebook Chief Operating Officer Sheryl Sandberg sandbagged Kramer's broken PR strategy, again trying to re-cast the tsunami of public outrage as a miscommunication about the study's description.

Which would make sense if Facebook had actually communicated to the public, or its users, about the experiment in the first place.

The Washington Post reported:

On Wednesday, Facebook’s second-in-command, Sheryl Sandberg, expressed regret over how the company communicated its 2012 mood manipulation study of 700,000 unwitting users, but she did not apologize for conducting the controversial experiment.

It's just what companies do, she said.

On July 3, Facebook's Global Head of Policy Monika Bickert grabbed the shovel of callous irresponsibility from Sandberg and dug deeper, calling the nonconsensual mental health experiment an example of Facebook's "innovation."

Bickert — somewhat presciently in light of Senator Warner's letter — also told the audience at the Aspen Ideas Festival, "it's concerning when we see legislation that could possibly stifle that sort of creativity and that innovation."

Warner has a track record of heavy involvement in the Senate regarding consumer protections and cybercrime. This past week, the Senate Select Committee on Intelligence included his amendment to produce a comprehensive report of digital security threats and cybercrime in their bipartisan cybersecurity package.

Earlier this year, Senator Warner chaired a Senate Banking subcommittee hearing on the recent massive credit and debit card security breaches impacting major retailers like Target and Neiman Marcus and millions of American consumers.

In a release accompanying his FTC letter, Warner said:

I don’t know if Facebook's manipulation of users’ news feeds was appropriate or not.

But I think many consumers were surprised to learn they had given permission by agreeing to Facebook’s terms of service.

And I think the industry could benefit from a conversation about what are the appropriate rules of the road going forward.

If the cavalier attitudes expressed in Facebook's public statements about the matter are any indication, we'll need more than a conversation to prevent Facebook — and its ilk — from railroading academics, deceiving press and putting users at risk.

It's clear that Facebook is in a self-fulfilling bubble, much like Google, where it believes human beings are products; that they're only freaking out because they hate change or "don't get it" — and Facebook's ways of relating to the world have been severed in ways that facilitate a blatant disregard for the sanctity of other people's lives.

Because what's most foretelling of individual suffering, ultimately, is not the surveillance, the lying, or the messing with our heads, but the indifference of those in control.

Talkback
  • Loved "the shovel of callous irresponsibility" bit

    Great article.
    Rabid Howler Monkey
    • Thank you!

      The description fit. I'm astonished (and disgusted) Facebook's public faces think reacting like that is in any way appropriate.
      Violet_Blue
      • Violet Blue, Please include me in any Class Action suit!

        Thank you for keeping me up on this subject. Facebook (and Google) etc. seem to step all over users' constitutional rights, then claim "You signed away all rights to become a member!!!"
        jing714
  • Couldn't have been written better

    When an article so clearly and completely covers an issue, there is little one can add......

    I am always astonished that a company/corporation can leverage irresponsible behavior by pointing to their TOS as though a commandment has been decreed on high. It's about time that they are held accountable for holding people hostage to an unethical and largely unseen term of agreement.
    skypilota72
    • Payout

      I agree wholeheartedly. It's gotten to the point of if I can run a program via someone else's jurisdiction, like certain people from NJ, then I'll say it's not my fault. Maybe Mr. Z can dole out a couple thousand dollars to each person, and call it good.
      Razilistic
    • I couldn't agree with you more.

      ...and especially when they keep changing the terms, while making continued use our legally binding agreement with them. Could you imagine if you signed a book contract, and then a year later your publisher changed it, and that your agreement was "confirmed" by continuing to have them sell your books?
      Violet_Blue
    • It sucks that FB got caught and singled out

      With this experiment, but it's done every day, every minute, on TV, radio, and now mainstream on social media sites. The TOS aren't read by probably 98% (no proof of that, just a guess) of the people who use FB. I did and don't agree with it from day 1 of me signing up, but in order to do my own digging into them, I had to.

      So sure, morally I don't agree with how and what they did, but from a legal side I see no wrongdoing. Everyone gets notified that changes were made to the TOS, so it's up to the user to read that lengthy gobbledygook. But even if one did read it, more than likely they still would use the service even if they didn't agree, because it is/was the thing to be on or have an account to. Just my thought though.
      Free Webapps
      • The difference here is

        That Facebook users were subjected to an altered ToS AFTER they had been used in this experiment. Facebook makes the majority of its money from ad revenue, so I don't think it is too much to ask that they request permission specifically to use people in an experiment and give them a chance to opt-out.
        Iman Oldgeek
        • users were subjected to an altered ToS AFTER...

          What difference would it have made if Facebook changed the TOS before, even the day of the experiment? Would anyone notice? Would everyone still be as outraged because you didn't know or have time to read about the change? Just curious...

          I am not condoning what they did by any means. I also do not use Facebook.
          sjohnson@...
        • I will point out here, again,

          that even Facebook's current terms of service (which were not in effect when the experiment was conducted) cover "research," which is to say "analysis," not "experimentation," which is to say "manipulation." The difference is quite substantial, and Facebook is aware of it, but they wouldn't dare say "experimentation" or "manipulation" in the TOS, otherwise people would likely object.
          techadmin.cc@...
  • EVERY advertising research is the same...

    An attempt to manipulate the viewers emotions...

    Promoting insecurity that can be cured by the product...

    Promoting security by a product...

    Be remade into something else... feel better about yourself...

    So what is the difference?
    jessepollard
    • As good a place as any to note that Google

      has failed to condemn this experiment conducted by Facebook. Just as Google failed to condemn Facebook for creating and maintaining shadow profiles.

      This must be what Ms. Blue means by "Facebook — and its ilk".
      Rabid Howler Monkey
        • What has Google to do with any of that?

        It isn't their business.

        And if they did say anything (either way) they could be opened up to lawsuits over "interfering with business".
        jessepollard
        • Google condemning these actions by Facebook

          would be a great way to get Facebook users to defect and join Google+. In other words, grow the Google+ user base.

          Why hasn't Google condemned Facebook? Either it has taken some of the actions which were taken by Facebook or it wants to keep its options open.
          Rabid Howler Monkey
          • So

            So, they would be manipulating people's emotions by doing it?
            Buster Friendly
          • It would be illegal.

            tortious interference with their business...
            jessepollard
          • People already know Google

            And know that Google+ would be 100x worse than Facebook. If they could get volunteer members instead of hooking them through YouTube ;)
            Iman Oldgeek
          • Rabid Howler Monkey

            You need help.
            techadmin.cc@...
        • Google! Facebook! Poach each other's users? Never.

          More than likely they have a private pact to head off Armageddon between themselves.

          Google does try to emulate Facebook, and if Facebook does not face any punishment for this clandestine experiment, then it's open season on Google users as well.
          Auna
      • Google

        @Rabid Howler Monkey - yes, that's exactly what I meant.

        I've seen individual Google employees openly condemning Facebook over this (on Twitter and Google+), but not the company. Then just yesterday I found an NPR article about Google's "experimental newsroom" where they're purposely starting to avoid negative headlines: http://www.npr.org/blogs/alltechconsidered/2014/07/09/330003058/in-google-newsroom-brazil-defeat-is-not-a-headline

        We have to wonder if Google is quiet because they want to see what user manipulation they can get away with, too.
        Violet_Blue