Facebook played with our emotions in data experiment. So what?

Summary: Facebook experimented with the emotions of almost 700,000 users in a week-long experiment in January 2012. So why is the Internet so upset?

Facebook has published a report showing that it played with data from 689,003 out of its 1.3 billion users in 2012. After it reported its findings, it was accused of emotionally manipulating its users.

Image: National Academy of Sciences

The experiment, published in a paper in the June issue of Proceedings of the National Academy of Sciences, inserted emotionally skewed posts into people’s news feeds without their knowledge, to test what effect this had on the statuses users then posted and the posts they "liked" or reacted to.

The Facebook team ran its tests on users’ emotions for one week in January 2012.

It wanted to see whether posts with more emotional content were more engaging.

It showed a selection of users different emotional varieties of posts in their feeds: some users received posts with positive sentiments, while others received posts with more negative emotions.

Facebook wanted to see whether this altered people’s Facebook behaviour and emotions.

The researchers wanted to test whether "emotional contagion" occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed.

Interestingly enough, the experiment seemed to work: people who saw positive posts tended to write in a more positive way, while people who saw negative posts did not.

The study found that when positive expressions were reduced in people’s feeds, people produced fewer positive posts and more negative posts. However, when negative expressions were reduced, users wrote more positive posts.
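The published study measured emotion by simply counting emotion words in status updates, using the LIWC word lists. As a rough illustration only, here is a minimal Python sketch of that kind of word-count scoring; the tiny word lists and example posts below are made-up placeholders, not the study's actual data:

```python
# Word-count sentiment scoring in the spirit of the study's LIWC approach.
# POSITIVE/NEGATIVE are tiny placeholder lists; the real study ran the
# full LIWC dictionaries over millions of status updates.
POSITIVE = {"happy", "great", "love", "awesome"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def emotion_rates(posts):
    """Return the fractions of words that are positive and negative."""
    words = [w.lower().strip(".,!?") for p in posts for w in p.split()]
    total = len(words) or 1  # avoid division by zero on empty input
    pos = sum(w in POSITIVE for w in words) / total
    neg = sum(w in NEGATIVE for w in words) / total
    return pos, neg

# Compare what each experimental group went on to post.
reduced_positivity_group = ["Feeling sad today", "What an awful commute"]
control_group = ["Great day with friends", "Love this weather"]

print(emotion_rates(reduced_positivity_group))  # higher negative fraction
print(emotion_rates(control_group))             # higher positive fraction
```

The paper itself notes the effect sizes were tiny, but with hundreds of thousands of users even small differences were statistically detectable.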

Facebook has been accused of manipulating emotions. Well, get over it, Facebook users. If you are a Facebook user, you willingly give Facebook every bit of data it has about you.

That data, as soon as you press submit, is data that Facebook can do whatever it wants with. Whilst you might not have explicitly agreed to this at the time you signed up for the service, the agreement to use your data for research is now there in its terms of use.

Facebook’s data tests are not new. Facebook regularly manipulates what you see. It changes its hugely complex algorithm to show you fewer posts from people you don’t interact with often.

You see fewer posts from brands that have huge numbers of followers, and you see different types of ads in your sidebar.

Facebook has been doing this for years.

It says it does this to give you a better experience on Facebook. These "experiments" on your emotions allow it to gauge how best to present you with the information you want.

The Data Science team at Facebook has a huge amount of data to play with, and this experiment was part of its effort to show users the items in their feeds that they really want to see.

The report shows that "emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness".

The report also states that it provides "experimental evidence that emotional contagion occurs without direct interaction between people".

It says that having exposure to a friend expressing an emotion is sufficient to cause this effect. 

The company uses its EdgeRank algorithm carefully to ensure that you see the posts from the people you are most likely to engage and interact with.
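Facebook has publicly described EdgeRank as the product of three factors: your affinity with the poster, a weight for the type of interaction, and a time decay. The exact functions have never been published, so the sketch below is purely illustrative, with assumed weights and an assumed exponential decay:

```python
# Illustrative EdgeRank-style scoring. The affinity values, edge weights
# and decay constant here are assumptions; Facebook has only described the
# general affinity x weight x time-decay structure, not the details.
import math

EDGE_WEIGHTS = {"like": 1.0, "comment": 2.0, "share": 3.0}  # assumed values

def edgerank_score(affinity: float, edge_type: str, age_hours: float) -> float:
    """Higher scores mean the story is more likely to surface in your feed."""
    time_decay = math.exp(-0.05 * age_hours)  # fresher stories rank higher
    return affinity * EDGE_WEIGHTS[edge_type] * time_decay

# A close friend's fresh comment outranks a like on a rarely-visited brand.
print(edgerank_score(affinity=0.9, edge_type="comment", age_hours=2.0))
print(edgerank_score(affinity=0.1, edge_type="like", age_hours=2.0))
```

Under a model like this, lowering the affinity term for brands you ignore is all it takes to push their posts out of your feed, which is exactly the kind of routine manipulation described above.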

You will see the images and videos that you want to share and return to.

The premise here is that if you see something in your feed that pleases you, you are more likely to return to Facebook and spend more time there.

This enables Facebook to show you more relevant ads — ads that you are much more likely to click on, thereby increasing Facebook’s revenue.

Sheryl Sandberg assured users that Facebook could not control users or their emotions.

"We take privacy and security at Facebook really seriously because that is something that allows people to share opinions and emotions." said Sandberg.

A previous Facebook study showed user influence over election votes. This led to comments that Facebook had mobilised voters.

Facebook has apologised for the way that this information was released. The Internet has moved on to its next concern.

Facebook’s data scientists will go on to their next big data-crunching experiment with the data we have willingly provided as we interact with our friends and favourite brands.

With — or without — our consent.

Topics: Social Enterprise, Big Data, Data Management

Talkback

  • Reminds me of

    That reminds me of all the subliminal message conspiracy theories back in the 70s and 80s. There were even a couple of sci-fi movies about it.
    Buster Friendly
  • Privacy invasion is unacceptable.

    Don't be stupid by making silly excuses to justify Facebook/Google's privacy invasion.

    Facebook must get user consent first before doing experiments. This is the 21st century.
    Owl:Net
    • Nope

      That's completely not true. Cornell could maybe, if you really stretched definitions, be guilty of an ethics violation, but it's not likely since it did not perform the experiment. Marketing is completely about trying different approaches and observing the results. If you think it's something else, explain.
      Buster Friendly
  • Understand research ethics

    You have to understand that every participant in any ethical research project at any university must have given informed consent. This is clearly defined in US law. That is, research subjects must understand the project, its aims and how their data is going to be used. This clearly hasn't happened, and the universities involved would appear to have committed a serious breach of ethical research practices.
    madscientistdownunder
    • That's not how it works

      Every commercial you see on TV is trying to manipulate your emotions, and advertisers observe the results with various mechanisms to decide how to adjust them. Websites work basically the same way. Writing an academic paper on how commercials are done does not mean you were doing the experiments. Having some influence on how it was structured is a grey area, but that's 100% an issue for Cornell and not Facebook.
      Buster Friendly
  • ..and

    this must be done for each person taking part. Permission to use data is NOT informed consent.
    madscientistdownunder
  • this blog covers the issues well

    http://www.sciencebasedmedicine.org/did-facebook-and-pnas-violate-human-research-protections-in-an-unethical-experiment/

    Agreed, it's an issue for the universities and the journal itself, as Facebook has not received government research money and is not seeking FDA approval.
    madscientistdownunder
    • Did Cornell Univ. receive federal government research money?

      As near as I can tell, the experiment was funded by Facebook, Inc. If this is true, then the "Common Rule" does not appear to be relevant:

      "What is the Common Rule?
      The Common Rule is a short name for “The Federal Policy for
      the Protection of Human Subjects” and was adopted by a
      number of federal agencies in 1991. Each agency incorporated the policy into its own code of Federal Regulations (CFR), with VA adapting it in Title 38 CFR Part 16, and the Department of Health and Human Services (DHHS) in 45 CFR Part 46, Subpart A."

      Source:
      http://www.research.va.gov/programs/pride/resources/Common_Rule_Flyer.pdf
      Rabid Howler Monkey
  • We actually have a model for this type of manipulation.

    It's called the Holocaust.

    How many people were manipulated to do something that is morally reprehensible during the Holocaust?

    Too many. Far too many.

    Manipulating people isn't some dreamy harmless thing. It's real, and it's a perfectly valid concern.

    That's the answer to your "So what?"

    I know you WANT us to take this lightly, but that's the wrong approach. In the wrong hands, yes this can go horribly wrong. It's not a light concern, and I think it is perfectly valid for us to question it.

    I know it makes you emotional when people take it seriously, but it's the right thing to do to question it.
    CobraA1
    • LOL

      Take any topic and someone will go Nazi, no matter how silly. It's standard Internet hyperbole, also known as Godwin's law.
      Buster Friendly
      • Well, that point sailed over your head.

        Sure, it's a rather extreme example - but I think it does make a point: That it's a real concern, and should not be taken lightly.
        CobraA1
        • LOL

          Went from a Godwin to a personal attack? Just stop now and save face.
          Buster Friendly
          • Sounds legit.

            Okay, so you take my first sentence as a personal attack and totally ignore the second?

            Sounds legit.
            CobraA1
          • You weren't personally attacked

            your quick flight to the ancient Godwin meme, rather than commenting on the substance of CobraA1's comment, was critiqued. There is a difference.
            Mac_PC_FenceSitter
  • Apology

    When even the company that ran the study, that being Facebook, issues an apology for running the experiment, your argument for excusing this sort of ethical malpractice becomes invalid.

    http://beforeitsnews.com/science-and-technology/2014/07/news-feed-study-leads-facebook-to-issue-a-public-apology-2705176.html
    ozgothic
    • Proof by pandering??

      You think a proof by pandering is logical? When you do marketing, you say what you think people want to hear and not what is correct. The only thing I see is a bunch of childish academics trying to pass the buck. If Cornell was wrong and the journal failed to check the procedures, they need to admit their mistake instead of blamestorming. Working with academics is often like working with teenagers who think they can do no wrong.
      Buster Friendly
  • Eileen, do you not get it?

    Manipulating people to be more down than they are can have some pretty serious consequences.

    This is NOT OK, and furthermore I suspect you well know it.
    Mac_PC_FenceSitter
    • well... Magazines for women have been doing it for years.

      Promoting overthin, malnourished women as the ideal, and accusing all the rest of being fat slobs.

    Have they been called on it? Yes.

    Legal repercussions? Nope.
      jessepollard
  • Simple solution

    First, not having a Facebook account solves all these problems if you are concerned about how your data is used and manipulated. People have to understand that nothing is free.
    You might not pay for your Facebook account directly with money, but you do pay for it.

    "Facebook’s data tests are not new. Facebook regularly manipulates what you see. It changes its hugely complex algorithm to show you less posts from people you don’t interact with often."

    And while I don't have a Facebook account, the above statement typifies the problems with corporations trying to interpret the data they collect. I would want to be in control of whose posts I see or don't see. I want to be in control of what products or web pages are displayed to me. I don't want Facebook or Google or Microsoft or Apple or Verizon or AT&T or anyone else looking at my data and deciding for me what I get to see or not see on the internet or anywhere else.
    T-Rexx
  • Time to move to Google+

    "Well, get over it Facebook users. If you are a Facebook user, you willingly give Facebook every bit of data it has about you."

    I am against privacy but that has absolutely nothing to do with this. I didn't read past that; this is a troll article.

    This is absolutely scandalous. Move to Google+, people. It's better in the first place.
    JopV