Angry about Facebook's emotion experiment? The problem lies with you as much as Facebook

Summary: If you don't like Facebook's recent mood-manipulation research, then there's one easy way to get the company to change its ways.

Should we really be surprised that Facebook has botched handling of users' data? Image: Facebook

So Facebook has been tweaking its users' news feeds to make them feel miserable. Why all the surprise? That company has been making me feel miserable for years.

If I sound like I'm being flippant, please forgive me – I'm really not. The outrage — some manufactured, some genuine — is not misplaced, but the shock that accompanied Facebook's announcement of its recent psychological experiment most definitely is.

My only real puzzlement over this experiment is that Facebook chose to reveal it publicly without realising the bad feeling it would generate among its users. For a company that prides itself on gathering every gatherable bit of data on its users to develop in-depth profiles of their lives, it spectacularly failed to guess that those users would be unimpressed by its decision to alter their moods.

Details of the experiment, conducted with Cornell University researchers, emerged over the weekend. Facebook revealed that it had altered the content of nearly 700,000 users' news feeds, deprioritising items of a particular emotional timbre — either happy or sad — over the course of a week to see what effect the changes would have on those users' moods.

According to the study, it turns out we tend to keep in tune with our social environment. If we see our friends are happy, we feel the same too; if they're having a bad day, it rubs off on how we feel as well.

Once the nature of the experiment hit the headlines, a wave of hand-wringing followed as users and commentators complained that they felt they had been played; that Facebook was guilty of manipulating them without their knowledge.

Perhaps those decrying the site's actions have forgotten that this is pretty much how Facebook works. Facebook already adjusts what users see according to what its algorithms dictate — the friends we're most in contact with, our interests, the adverts we may respond to.

It regularly tries different layouts and algorithmic tweaks that affect our newsfeeds — it's just A/B testing designed to keep the service sticky, and make sure that Facebook gathers as much data as possible to better target ads. The difference between that and the 'emotional contagion' experiment is that Facebook chose to make users aware of what it was doing.
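
For the curious, A/B testing of this kind is typically implemented by hashing each user into a stable bucket per experiment. The sketch below is purely illustrative, a minimal Python example under those assumptions; it says nothing about Facebook's actual systems, and all names in it are hypothetical.

```python
# Purely illustrative sketch of deterministic A/B bucketing.
# None of this reflects Facebook's actual code; names are hypothetical.
import hashlib

def assign_bucket(user_id: str, experiment: str,
                  buckets=("control", "variant")) -> str:
    """Place a user in a bucket for a given experiment.

    Hashing the user ID together with the experiment name means each
    user always sees the same variant of a given test, while different
    experiments split the user base independently of one another.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

if __name__ == "__main__":
    # Example: split users between an unchanged feed and a tweaked one.
    for uid in ("alice", "bob", "carol"):
        print(uid, assign_bucket(uid, "newsfeed-tweak-2014"))
```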

The presence of university researchers gives this particular giant A/B test a patina of respectability and altruism that its other such experiments lack. While the results of the study may have proved useful to psychologists and other behavioural health experts, that isn't really Facebook's interest here. Remember, it was Facebook's own data scientists who designed the experiments, not Cornell's researchers, who only got access to the results after the experiment concluded.

It's not been shy about expressing its own self-interest either: "The goal of all of our research at Facebook is to learn how to provide a better service," Adam Kramer, one of the data scientists involved in the experiment, wrote in a recent blogpost acknowledging the upset it caused.

Facebook didn't explicitly ask users for their consent to be involved in the experiment, as is regarded as best practice in research involving human subjects. It relied instead on users' blanket permission under its terms of service, and only explicitly noted in its data-use policy that users' information could be used for research after the experiment had concluded. Cue the shock and outrage.

Facebook, however, maintains it already had permission to use data for such purposes. "When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer," a spokesperson for the company told Forbes.

"To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word 'research' or not."

Again, I can't help but wonder how a company that spends so much time, money, and effort on garnering information on its users can still understand so little about them, and how Facebook's research into its customers' moods left it misjudging the public's mood so badly. Even in acknowledging the negative reaction to the experiment, Facebook's spokespeople must tell us the fault lies with our understanding, not their handling, of its data-use policies.

Why has Facebook been so tin-eared with regard to this particular experiment? Because it has been allowed to get away with paying scant regard to users' wishes on how it handles their data for so long. Why should this time be any different?

Facebook has made unpopular move after unpopular move when it comes to handling users' data — the facial-recognition software farrago of a couple of years ago being a perfect example. Facebook users, of course, have the ability to opt out of such new features, but the company tends not to publicise when they launch, nor where users can disable them — generally tickboxes buried in the preferences section. The decision to tell users that their data could be used for research was handled in much the same way.

Part of the motivation for the 'emotional contagion' experiment was "we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook", Kramer said. I strongly suspect that Facebook has learnt more than it could have hoped for on that account. Despite the outrage, again, about data-handling, did any of its users close their accounts?

The experiment is a clear demonstration that Facebook has learnt no lessons from its previous data-use foul-ups, and in not deserting the service after seeing their objections ignored, neither have its users. We are getting exactly the social network we deserve. If we want the company to change, there's only one way forward.

Topics: Privacy, Social Enterprise

Talkback

74 comments
  • Well said,

    to me this is the key question: "despite the outrage, again, about data-handling, did any of its users close their account?"

    Unfortunately, whining and complaining has become an increasingly popular pastime in the USA. Instead of taking action, many folks just want to complain, probably on Facebook!
    Low_tech
    • Close no; Use it, not really

      I've had a Facebook account since its public inception. This was more of a way to keep in touch, internet-socially, with family and distant friends. After a while I really questioned the personal benefit outside of the "keep in touch". I had better ways to gather, read, discuss and ponder information.
      So Facebook was relegated to very basic "keep in touch" use and as a login for some sites.
      Glad I did.
      rhonin
  • closing your account may not actually change their ways

    They already have your data. The experiment continues ...

    {There are users on "dating sites" that have been there for years without a single profile or picture change}

    ... it seems you can check out but you can never leave.

    If you haven't signed up... you have a way forward... never sign up!
    greywolf7
    • You have a very fatalistic view of life...

      I got off Facebook about 3 years ago, primarily because too many people on it use it instead of reality. I am not interested in constant reminders of how little your average "Facebook'er" deals with social reality, personal relationships or communicates with real people and ideas. My bowel movements and minor aches and pains are only barely worthy of my attention. They certainly should not qualify as communication with everyone I know. Get off, and get a life.
      robertcape@...
    • I wish there was a way

      To close my FB account and still have access to the business-related sites that I need and want to follow.
      OldGrayWolf
      • Quite!

        The relief I gained from deleting my account over two years ago is beyond measure.

        I cannot understand why so many sites / businesses only give FB as the way in or ability to comment. Surely they know that they are shutting some of us out...
        dumb blonde
        • I Skip those Sites

          Any Site that only allows Logins via Facebook is not worthy of my attention.
          I refuse to fit into their truncated view of the real demographic makeup out here in cyberspace.
          PreachJohn
  • thoughts

    "If I sound like I'm being flippant, please forgive me – I'm really not."

    Keep repeating that to yourself - if you say it to yourself enough times, it'll become true, right?

    "Facebook didn't explicitly ask users for their consent to be involved in the experiment, as is regarded as best practice in research involving human subjects."

    . . . which should probably be made illegal, to be honest. I'm surprised such a so-called "best practice" has gone unquestioned for so long.

    "Facebook's spokespeople must tell us the fault lies with our understanding, not their handling, of its data-use policies."

    Blaming the user is passing the blame rather than accepting it, which is not how things ought to work. Facebook carried out the experiment, the users didn't - therefore yes, absolutely, the fault is squarely on Facebook for the results, including any outrage that may have resulted from carrying out the experiment.

    "despite the outrage, again, about data-handing did any of its users close their account?"

    With a user base as large as Facebook's, it's actually quite likely that some did. It would be presumptuous of you to assume otherwise.
    CobraA1
    • No, it shouldn't

      Since a subject being aware of the experiment often invalidates the results, what you're asking for is banning the psychological sciences. Is that really what you want?
      Buster Friendly
      • Not true.

        Participants are generally told they will be part of an experiment, or that their data may be used for experiments.

        The general idea in this case should have been to ask a broad selection of people if they would like to take part, and then test only a percentage of those people. This is how, for example, experiments into the placebo effect are done, to my knowledge.
        ramsey2510
        • Not the case

          That's just not the case. You cannot have valid results if people know the situation might be manufactured.
          Buster Friendly
          • Can't agree more

            Psychological experiments work better when the subjects are not informed about the consequences of their behavior.
            notadatascientist
          • You're decades out of date

            No, you're wrong. 'Informed consent' has been the foundation of research ethics for decades now. This is why we still hear about old experiments like the 'Stanford Prison Experiment' in modern papers and articles: for better or for worse, studies like that, where the subjects have no idea what they are subjects of, have not been allowed in academic research for decades. Apparently FB (and other corporations) have found ways around this.
            scottabc
          • It's Cornell

            It was Cornell's experiment and not Facebook's, despite the fictional headlines on the topic. So, you're saying Cornell has no idea what it's doing?
            Buster Friendly
          • Yes, Cornell has no idea what they are doing

            Proof:

            1. The protocols necessary for a clinical study of any usefulness were not followed.
            2. The topic does not need testing, for we all know the human race is herdbound, reciprocation is core to our being.
            3. Psychological results which are useful on this topic have been known and used for DECADES by moviemakers, ad men, writers, anyone with something to sell.

            If I am angry and you are around me, you will either be angry at me in reciprocation (if you dislike me or just feel like disliking what I say at that moment), or you will be angry with me (if you fancy some kind of affinity to the 'me' you think you see).

            If it became popular tomorrow to wear panties on your head (male or female version) in order to be 'smart', then worldwide within months, that is what everyone would do.

            Hitler knew this well. So did Stalin and Mao and FDR and Constantine with his newly-wrangled 'catholic' church. So did all the Greeks. Puleese.
            brainout
          • Single Blind Is Possible With Volunteers

            You ask for volunteers, and you explain that some will have their feeds altered over a time period, in that some messages will change in priority and longevity. You split the volunteers and do nothing to the feeds of 12.5% of them. The other 87.5% get varying degrees of happy and unhappy changes. You do not tell the volunteers which group they are in, and you look at the results for all of them.

            Alternatively, no alteration of feeds, but select a group of random users, qualify their feeds as to degrees of happy/unhappy and look for any correlation in their responses.

            I ask: if the assumption is that Facebook can effectively measure how happy the items are, and it has a large dataset, why was altering feeds necessary? But if it was, then as a matter of experimental ethics, and as a matter of not irritating the customer, there should have been explicit consent.
            DannyO_0x98
        • Timing matters

          They are told about the experiment, sometimes only after the experiment has taken place. That is why this one was revealed: because it was actually used for scientific research. $20 says that there is a proviso in the terms and conditions that everyone clicks through stating that Facebook has the right to use the data it collects for scientific purposes.
          richlake
          • Here's an example

            Here's an example of why you can't tell them. Say you want to observe reactions to a loud argument in the school library. Do you stand at the door with consent forms, turn away anyone who doesn't sign, and then start the argument? Would you get genuine reactions, or would absolutely everyone know it's staged and just laugh? You have to eliminate any chance they might suspect it's not a real situation, or you're not doing valid science.
            Buster Friendly
          • You don't have to say there is a loud argument coming

            You can merely tell them an experiment is being conducted. Doing this correctly, with no previous reports of this particular experiment being done, will result in genuine reactions.
            grayknight
          • No.

            There wasn't anything in the Ts and Cs stating that data could be used for scientific purposes prior to the experiment. It was added after the experiment took place.

            Facebook used the "to provide a better service" clause in the terms and conditions as a pretext to justify the experiment.

            There was another article posted here yesterday, I believe, stating so. The UK's Information Commissioner's Office is also probing the incident, due to a possible breach of EU human-rights law.
            ramsey2510