Facebook only got users' permission to do research after 'emotional contagion' study finished

Summary: Facebook says it nonetheless had permission to run its emotion-altering study, even without the term "research" in its data use policy.

Facebook didn't include the term "research" in its policy on the use of personal data until four months after its 'emotional contagion' experiment on nearly 700,000 unwitting users had concluded.

Not many people read a company's data use policy when they sign up for a service, as such documents are often long, complicated, and lacking in context. But Facebook's policy has come into focus this week as debate continues over whether its data scientist Adam Kramer had — or even needed — 'informed consent' to run a one-week experiment in January 2012 in which the news feeds of 689,003 users were altered to see how negative or positive posts from friends affected the subjects' subsequent posts.

An ethical debate erupted this week after a blog post on Animal New York drew attention to the results of the study, which Kramer and researchers from Cornell University and the University of California published in June. The post likened the experiment to Facebook "using us as lab rats".

The paper outlines the researchers' exploration of how 'emotional contagion' works when people aren't in physical proximity. Kramer also explained, in a blog post responding to the outrage, that the original paper didn't outline Facebook's motivations adequately: the company conducted the experiment, he said, because it was worried that negative posts would cause people to stop using its product.

"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper," he wrote.

Critics and supporters of Facebook's right to conduct the study have homed in on the term "research" in its data use policy. The key sentence states that the company will use data "for internal operations, including troubleshooting, data analysis, testing, research and service improvement".

Critics have said that simply including the term "research" in dense documents such as data policies is not enough to gain informed consent. Supporters have likened Facebook's experiment to A/B testing, in which companies show two versions of a page or feature to different user groups to see which performs better.

However, the focus on the term 'research' may be a little misplaced: as Forbes reported today, Facebook's data use policy didn't include the term at all at the time the study ran, and it was only added in May 2012.

According to Facebook, while it felt it necessary to add the term, it doesn't believe its inclusion was needed in order to run experiments in the name of improving its own services.

"When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer," a Facebook spokesperson told Forbes.

"To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word 'research' or not."

About Liam Tung

Liam Tung is an Australian business technology journalist living a few too many Swedish miles north of Stockholm for his liking. He gained a bachelor's degree in economics and arts (cultural studies) at Sydney's Macquarie University, but hacked (without Norse or malicious code, for that matter) his way into a career as an enterprise tech, security, and telecommunications journalist with ZDNet Australia. These days Liam is a full-time freelance technology journalist who writes for several publications.

Talkback

  • Duh

    Psychology studies generally don't work if the subjects know about them.
    Buster Friendly
  • You missed the point

    The point of the alleged research:
    "'The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper,' he wrote."

    IOW, the point was to provide another data point showing that you can subdue the masses by controlling what they see, read, and hear, and how far you can take them. Truly George Orwell '1984' stuff. This was not a psychology study. We already know that repeating a lie over and over again can convince people the lie is true. The Nazis taught us that. We already know that being a generally happy, decent person who can be positive with others makes friends, and that those friends want to continue being around us. Who doesn't know this? This 'research' was about how to mind-control subjects into not leaving Facebook. This was not a study. I don't believe that how people react to positive or negative content is really in dispute or in need of further study.

    And since this was published after the fact so everyone could read it, the purpose of the 'research' is no longer just Facebook's. The more interesting question is why this was done at all when the result is pretty predictable.
    eightmileshigh
    • Good job

      Nice...you got Orwell AND Nazis in there. We know very well that people can be manipulated by propaganda. It's been a tool for as far back as we have recorded history.

      This Cornell study was to learn about online social dynamics and mental health.
      Buster Friendly