
We're all just lab rats in Facebook's laboratory

Facebook has always controlled what we see in our news feeds. Now we know they've experimented on us to see what messages make us sad or happy.
Written by Steven Vaughan-Nichols, Senior Contributing Editor

We're all guinea pigs for Facebook. The world's most popular social network is always tweaking our settings to find just the right mix of ads and real posts. That's old news. What we didn't know is that Facebook manipulated more than half a million randomly selected users, changing the number of positive and negative posts they saw to see how it would affect their emotional state.

[Image caption: Will you be my Facebook friend?]

I'm sorry, I wasn't aware I'd signed up to be a Facebook lab rat.

Mind you, I expect Facebook to play games with what I see in my news feed, even posts from my closest friends, in their pursuit of the almighty dollar.

Oh, you didn't know that Facebook does that? They do. According to research by Jon Loomer, a Facebook marketing expert, any given Facebook post may be seen by 27.4 percent of your friends or fans who are online when it's posted.


What I didn't expect was to find that Facebook had played with the emotions of 689,000 of us during one week in January 2012. By rigging users' news feeds, the researchers found that people who saw fewer positive posts tended to be more negative in their own posts, while those who saw fewer negative posts tended to be more positive. What a shock!

What Facebook scientist Adam Kramer and his comrades did was almost certainly legal. If you read the fine print of Facebook's Data Use Policy, you'll see that when you sign up you give Facebook the right to use almost all of your personal data, not just for advertising and feature improvement, but also "for internal operations, including troubleshooting, data analysis, testing, research and service improvement." That last clause gives Facebook the right to do pretty much anything they want with your data.

Of course, Kramer and Facebook claim that we all consented to this. The relevant phrase in the paper, "Experimental evidence of massive-scale emotional contagion through social networks," reads, "Linguistic Inquiry and Word Count software … was adapted to run on the Hadoop Map/Reduce system and in the news feed filtering system, such that no text was seen by the researchers. As such, it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research."

Really, that one click I made years ago to get on your network was informed consent? Wow, I certainly thought that over carefully before I hit my mouse button. 

Come on, Facebook, why the hell didn't you just ask for volunteers? With over a billion users, I think you could have easily found a few hundred thousand real volunteers for your study.

Kramer has since... well, I wouldn't call it an apology, but he has explained that he and his fellow scientists "care about the emotional impact of Facebook and the people that use our product." Besides, "the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week."

And this: "Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

You think?

You know what adds salt to the privacy wound? The study itself appears to be fatally flawed. 

Dr. John Grohol, founder and CEO of Psych Central, a mental health site, argues that, for starters, the Linguistic Inquiry and Word Count program is the wrong tool for this kind of study. And even if you did buy into the study after that, "you’re still left with research showing ridiculously small correlations that have little to no meaning to ordinary users. For instance, Kramer et al (2014) found a 0.07% — that’s not 7 percent, that’s 1/15th of one percent!! — decrease in negative words in people’s status updates when the number of negative posts on their Facebook news feed decreased. This isn’t an 'effect' so much as a statistical blip that has no real-world meaning."

So Facebook did all this, and annoyed God knows how many people, for basically nothing.

I've never trusted Facebook to keep my data private or to stop manipulating my news feed — if I have to switch my Facebook news feed from Top Stories back to Most Recent one more time, I'm going to scream! — but I didn't expect to see Facebook playing with my emotions.

Stop it, Facebook. Stop it now. And, never, ever do anything like this again.
