
Angry about Facebook's emotion experiment? The problem lies with you as much as Facebook

If you don't like Facebook's recent mood-manipulation research, there's one easy way to get the company to change its ways.
Written by Jo Best, Contributor
Should we really be surprised that Facebook has botched handling of users' data? Image: Facebook

So Facebook has been tweaking its users' news feeds to make them feel miserable. Why all the surprise? That company has been making me feel miserable for years.

If I sound like I'm being flippant, please forgive me – I'm really not. The outrage — some manufactured, some genuine — is not misplaced, but the shock that accompanied Facebook's announcement of its recent psychological experiment most definitely is.

My only real puzzlement over this experiment is that Facebook chose to reveal it publicly without realising the bad feeling it would generate among its users. For a company that prides itself on gathering every gatherable bit of data on its users to develop in-depth profiles of their lives, it spectacularly failed to guess that those users would be unimpressed by its decision to alter their moods.

Details of the experiment, conducted with Cornell University researchers, emerged over the weekend. Facebook revealed that it had altered the content of nearly 700,000 users' news feeds, deprioritising items of a particular emotional timbre — either happy or sad — over the course of a week to see what effect the changes would have on those users' moods.

According to the study, it turns out we tend to keep in tune with our social environment. If we see our friends are happy, we feel the same too; if they're having a bad day, it rubs off on how we feel as well.

Once the nature of the experiment hit the headlines, a wave of hand-wringing followed as users and commentators complained that they felt they had been played; that Facebook was guilty of manipulating them without their knowledge.

Perhaps those decrying the site's actions have forgotten that this is pretty much how Facebook works. Facebook already adjusts what users see according to what its algorithms dictate — the friends we're most in contact with, our interests, the adverts we may respond to.

It regularly tries different layouts and algorithmic tweaks that affect our newsfeeds — it's just A/B testing designed to keep the service sticky and to make sure Facebook gathers as much data as possible to better target ads. The difference between that and the 'emotional contagion' experiment is that Facebook chose to make users aware of what it was doing.

The presence of university researchers gives this particular giant A/B test a patina of respectability and altruism that its other such experiments lack. While the results of the study may have proved useful to psychologists and other behavioural health experts, that isn't really Facebook's interest here. Remember, it was Facebook's own data scientists who designed the experiments, not Cornell's researchers, who only got access to the results after the experiment concluded.

It's not been shy about expressing its own self-interest either: "The goal of all of our research at Facebook is to learn how to provide a better service," Adam Kramer, one of the data scientists involved in the experiment, wrote in a recent blogpost acknowledging the upset it caused.

Facebook didn't explicitly ask users for their consent to take part in the experiment, as is regarded as best practice in research involving human subjects. It did get users' blanket permission when they signed up, however, although the company only explicitly noted in its data-use policy that their information could be used for research after the experiment had concluded. Cue the shock and outrage.

Facebook, however, maintains it already had permission to use data for such purposes. "When someone signs up for Facebook, we've always asked permission to use their information to provide and enhance the services we offer," a spokesperson for the company told Forbes.

"To suggest we conducted any corporate research without permission is complete fiction. Companies that want to improve their services use the information their customers provide, whether or not their privacy policy uses the word 'research' or not."

Again, I can't help but wonder how a company that spends so much time, money, and effort on garnering information about its users can still understand so little about them, and how Facebook's research into its customers' moods left it misjudging the public mood so badly. Even in acknowledging the negative reaction to the experiment, Facebook's spokespeople insist the fault lies with our understanding, not their handling, of its data-use policies.

Why has Facebook been so tin-eared over this particular experiment? It has been allowed to get away with showing scant regard for users' wishes on how it handles their data for so long that, from its point of view, there's no reason this time should be any different.

Facebook has made unpopular move after unpopular move when it comes to handling users' data — the facial-recognition software farrago of a couple of years ago being a perfect example. Facebook users, of course, have the ability to opt out of such new features, but the company tends not to publicise when they launch, nor where users can disable them — generally tickboxes buried in the preferences section. The decision to tell users that their data could be used for research was handled in much the same way.

Part of the motivation for the 'emotional contagion' experiment was that "we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook", Kramer said. I strongly suspect that Facebook has learnt more than it could have hoped for on that account. Despite the outrage, once again, over its data-handling, did any of its users close their accounts?

The experiment is a clear demonstration that Facebook has learnt no lessons from its previous data-use foul-ups, and in not deserting the service after seeing their objections ignored, neither have its users. We are getting exactly the social network we deserve. If we want the company to change, there's only one way forward.

