Facebook: Unethical, untrustworthy, and now downright harmful


Summary: News of Facebook experimenting on its users' emotional states has rattled everyone. Worse, the tool used to perform the experiments is so flawed there's no way of knowing if users were harmed.


If there's one thing we've learned from zombie movies, it's that when the word "contagion" is associated with humans getting experimented on without their knowledge at the hands of a cold, massive corporation -- things never end well.

On June 2, the Proceedings of the National Academy of Sciences published "Experimental evidence of massive-scale emotional contagion through social networks." It made headlines last weekend; the reaction can be succinctly described as a 'massive-scale contagion' of fury and disgust.

In "Experimental evidence" Facebook tampered with the emotional well-being of 689,003 unknowing users to see how emotional contagion could be controlled; basically, how to spread, or avoid the spread of, its users' feelings en masse.


Everyone except the people who worked on "Experimental evidence" agrees that what Facebook did was unethical. In fact, the story has gone from toxic pit of ethical bankruptcy to unmitigated disaster in a matter of days.

Cornell University is now distancing itself from involvement in "Experimental evidence." Facebook appears to have been caught changing its Terms to include "research" after the work had been done. And respected academics are now calling the study into question over approval-laundering.

It's not going to get any better when people take a look at the tool Facebook used to do its experiments -- a tool so woefully wrong for the job that no one, including Facebook, will ever know what Facebook actually did to its users' emotional health.

For all of its work thus far studying emotional contagion, Facebook has used the Linguistic Inquiry and Word Count (LIWC2007) tool, though "Experimental evidence" was the first time Facebook used the tool to actively interfere with its users.

LIWC 2007 -- note that date -- was conceived to provide a method for studying the "various emotional, cognitive, structural, and process components present in individuals' verbal and written speech samples."

The tool was created to analyze long-form blocks of text: books, research papers, therapy transcripts, and the like.

Facebook actually doesn't know what it did to half a million people

The Base Rates of Word Usage that LIWC is based on (all prior to 2007) include American and British novels, "113 highly technical articles in the journal Science published in 1997 or 2007," text from studies, random writing assignments, and,

A fourth sample [which] included 714,000 internet web logs, or blogs, from approximately 20,000 individuals who posted either on Blog.com in 2004 or LiveJournal.com in the summer and fall of 2001.

LIWC2007 fails when it comes to short bursts of text -- especially sentences that negate their own emotion words.

The LIWC2007 website offers free use of the tool. Fed a plainly negative status update ("I am not having a great day"), it counts "great" as a positive-emotion word and fails to register that "not" negates it.

(Image: LIWC2007's scoring of the sample status update.)
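
To see concretely why the tool fails here, below is a minimal Python sketch of LIWC-style scoring. The word lists are hypothetical stand-ins (LIWC's real category dictionaries are licensed), but the mechanism is the same one the tool relies on: every word is scored independently, so negation is invisible.

# A toy LIWC-style word counter; POSITIVE and NEGATIVE are illustrative
# stand-ins, not LIWC's actual dictionaries.
POSITIVE = {"great", "happy", "love"}
NEGATIVE = {"sad", "awful", "hate"}

def word_count_score(text: str) -> dict:
    # Score each word in isolation, exactly the way a dictionary
    # word-count tool does -- no grammar, no context.
    words = text.lower().split()
    return {
        "positive": sum(w in POSITIVE for w in words),
        "negative": sum(w in NEGATIVE for w in words),
    }

print(word_count_score("I am not having a great day"))
# -> {'positive': 1, 'negative': 0}: the plainly negative update scores
#    as positive, because "not" never flips the count for "great".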

It's hard to fathom how far things have gone at Facebook that researchers felt entitled to experiment on the emotional state of its users -- removing content from their feeds -- with a tool like LIWC2007.

Absent any post-experiment interviews or debriefing, there is no way of knowing exactly what Facebook did to the emotional temperature of over half a million people.

Conspicuously different from Facebook's other emotional contagion studies

"Experimental evidence" was the third time Facebook studied its users' emotional contagion without their knowledge -- thought it is the first known time Facebook has experimented with controlling the emotions of its users.

Almost four months before "Experimental evidence" appeared in PNAS and started making the neuroscience rounds on Twitter last week, "Detecting Emotional Contagion in Massive Social Networks" was published on March 12, 2014 in PLOS ONE, an international, peer-reviewed, online scientific journal for reports on primary research.

"Detecting Emotional Contagion" was a product of UC San Diego and Yale, with Facebook employees Adam Kramer and Cameron Marlow.

"Experimental evidence" was a product of UCSF's Center for Tobacco Control Research and Education, Cornell University, and Facebook's Adam Kramer, who is listed as the paper's primary contact. The primary contact for the first study is UC San Diego's James H. Fowler.

Marlow was thanked in "Experimental evidence" -- the study that became Facebook's foray into contagion experimentation -- and co-authored a preceding study, "Structural diversity in social contagion" (October 6, 2011, also in conjunction with Cornell and UCSD). That earliest of the studies did not include Facebook's Adam Kramer.

"Detecting Emotional Contagion" ran for 1180 days from January 2009 to March 2012. "The study was approved by and carried out under the guidelines of the Institutional Review Board at the University of California, San Diego, which waived the need for participant consent."

According to "Experimental evidence" researchers, user consent was not necessary because Facebook's Terms stood as agreement -- specifically the word "research" indicated that users agreed to the emotional manipulation experiment because they had clicked "agree" when signing up, or by continuing to use the site after ToS updates.

Whereas "Experimental evidence" hid both positive and negative posts from friends, colleagues and family from users to see if it changed the way users influenced each other's feelings, "Detecting Emotional Contagion" instead examined external influences on users to see if users simply influenced each other's feelings by a naturally occurring, impossible to manipulate occurrence: The rain.

Here, we elaborate a novel method for measuring the contagion of emotional expression. With data from millions of Facebook users, we show that rainfall directly influences the emotional content of their status messages, and it also affects the status messages of friends in other cities who are not experiencing rainfall. For every one person affected directly, rainfall alters the emotional expression of about one to two other people, suggesting that online social networks may magnify the intensity of global emotional synchrony.

UCSD's "Detecting" study noted, "Importantly, rainfall is unlikely to be causally affected by human emotional states, so if we find a relationship it suggests that rainfall influences emotional expression and not vice versa."

Instead of changing the user’s emotion directly with an experimental treatment, we let rainfall do the work for us by measuring how much the rain-induced change in a user’s expression predicts changes in the user’s friends’ expression.
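
Read that way, "Detecting Emotional Contagion" is essentially an instrumental-variable design with rainfall as the instrument: rain can't be caused by anyone's mood, so whatever rain-induced change in a user's posts shows up in friends' posts can be read as contagion rather than shared circumstance. Below is a minimal two-stage simulation of that logic; the variable names and effect sizes are mine, for illustration, not the paper's.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000

rain = rng.binomial(1, 0.3, n)                      # rain in the user's city
user_neg = 0.2 * rain + rng.normal(0, 1, n)         # user's negative expression
friend_neg = 0.5 * user_neg + rng.normal(0, 1, n)   # friend in a dry city

# Stage 1: keep only the rain-induced part of the user's expression.
slope, intercept = np.polyfit(rain, user_neg, 1)
user_neg_hat = slope * rain + intercept

# Stage 2: regress the friend's expression on that rain-induced part,
# isolating contagion from shared moods, news, or environment.
contagion = np.polyfit(user_neg_hat, friend_neg, 1)[0]
print(f"estimated contagion effect: {contagion:.2f}")  # ~0.5, the simulated truth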

Facebook's "Experimental evidence" hypothesis amounted to "let's see if we can plant unhappiness and make it spread." The hypothesis was tested on a large group of people -- and their networks -- that couldn't consent to the experiment, and had no way to actually track whatever impact it had on people's lives.

Facebook, once again, did what it's good at: tracking us, failing to get consent, and avoiding accountability.

Adam Kramer -- who worked on both studies -- posted a non-apology to Facebook that utterly missed the point: the authors were sorry about the way they had described the experiment, while attempting to re-frame user consent as a mere formality.

In classic Facebook style, he blamed users for being upset, as if news of emotional tampering in people's day-to-day lives was simply a misunderstanding that only anxious people worried about.

I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused.

As if Facebook's other, massive studies on emotional contagion never happened, he said:

We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.

Ludicrously, he added, "At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."


However, he admitted that the firm did not "clearly state our motivations in the paper."

Emotional manipulation is a strangely intimate place to discover you're the subject of surveillance-cum-manipulation: even your unguarded moments of sharing feelings are subject to someone trying to get something out of you.

We want to call to account whatever makes this system of control possible, but if Cornell is any example of what to expect from the fallout, no one is going to be held accountable for companies like Facebook recklessly endangering users -- yet again. For those of us observing this spectacle in the sort of self-aware, displaced horror reserved for moments when life and sci-fi dystopia cross shadows, it has never been more clear that Facebook's ideas about organizing society are wholly broken.

Intentionally doing things to make people unhappy in their intimate networks isn't something to screw around with -- especially with outdated and unsuitable tools. 

It's dangerous, and Facebook has no way of knowing it didn't inflict real harm on its users.

We knew we couldn't trust Facebook, but this is something else entirely.


Talkback

185 comments
  • This article

    This article brought to you by Google, the makers of Google+
    Buster Friendly
    • Can't find a reference to Google in her biography

      Maybe you have inside knowledge ;)
      AleMartin
      • Joke

        It was a joke.
        Buster Friendly
        • Maybe, but . . .

          However bad Facebook is, Google is worse.
          Henry 3 Dogg
          • Re: However bad Facebook is, Google is worse.

            How so?
            BoxOfParts
          • Google

            Just read the Terms of Service and Privacy Policies. You do know Google reads all of your email and all of your uploaded data and has the right to do with it whatever they want. They stopped claiming that they own all of your information, but you have to give them the right to do things with your data in order to use their sites.
            hforman9
          • Not any different than any other ISP...

            They ALL read your mail.
            jessepollard
          • _Very_ different than _any_ other ISP!

            Facebook bargains for the lowest-rate third-world labor in the entire industry: the people who review your complaints and reports are paid $1/h, while Google reportedly pays $10/h for security and privacy roles. That is how and why Google will remove reported porn and abusive material within a few hours, while child pornography goes viral for almost 2 days on Facebook. Three teens killed themselves in the past year and a half because Facebook's underpaid police can't remove outrageously abusive bullying videos quickly enough.

            Only Facebook is a monarchical system. Zuckerberg's corporate setup is unprecedented in US corporate history: it grants him 65% of the board voting power while he owns only 30% of the stock. In a public company, mind you. When one man holds 65% of the vote, there is no vote; it's a monarchy. To compare, it takes two founders and a CEO at Google to get to 65% voting power.

            Google does not confuse pornography and breast feeding women, Facebook does.

            Google does not put ads on porn and women abuse material, Facebook does.

            Google does not decide one morning that you cannot message your friends anymore without downloading an app that requires you to grant MIND BLOWING privacy access, much more invasive than the original app. Facebook does.

            Google does not play with your messages to manipulate feelings and their propagation in your networks through your messages, Zuckerberg does.

            Facebook is taking over where Jobs left off, and is becoming the most insidious disease for digital consumers, a title Apple held until recently.

            But you know what, almost every single attorney general in this country petitioned Congress to lower the immunity granted to ISPs by Section 230 of the Communications Decency Act, and guess why... They are going to sue the crap out of companies gone rogue under the cover of impunity.

            I bet you Facebook will be gone with a mind blowing load of lawsuits by end of 2015.
            flexengineer
          • Who are these speed readers...

            Really, tell us how many speed readers it would take to read all our emails sent through Google. They tell you that keywords are picked out by a computer to adjust advertising, which in turn makes Google ads more valued by clients. They are not reading your boring mail. I have been online since 1994 and always assumed someone could intercept any email sent -- this is common knowledge dating back to day one of email.
            mytake4this
          • If anyone uses FB, he is a fool

            As such, he deserves what comes to him.
            Uralbas
          • Facebook issues are well known

            Facebook does not care about your privacy, well-being, or vague concepts like human rights. If you sign up for their "free" service, then you should expect them to use your information in any way that profits the corporation. Sadly, there is no ethical alternative in mass social media.
            RalphEllis
          • How so?

            You're kidding, right?

            What a world we live in.

            Just in case you missed it, Google's current plan, already well under way from the sound of it, is to basically monitor us 24/7. Even our physical being. We know what Google does with metadata. Next they will come up with a headset that reads our thoughts, I'm sure. This whole thing is beyond ridiculous.

            Wake up folks.
            Cayble
          • yes, they are probably in your basement

            Are they in your basement or attic? Is this headset made of aluminum, by any chance? Just a snark - sorry...

            Seriously, I actually get what you are saying. Some things Google does are a tad spooky or creepy, until you realize you are still in control. On your Google phone, simply switch off Google Now if it bothers you -- it doesn't work for everyone. As for ad matching, when you use a discount card at the grocery store or department store, they match up coupons to what you purchased.

            The 24/7 monitoring is something which Google Now and other projects may be shooting for -- but doesn't Google tell you of this? I mean to say, if you tell them about a flight, or when you want to start a newspaper subscription, or put out the cat, you are sharing info so that in return they can send reminders or helpful info about what you are about to do. Nothing too disconcerting now, is it? It is your choice. Turn off the phone or PC -- disable Google Now if it is upsetting, or just not useful. Sometimes it is not. Every person has different wants and needs, and NO company, service, or project is always a match.
            mytake4this
          • yr kidding right?

            Turning off your equipment and thinking it's impossible to be monitored then is somewhat silly... this social media is not a product for we the consumers, it's a datamining tool, we are the products... that's why it's 'free of charge' when nothing in this world is free of charge, we are paying with our information and no power switch can stop that... and all the s.m.a.r.t. stuff is designed to monitor, hence SMART: Self Monitoring And Reporting Technology...
            Oiram6
        • Joke or misdirection

          away from the focus of the article?

          Microsoft is in a deep partnership with Facebook, and I notice the MS loyalists seemingly using misdirection when things like this pop up. Just an observation....
          BoxOfParts
          • Here's another observation

            Microsoft isn't in a deep partnership with Facebook; they invested early for a total of 1.6%! (do a search)

            They then made a nice profit selling off 20%, so if owning about 1% of a company's stock is "deep partnership", then what would you call others' investments much higher than 1%?

            I did notice that some of the anti-MS loyalists are seemingly using misdirection when things like this pop up, implying that MS is in cahoots with Facebook on this, or running the show themselves.

            But hey, it's just an observation....
            William.Farrel
          • MS loyalists???

            You're extraordinarily PATHETIC, BoxOfParts. Brutally so. I notice Google apologists like you looking to drag Microsoft into everything that exists, and then working your ridiculous butts off to turn it into some Microsoft crisis.

            I use Windows on all my computers. I used to have an iPhone, which was great, but I now have a WP* which I feel is better.

            Does that make me a Microsoft loyalist? Maybe. I see myself as a "what works best for me" loyalist.

            And I find no particular good in Facebook. It's living proof that "if it's free," its costs are simply a lot higher than you know about. I have thought for a long time now that Facebook and its operation stink.
            Cayble
      • Buster Friendly

        Has a deep and rather worrying hatred of Google that he can't contain.
        Most normal folk would just stop using them..........
        He has a form of Google Tourette syndrome!
        Boothy_p
        • They are a slime ball company

          Well, they are a slimeball company. The Chrome browser malware hidden in the Flash and Acrobat downloads pretty much proved that. Sneakily signing you up for Google+ was just as bad.
          Buster Friendly
          • Who Pays

            Quite simply, you either pay for a product or you are the product. Sometimes it takes a little of both to keep the lights on and pay the bills.
            MichaelInMA