Facebook flirting with the Queen of Diamonds

Summary: What could it eventually mean to pass time with the social networking site?


In the 1959 Richard Condon novel The Manchurian Candidate, Sergeant Raymond Shaw begins the slide into his brainwashed alter ego when he is asked a simple question: “Why don't you pass the time by playing a little solitaire?”

When Shaw sees the Queen of Diamonds, he turns into an emotionless killer, a pawn in his communist handlers’ plot to overthrow the U.S. government.

Key to Shaw’s temporary transformation is the need to alter his emotions, to disable his reasoning and knowledge so he can be directed to the prescribed conclusion – in Shaw’s case, murder.

We can only hope that Facebook doesn't have anything that insidious in mind, although its latest privacy and emotions experiment controversy has some end-users threatening to kill their accounts.

Technically, according to Facebook, the 700,000 or so unwitting participants in the (virtual) social site’s (real world) social experiment in 2012 consented to be treated that way by agreeing to Facebook’s terms and conditions.

But what has really been exposed is more evidence that Facebook is still on its same old privacy and trust-less trajectory, which is explained away by its same old string of apologies as noted by Mike Isaac at the New York Times.

As the week wears on, Facebook’s Data Science team is taking in the criticism and says it has implemented stricter guidelines since the emotions study, although it has been noted that Facebook's data-use policy wasn’t updated to include “research” until after the emotions test. Ultimately, the team's actions will speak louder than their updates.

The argument here turns on ethics. Academic research is guided by ethics and the ability to establish trust. The American Sociological Association (ASA) has a five-point Code of Ethics that implores sociologists “to aspire to the highest possible standards of conduct in research, teaching, practice, and service.”

It’s common practice for sociologists to secure actual permission from those involved in their studies. In that light, Facebook’s check-box model of implied consent looks silly.

"I do think this whole incident will cause a lot of rethinking" about the relationship between business and academic researchers, Susan T. Fiske, the editor for the Facebook study and a professor of psychology and public affairs at Princeton University, told the Wall Street Journal.

The spotlight here is on Facebook, but don’t be fooled into thinking it is operating solo. Yahoo, Microsoft, Twitter and Google, among others, are doing their own research on their own mega data sets. All of them have been trusted via terms and conditions in the past, and some have been proven to violate that trust. All these data sets hold great power, and great responsibility for their stewards.

In addition to the ethics question, you have to look at the morals of Facebook’s actions. There is not a single healthy relationship on the planet based on the twisting, turning and manipulation of the other participant’s emotions. So it’s hard to believe Facebook, which is in the relationship business, didn't see this coming.  

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” Facebook researcher Adam Kramer, who holds a doctorate in social psychology, said in his blog "apology" for the emotions experiment. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.” 

That’s an understandable emotion for a company fearing opt-out.

It also raises the question: what was Facebook going to do with that knowledge, if anything? Send a box of Kleenex to the Facebook lonely hearts that don’t have awe-inspiring feeds? Tag as bullies those that do?

At some point in real and in virtual life you have to leave the nest and face the world, where things aren’t always pretty. Not even Facebook can make that go away by segmenting and tuning users for special-purposes or outcomes.

The worrisome part is that Facebook's findings hint at an ability to manipulate a person's emotions, making those people a bit happier or a bit more melancholy. With that ability used at key times what can you influence—a conversation, a movement, a government, a corporate IPO, a competitor, an election, or perhaps just a Privacy Policy update?

Could Facebook deliver the marketer’s version of Raymond Shaw, an unnatural-born consumer? The voter’s version? The retailer’s version? The Zuck version?

Are we talking sentiment analysis or sentiment manipulation? The latter gets you to the desired conclusion more efficiently and more closely aligned with strategy. Raymond Shaw’s platoon mates were brainwashed themselves to regard him as “the kindest, bravest, warmest, most wonderful human being I've ever known in my life.”
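The distance between the two is shorter than it sounds. The mechanics of this kind of study reportedly come down to simple word counting, classifying posts as positive or negative and then skewing which ones a user sees. A toy sketch of that idea follows; the word lists, threshold and names are illustrative assumptions, not Facebook's actual system:

```python
import re

# Toy lexicon-based sentiment scoring, in the spirit of word-count tools
# such as LIWC. The word lists below are illustrative, not exhaustive.
POSITIVE = {"great", "happy", "love", "wonderful", "awesome"}
NEGATIVE = {"sad", "angry", "hate", "terrible", "lonely"}

def sentiment_score(post: str) -> int:
    """Count positive words minus negative words in a post."""
    words = re.findall(r"[a-z']+", post.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

feed = [
    "What a wonderful, happy day",
    "I hate feeling sad and lonely",
    "Meeting friends for lunch",
]

# "Manipulation" is then just selective filtering: suppress the negative
# posts and the remaining feed skews the reader's emotional diet.
positive_feed = [p for p in feed if sentiment_score(p) > 0]
print(positive_feed)  # only the upbeat post survives the filter
```

Once a feed is scored this way, the step from measuring sentiment to steering it is a single filter.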

Maybe it’s no coincidence that The Manchurian Candidate has its own Facebook page.

What might it mean in the future to ask “Why don't you pass the time by doing a little Facebooking?”

Topics: Privacy, Social Enterprise


John Fontana is a journalist focusing on authentication, identity, privacy and security issues. Currently, he is the Identity Evangelist for strong authentication vendor Yubico, where he also blogs about industry issues and standards work, including the FIDO Alliance.



  • What facebook will do . . .

    "It also begs the question, what was Facebook going to do with that knowledge, if anything? Send a box of Kleenex to the Facebook lonely hearts that don’t have awe-inspiring feeds? Tag as bullies those that do?"

    It's known by now that what you see on your wall isn't 100% of everything; Facebook filters it and shows what it thinks is most relevant. There may be friends you are no longer hearing from at all. I'm guessing this research will end up shaping those filters in another move they made: carving out your own little reality for you under the guise of "well, you wouldn't be able to keep up if we let you see everything."
  • What Will They Do?

    The study altered news updates -- feeding mostly good or mostly bad updates to different users. The result is Facebook discovered it could put folks in either positive or negative moods. This is powerful stuff because it means Facebook has the ability to alter individuals' moods and connect those modifications with business partners. When someone feels bad, there's one set of business partners best positioned to exploit that feeling. When someone feels good, there's another set of business partners best at exploiting that feeling.

    It's very powerful to have an intimate profile on someone based on years of online interactions with their friends, lovers and family. Business partners will pay big bucks to target specific individuals whose wants and needs fit their products. Got it? Good. On top of that, how much more powerful is it for Facebook to change the timing of its news updates to alter those individuals' moods for these same business partners? I don't know about you, but it makes the hair stand up on the back of my neck. Not only can you stimulate me with business partners whose products and services are a natural fit for my personal tastes, but you can alter my mood to potentially enhance my vulnerability to purchasing them. Powerful? You bet it is...
    • Ads, ads, ads, what else?

      It's really strange to me how the author didn't explore this advertiser-manipulation angle. Glad you did it, nice comment.