
Facebook flirting with the Queen of Diamonds

What could it eventually mean to pass time with the social networking site?
Written by John Fontana, Contributor

In the 1959 Richard Condon novel The Manchurian Candidate, Sergeant Raymond Shaw begins the slide into his brainwashed alter ego by being asked the simple question, “Why don't you pass the time by playing a little solitaire?”

When Shaw sees the Queen of Diamonds, he turns into an emotionless killer, a pawn in his communist handler’s plot to overthrow the U.S. government.


Key to Shaw’s temporary transformation is the need to alter his emotions, to disable his reasoning and knowledge so he can be directed to the prescribed conclusion – in Shaw’s case, murder.

We can only hope that Facebook doesn't have anything that insidious in mind, although its latest privacy and emotions experiment controversy has some end-users threatening to kill their accounts.

Technically, according to Facebook, the 700,000 or so unwitting participants in the (virtual) social site’s (real-world) social experiment in 2012 consented to be treated that way by agreeing to Facebook’s terms and conditions.

But what has really been exposed is more evidence that Facebook is still on its same old privacy and trust-less trajectory, which is explained away by its same old string of apologies as noted by Mike Isaac at the New York Times.

As the week wears on, Facebook’s Data Science team is taking in the criticism and says it has implemented stricter guidelines since the emotions study, although it has been noted that Facebook's data-use policy wasn’t updated to include “research” until after the emotions test. Ultimately, the team's actions will speak louder than its updates.

The argument here turns on ethics. Academic research is guided by ethics and the ability to establish trust. The American Sociological Association (ASA) has a five-point Code of Ethics that implores sociologists “to aspire to the highest possible standards of conduct in research, teaching, practice, and service.”

It’s common practice for sociologists to secure actual permission from those involved in their studies. In that light, Facebook’s check-box, implied-consent model looks silly.

"I do think this whole incident will cause a lot of rethinking" about the relationship between business and academic researchers, Susan T. Fiske, the editor for the Facebook study and a professor of psychology and public affairs at Princeton University, told the Wall Street Journal.

The spotlight here is on Facebook, but don’t be fooled into thinking it is operating solo. Yahoo, Microsoft, Twitter, and Google, among others, are doing their own research on their own mega data sets. All have been trusted via terms and conditions in the past, and some have proven to violate that trust. All these data sets hold great power, and great responsibility, for their stewards.

In addition to the ethics question, you have to look at the morals of Facebook’s actions. There is not a single healthy relationship on the planet based on the twisting, turning and manipulation of the other participant’s emotions. So it’s hard to believe Facebook, which is in the relationship business, didn't see this coming.  

“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product,” Facebook researcher Adam Kramer, who holds a doctorate in social psychology, said in his blog "apology" for the emotions experiment. “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.” 

That’s an understandable emotion for a company fearing opt-out.

It also raises the question: what was Facebook going to do with that knowledge, if anything? Send a box of Kleenex to the Facebook lonely hearts who don’t have awe-inspiring feeds? Tag as bullies those who do?

At some point, in both real and virtual life, you have to leave the nest and face the world, where things aren’t always pretty. Not even Facebook can make that go away by segmenting and tuning users for special purposes or outcomes.

The worrisome part is that Facebook's findings hint at an ability to manipulate people's emotions, making them a bit happier or a bit more melancholy. With that ability used at key times, what can you influence: a conversation, a movement, a government, a corporate IPO, a competitor, an election, or perhaps just a Privacy Policy update?

Could Facebook deliver the marketer’s version of Raymond Shaw? An unnatural-born consumer. The voter’s version? The retailer’s version? The Zuck version?

Are we talking sentiment analysis or sentiment manipulation? The latter gets you to the desired conclusion more efficiently and more closely aligned with strategy. Raymond Shaw’s platoon mates were brainwashed themselves to regard him as “the kindest, bravest, warmest, most wonderful human being I've ever known in my life.”

Maybe it’s no coincidence that The Manchurian Candidate has its own Facebook page.

What might it mean in the future to ask “Why don't you pass the time by doing a little Facebooking?”
