Facebook's fake news efforts will fail: Here's why

Facebook has a plan to battle misinformation, but humans love the filter bubble and no algorithm is going to change that fact.

Facebook has outlined its strategy to combat fake news and misinformation, but the effort is likely to flop because new hires, machine learning, and artificial intelligence can only do so much.

After all, we humans just love the comfort of our filter bubbles. And Facebook is designed for the filter bubble.

For those unfamiliar with the filter bubble, a concept cooked up by Eli Pariser and outlined in a 2011 TED Talk, the general idea is that as Web companies personalize their services, we stop being exposed to information that might challenge our world view. These days you could almost swap "filter" for "Facebook."

Enter Facebook's master plan to combat fake news and misinformation: Remove accounts violating policies, reduce fake news and inform the public with additional context.

Meh. Color me pessimistic that Facebook will get very far. Here's why:

Humans like the filter bubble. Who doesn't want a set of friends who always agree with you? As Facebook personalizes your feed to your tastes and you exclude the people on the other side of whatever position you hold, the echo chamber only gets louder. When the history of social media is written, one chapter will revolve around how Facebook scaled the filter bubble. No algorithm, Facebook "editor," or machine learning technology is going to keep humans from seeking out like-minded folks.


Architecture. Facebook is set up to connect friends, even if some of them may be viewed as nutty. Facebook would upend its entire business if it tried to totally eradicate misinformation. Misinformation is part of daily life. It's called gossip. It's called glossing over the truth. It's called flat-out lying. Facebook just enables you to scale the BS.

People sometimes aren't so bright. The most stunning thing about the recent batch of misinformation ads on Facebook, from Russia with love, is that anyone fell for them. I get that news literacy is needed, just like financial literacy (and add data literacy to that mix, too), but it takes only a bit of effort to see how those ads could have moved votes. If an alien race landed on Earth and read Facebook to get a feel for the human race, the death ray might be cocked within about 30 minutes.

Another key question: Is it Facebook's job to make you smarter, teach you news literacy and encourage you to go to source material?

Facebook wants to be a platform when it's also a media company. Hats off to Facebook for trying to do something and avoid being regulated into oblivion in the U.S. and EU, but hiring thousands of people and upping its AI and machine learning game isn't likely to overcome reality. Facebook needs to act more like a media company and curate the information coursing through its platform, and it may even have to rein in free speech. That level of editing is likely to diminish engagement and ad revenue.
