
Facebook needs more real solutions to fake news

After banning dedicated fake news sites, Facebook must still explore alternatives.
Written by Ross Rubin, Contributor

Facebook became a primary platform for political influence in 2016.

With Hillary Clinton having been favored by most pollsters to win the presidential election, there has been no shortage of soul-searching and finger-pointing among her backers. One target of the blame game has been Facebook, which Clinton supporters have accused of keeping Trump supporters in an echo chamber of partisan and fake news.

These news items may have been driven by politics or simply by the profit motives of unscrupulous clickbaiters. But whatever the reasons behind them, fake news is a big problem that Facebook needs to address.

Mark Zuckerberg's reaction so far has been mixed. On one hand, noting that 99 percent of the news posted on Facebook is genuine, he has expressed doubt that misleading news items tipped the election (despite how slim the margin of victory was in some states).

On the other hand, he believes the company must do more to ferret out fake news stories, including outright banning dedicated fake news sites. Still, while the Facebook CEO admits that some stories are pure fabrications, others have some elements of truth or are subject to interpretation.

Facebook says it is testing different ways to identify these. Not surprisingly, some of them leverage the kind of artificial intelligence that every major ecosystem provider is investing in these days. Whether driven by AI or not, though, there are a number of ways Facebook might indicate to its users that stories are false -- or at least get them to consider whether they're skewed.

Verified News Sources. Facebook, of course, already verifies major news outlets along with many other entities, but not necessarily the stories that are shared from them. By placing some kind of blue check mark right in the story links, it could help readers gauge the credibility of sources. Facebook could also include ratings of a media outlet or story based on the aggregate political leanings of those who share it. For example, Fox News stories might be shared more often by Republicans than Democrats, and the reverse for MSNBC stories. These behavioral cues could be surfaced via some kind of meter or keyword identification.
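As a rough illustration of how such a meter might work, here is a minimal Python sketch that scores a story from -1 (shared only by left-leaning users) to +1 (shared only by right-leaning users). The share data and labels are entirely hypothetical; a real system would infer leanings far less crudely and at vastly larger scale.

```python
from collections import Counter

# Hypothetical share data: the inferred political leaning of each user
# who shared a given story.
shares_by_story = {
    "story-123": ["republican"] * 80 + ["democrat"] * 20,
    "story-456": ["democrat"] * 55 + ["republican"] * 45,
}

def lean_meter(leanings, left="democrat", right="republican"):
    """Return a score from -1.0 (shared only by the left) to +1.0
    (shared only by the right) based on who shares a story."""
    counts = Counter(leanings)
    total = counts[left] + counts[right]
    if total == 0:
        return 0.0
    return (counts[right] - counts[left]) / total

for story_id, leanings in shares_by_story.items():
    print(story_id, round(lean_meter(leanings), 2))
# story-123 0.6   -> skews heavily toward Republican sharers
# story-456 -0.1  -> roughly balanced
```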

Side-by-Side Alternatives. One of the alleged reasons it was so difficult to debunk fake news stories on Facebook was that their distribution through sharing overwhelmed the truth-telling of long-running debunking sites such as snopes.com. By putting alternative or conflicting viewpoints alongside shared articles, Facebook could offer a more balanced perspective.
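A minimal sketch of the idea, with entirely hypothetical article data: given the lean of the outlet behind a shared story, surface a couple of related pieces from outlets with a different lean.

```python
# Hypothetical catalog of articles on the same topic, tagged by outlet lean.
related_articles = [
    {"outlet": "Outlet A", "lean": "right", "title": "Policy X is a disaster"},
    {"outlet": "Outlet B", "lean": "left", "title": "Policy X is working"},
    {"outlet": "Outlet C", "lean": "center", "title": "What Policy X actually does"},
]

def pick_alternatives(shared_lean, articles, limit=2):
    """Return up to `limit` articles whose outlet lean differs from the
    lean of the shared story, for display alongside it."""
    return [a for a in articles if a["lean"] != shared_lean][:limit]

for alt in pick_alternatives("right", related_articles):
    print(f'{alt["outlet"]}: {alt["title"]}')
```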

Semantic Comment Analysis. Of course, many Facebook users aren't shy about sharing their views. It's often difficult, however, to separate sentiment about a story from sentiment about its source. You may have experienced something like this when a friend has extolled the virtues of a relative who has recently passed on. It is difficult to "Like" the tribute without seeming to like the sad news itself. Facebook finally dealt with this paradox recently by adding other options for reacting to posts, including "Angry" and "Sad." By analyzing comments that link to credible sources debunking a story, Facebook could score stories and remove, or place warnings around, those that reach a certain threshold of dubiousness.
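One naive way to approximate that kind of scoring, assuming a curated list of fact-checking domains, would be to count how many comments on a story link to one of them. Everything below (the domain list, sample comments, and threshold) is illustrative, not anything Facebook has described.

```python
import re

# Hypothetical, curated list of fact-checking domains.
DEBUNKING_DOMAINS = {"snopes.com", "politifact.com", "factcheck.org"}

# Capture the host portion of any URL in a comment.
URL_PATTERN = re.compile(r"https?://(?:www\.)?([^/\s]+)")

def dubiousness_score(comments):
    """Fraction of comments that link to a known fact-checking site.
    A high score suggests the story is being widely debunked."""
    if not comments:
        return 0.0
    flagged = 0
    for comment in comments:
        domains = set(URL_PATTERN.findall(comment))
        if domains & DEBUNKING_DOMAINS:
            flagged += 1
    return flagged / len(comments)

comments = [
    "This is fake, see https://www.snopes.com/fact-check/some-claim/",
    "Can't believe people share this",
    "Debunked here: http://politifact.com/article",
]
score = dubiousness_score(comments)
if score > 0.5:
    print("Flag story for review, score:", round(score, 2))
else:
    print("Dubiousness score:", round(score, 2))
```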

Source Cross-Checking. This one is pretty simple. If, say, a news story reports that Pope Francis has endorsed a candidate, it isn't too difficult to search the web for corroborating stories from other reputable sources, which would obviously pick up on an interaction between two such high-profile figures, the way a skeptical human might. Also, for National Enquirer stories, it should be pretty easy to teach an AI that humans just don't have alien babies. Ever.
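In spirit, the check could look something like the following sketch, which treats a claim as corroborated only when several independent, reputable outlets report it. The outlet list and headline data here are hypothetical stand-ins for what would really be a query against a news index or search API.

```python
# Hypothetical allowlist of reputable outlets and sample headline data.
REPUTABLE_OUTLETS = {"Reuters", "AP", "BBC"}

recent_headlines = [
    {"outlet": "Reuters", "headline": "Pope Francis visits Mexico"},
    {"outlet": "AP", "headline": "Pope Francis addresses climate change"},
    {"outlet": "randomnews.example", "headline": "Pope Francis endorses candidate"},
]

def is_corroborated(claim_keywords, headlines, min_sources=2):
    """Count distinct reputable outlets whose headlines contain all the
    claim's keywords; require at least `min_sources` to treat it as real."""
    sources = {
        h["outlet"]
        for h in headlines
        if h["outlet"] in REPUTABLE_OUTLETS
        and all(k.lower() in h["headline"].lower() for k in claim_keywords)
    }
    return len(sources) >= min_sources

print(is_corroborated(["Pope Francis", "endorses"], recent_headlines))  # False
```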

Virtually any system for ascertaining the truth can be gamed, especially when statements are open to multiple interpretations or when candidates appear to contradict themselves, as they often do. But none will be effective if people insist on believing what they want even in the face of overwhelming evidence to the contrary. As is the case for consumers of any media, credibility ultimately comes down not to what is delivered to the ears, but how it's perceived between them.
