It was Wednesday night. Russell Holly, managing editor for commerce of our sister site, CNET, was beaming on Facebook about how he was able to get in to see a sneak preview of Dune, the Denis Villeneuve-directed film adaptation of Frank Herbert's 1965 sci-fi epic that is due for a late October release.
His post was accompanied by an iconic image of Kyle MacLachlan and Sting (yes, the lead vocalist from The Police) in the final scene of the 1984 David Lynch/Dino De Laurentiis version, in which they duel with daggers -- to the death.
If you haven't seen the 1984 film, you definitely should, especially if you intend to see the new version. The sets, the cast (including Patrick Stewart, who would later play Captain Picard on Star Trek), and the soundtrack (composed by the band Toto) are fantastic.
While the movie flopped on its initial release, Lynch's film holds a huge place in my heart. I plan to see the new adaptation, but the 1984 film will likely remain the definitive version for me.
I'm sure many other people are as excited as I am about this movie. So I quoted the duel scene in question, in which Sting, playing the charismatic and psychotic Feyd-Rautha Harkonnen, shouts, "I -WILL- kill you." I even put it in quotes so that there was no question I was quoting the film.
I thought nothing of it. I went about the rest of my evening. About an hour later, I was notified by Facebook that I was suspended for three days due to violating Community Standards.
I was shocked. Suspended for quoting a film? Without even using any obscenities? This seems… extreme.
Obviously, I had no intention of killing Russell Holly, envious as I was that he got to see this film months before anyone else. Nor am I in the habit of murdering my editorial colleagues with poisoned daggers, as anyone at ZDNet will tell you.
Misinformation on Facebook can harm your health -- or even kill you -- if taken at face value
Quoting movies doesn't hurt or result in the death of anyone. But do you know what does? Spreading misinformation about vaccines and COVID-19. That absolutely will kill people.
How so? Many high-volume, extremely popular Facebook Pages represent news websites that NewsGuard classifies as "Red" -- meaning they fail its basic standards of credibility and transparency. These Pages spread information about COVID-19, vaccines, masks, 5G, and other health-related topics that is false and outright medically and scientifically inaccurate.
Many of these pages have tens of thousands of followers. When these pages are "liked" by Facebook users, other Pages that publish misinformation about these topics are recommended by Facebook's algorithm, sending users down a never-ending rabbit hole of meme-fueled hoaxes and conspiracies.
The more you click, the more Facebook recommends similar pages.
An algorithm that needs investigation and regulation
It's the algorithm at work -- the very same algorithm that decided quoting Sting's line from Dune is bad and merits a platform suspension, but vaccine and COVID-19 hoaxes are no problemo.
This problem identified by NewsGuard isn't limited to recommending Pages; Facebook also recommends Groups specializing in spreading vaccine and COVID-19 misinformation, some with more than 100,000 members.
Seems harmful to people, right? These Pages and extremist Group communities post harmful misinformation that could result in sickness, hospitalization, and death if followed. Facebook shouldn't just stop recommending them -- it shouldn't be allowing them on its privately run platform, period.
Post misinformation that potentially kills people? You get to stay, thrive, build your community, and spread even more bogus memes that people can die from if they consume and internalize your propaganda. Quote your favorite sci-fi film? Three days in the hole.
Welcome to the toxic reality of Facebook, a social media platform whose aggressive moderation algorithm is clearly out of control, yet which sees nothing wrong with allowing and recommending content from dangerous misinformation actors that put its nearly three billion members in harm's way.