Facebook just can't catch a break -- not that many think it should.
It's been almost a month since the Cambridge Analytica story first broke. Since then, there's been a firehose of bad news about Facebook's privacy and data practices that has called into question whether the tech giant can act with its users' (and its shareholders') best interests at heart.
This week, Facebook admitted that as many as 87 million users may have had their data "improperly shared" with the voter profiling data analytics firm Cambridge Analytica.
Sheryl Sandberg, chief operating officer, has been doing the fire-fighting rounds, trying to disaster-manage the scandal. She did a lengthy interview with NBC News, where she described how the company "should" have come clean to its users about the mass data exfiltration, and "should" have checked that Cambridge Analytica deleted the collected data -- when, in fact, it didn't. And she told BuzzFeed News that the series of crises Facebook has had to handle during the past month, which have cost the company over $80 billion in value, are her "responsibility."
But every time Sandberg sandbags one corner of the company's collapsing reputation, another load-bearing wall crumbles down.
The latest Facebook dust-up landed Thursday when TechCrunch revealed that messages sent by the social network's co-founder, Mark Zuckerberg, and other senior executives had been remotely deleted by the company -- not just from his inbox, but from the inboxes of his recipients. Anyone who had previously received a message from the co-founder appeared, for all intents and purposes, to have been talking to themselves.
Facebook said it was an effort, in the wake of the Sony hack in 2014, to "protect our executives' communications" from breaches by shortening the retention period on their messages.
But it wasn't a one-off incident.
It's the latest in a string of examples of Facebook's lack of regard for its users. BuzzFeed described it best: Facebook has a "two-tier privacy system" that favors its leaders and executives.
The rest of us can, in other words, go to hell.
What's clear is that there's a trend of Facebook and its executives distancing themselves from facing up to their users and taking responsibility for their mistakes. Facebook isn't even trying to get ahead of the story -- or stories, as the scandal keeps getting bigger -- and only acts when it's caught with its hand in the cookie jar. And, even then, the company is only slapping a Band-Aid on to save face amid pressure from governments and shareholders -- the only two things that Facebook is vulnerable to.
What better way to show how little the company cares about its users' privacy than by acting only when it gets caught.
Here's a short list, in case you forgot:
Facebook vice-president Andrew "Boz" Bosworth said in a controversial, highly offensive internal memo that the company's obsession with growth "is justified," even its "questionable contact importing practices" and the "subtle language" it uses to help people stay searchable by friends. "Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools," he wrote.
We know about the since-deleted memo because it leaked.
Boz was widely derided by his colleagues -- close to 3,000 of them -- who responded with anger, frustration, and the "sad" reaction button. Some were more frustrated by the leaks that led to the memo's release in the first place, which we only know because those complaints about leaks were also leaked.
And then, this week, in an interview with Reuters, Zuckerberg would not commit to extending to Facebook's users in the US the protections of the incoming GDPR, the EU's new privacy and data protection law. When asked about the report on a press call the next day, Zuckerberg said he was "somewhat surprised" by it -- because, he said, his "answer was yes."
Then, Facebook was caught out in another privacy brouhaha after users fleeing the site found that the company had retained videos they had deleted or never posted. The company called it a "bug" -- an excuse it took a week to come up with.
Think about it: We likely wouldn't even be here if there wasn't a former Cambridge Analytica employee turned whistleblower in the first place. Sandberg defended the company's decision not to tell its users of the breach two years earlier because "we thought the data had been deleted, that's why."
And what better example of the line in the sand of "us vs. them" than the fact that users are not even able to block Zuckerberg on the site that he himself helped to build.
"Even if he were harassing you... the gods sit on their pedestals," said Jillian York, director for international freedom of expression at the EFF, in a tweet last year.
Facebook's senior management, including Zuckerberg, are so used to having the ball in their court that the last month's string of scandals has sent the company into panic mode.
Now the company is trying to claw back whatever trust its users have left -- which isn't much -- by promising in heartwarming (sarcasm) interviews and meaningful (still sarcasm) statements that it'll change and do better and try harder in the future.
When Zuckerberg testifies to lawmakers next week about how Facebook "values your privacy," don't forget all the times the company's executive suite put themselves first and sold your data down the river.