Leaving it to users and businesses to sort out the protection of big personal data has failed, and with real-world consequences coming to light, the pressure will be on for governments to do something. History has shown the problem with this approach: lawmakers often go with the first proposal that comes up for a vote, rather than taking the best approach possible.
If any proof was needed that the data store at the centre of the Facebook and Cambridge Analytica furore is far from an isolated incident, it was delivered in spades on Wednesday.
ZDNet's Zack Whittaker revealed that Localblox, a firm in the US state of Washington, had scraped together 48 million personal profiles from services such as Facebook, LinkedIn, Twitter, and Zillow without user consent. The kicker to the story was that the data was left sitting in an AWS S3 storage bucket without a password, because in 2018, of course it was.
To be sure, there are many, many other data stores sitting in databases around the web full of data that belongs to users, but we only hear about the ones that bad things happen to.
A recent survey by Australian analyst firm Telsyte found that 38 percent of its respondents were "struggling with the ethical use of customer data", and concerns were expressed that customers might have shared information without understanding what they had consented to.
The survey found 53 percent of respondents said they were "willing to share data if there is tangible benefit to their organisation".
That last number is the one that should really set alarm bells ringing for those who care about trying to limit the big data and privacy-smashing monster that has been created over recent years.
If more than half of respondents are comfortable enough admitting on a survey that they share data that does not belong to them, then it is fair to assume the real number is much, much higher.
Somewhere among the rise of social media and the uptake of smartphones, users were trained to disregard dialogs informing them their data would be captured to use a service or app.
As someone who was once involved in the personal data slurping industry, I can attest that the amount of data people will hand over simply to play a silly game on Facebook is breathtaking. Do the right thing and reduce the data you ask users to hand over, and the metric needle is likely to take a dip downward as users see a different dialog than the usual "gain as much data as you can by default" prompt and baulk.
Thanks to the dark interface practices of Facebook, this was life at the start of the decade.
One of the shocking, and brilliant, aspects of the European General Data Protection Regulation set to come into force next month is that it reminds businesses who truly owns the data they have collected -- the users.
While the proverbial big data horse bolted a long time ago -- and the likes of Facebook are continuing their dark ways to ensure the horse stays running, moving users out from under the reach of the GDPR where possible -- there is an opportunity to at least try to restrict its movement.
Even Mark Zuckerberg can see where the puck is heading, as he attempted to steer lawmakers at his recent appearance on Capitol Hill.
It was not a good sign that instead of taking to the Facebook founder with baseball bats, the lawmakers seemed conciliatory, wanting to work in concert with the social network on regulations. This is akin to asking John D Rockefeller or JP Morgan to help draft competition laws -- at best, the laws will be insipid, and at worst, they will entrench Facebook and the tech giants as data bastions into the future.
To give a company that tracks people not registered with its service an opportunity to form policy is an abdication of responsibility by lawmakers, and shows a failure to grasp the gravity of the issue.
Facebook is a behemoth, and it is not going anywhere. This is a company that pulls in $41 billion of yearly revenue, and made $16 billion in profit last year. In a hypothetical world where it was possible to force Facebook to delete all of its data, the personal data copied from it by developers, marketers, and researchers is already out, analysed, and being traded as insights across the globe.
There is little that can be done about getting back the personal data that has already been handed over and is squirrelled away in analytics engines, so the focus needs to be on limiting the damage.
Regulation that returns as much power as possible to users is what we should be looking at, but instead it is far more likely the decision will come down to how much water to put on the wet lettuce used to slap companies that are caught out.
The GDPR is a good start for Europe, but the rest of us need something equivalent or better.
ZDNET'S MONDAY MORNING OPENER
The Monday Morning Opener is our opening salvo for the week in tech. Since we run a global site, this editorial publishes on Monday at 8:00am AEST in Sydney, Australia, which is 6:00pm Eastern Time on Sunday in the US. It is written by a member of ZDNet's global editorial board, which is comprised of our lead editors across Asia, Australia, Europe, and the US.
PREVIOUSLY ON MONDAY MORNING OPENER:
- It's time to tame Big Tech: Here's how we get started
- Enterprises learning to love cloud lock-in too: Is it different this time?
- Improve your cybersecurity strategy: Do these 2 things
- Apple's education event has elephant in the room named Google
- The Raspberry Pi is the feel-good tech success that we all need
- Sayonara to the best phone fingerprint sensor in the business
- 3 ways the 'smart office' will change the future of work
- MWC 2018: Five big questions about the future of smartphones
- Excessively cheesy and tactless Falcon Heavy stunt fills space vacuum
- 3 ways technology will revolutionize transportation in the next decade
- Apple HomePod: Late, and pricey, but this smart speaker could still have one advantage over its rivals