It's only a few weeks before California's strict data law takes effect in January 2020, and other states have similar laws in the pipeline, including proposals to jail executives who fail to protect people's data from a breach. Data is fast becoming a business liability rather than an asset.
I go to a lot of media roundtables organized by computer security companies, and they all say that 100% protection is impossible and that every company has to prepare for a security breach. Yet that assumes the organization knows where all of its data is located, which is unlikely.
Companies constantly make copies of their production databases for many reasons, especially so their developers can test software. These copies are often not stripped of sensitive data, because that data is needed for testing, and they are often used outside the main IT environment, under different access controls. Developer testing setups are an easy entry point for intruders with nefarious intent.
If you don't need to store the data, why collect it, along with the legal liabilities it brings?
Companies have been collecting mountains of personal data, but few of them have figured out what to do with it. They assume data is valuable -- like oil -- but like oil, it becomes a slippery thing to work with, and if you slip up and leak your data, you'll face massive fines and damage to brand reputation.
BIG DATA, BIG LIABILITIES
For the past couple of years, the IT industry has talked about the value to be found in big data -- now it's better described as risky data. Companies will ask themselves, "What's the point of having it, and risking losing it, if we aren't using it?" Boards and investors will likely recommend dumping any data that increases legal liabilities.
That is, unless the lobbyists for Google, Facebook, and other tech companies manage to persuade Washington to pass weak federal data laws that preempt the stricter state laws.
RISKY AD TECH
There is currently a big clash of ideas around consumer data privacy and security. Politicians, the media, and the tech industry are all trying to define the problem and figure out how best to curb harmful practices.
Data breaches by hackers are certainly a big problem, but there's a much bigger issue at play: allowing ad technologies to build massive warehouses of highly sensitive personal data. If the personal data isn't there, there's nothing for hackers to steal -- and ad tech is what makes it all possible.
Ad technology is in danger of being severely restricted. Marketers use people's data to shave a few cents off the cost of selling products and services, yet the exact same data can be used to profile people's beliefs, leaving them exposed to hidden political and ideological manipulation by unknown parties.
WHY DOES YOUR SOAP POWDER NEED TO KNOW SO MUCH ABOUT YOU?
Procter & Gamble used to sell plenty of soap powder without using targeted data. Contextual advertising is very effective, and it doesn't require collecting personal information.
Contextual advertising would let media sites reclaim their readers from the programmatic dashboards that steal them and follow them wherever they go. It would increase advertising revenues for publishers creating original content and allow them to reinvest rather than lay off people.
Over the short term, the winners will be Google and Facebook, because they don't have to sell people's data directly and can afford the costs of complying with any new regulations. They will essentially become vendors of metadata about personal data, sheltering their ad clients from data liabilities because the clients never need to handle the sensitive data themselves.
But over the long term, there are serious problems: trying to persuade people to buy a product is the same as trying to persuade them to buy an idea. Advertising works, and targeted messages work even better, for political and ideological ends. That's why, in my opinion, it's inevitable that societies will place strict limits on ad technologies and on the collection and use of personal data.