I first called upon businesses to implement "Big Privacy" in 2014. At that time, my main concern was that the law had not kept pace with technology: digital innovators were free to do as they pleased with the data they collected. The data collection landscape, complex even then, looked nothing like it does today.
In just the last year, investment in AI has exploded: the share of early adopters rose from 25 percent to nearly 70 percent (according to the Constellation 2018 AI Study). Big data and AI are infamously providing corporations and governments with the means to know us "better than we know ourselves." Businesses no longer need to survey their customers to work out their product preferences, lifestyles, or even their state of health. Instead, data analytics and machine learning algorithms, fueled by the vast "digital exhaust" we leave behind wherever we go online, are uncovering ever deeper insights about us. Businesses now get to know us automatically, without ever asking an explicit question.
In light of these developments, I re-issued my report, Big Privacy Rises to the Challenge of Big Data and AI, for today's privacy environment.
I fully acknowledge that the fit among big data, artificial intelligence, and data privacy standards is complex. A dynamic balance must be maintained if new digital business models at the intersection of these emerging norms are to ward off regulatory and consumer backlash. However, as communities increase their scrutiny of profits extracted from personal data, I am calling on digital businesses to implement Big Privacy. That is: to exercise restraint with analytics and machine learning, to be transparent about their business models, to offer consumers a fair and transparent trade for data about them, and to innovate in privacy as well as in data mining. Furthermore, businesses must not wait for data protection laws to strengthen; they should be proactive, guided by fast-moving community standards.