Designed to update the privacy rights of internet users and to ensure organisations are transparent and responsible when handling customers' and clients' personal information, the European Union's General Data Protection Regulation (GDPR) came into force on May 25 last year.
The legislation had been in the works for a number of years, but its introduction into law came as data privacy and consent were already topping the news agenda following episodes such as the Facebook and Cambridge Analytica data scandal.
GDPR was designed to protect EU citizens' data, but the open nature of the web inevitably means it has an impact beyond its own shores. Even companies outside of the EU will often have to comply with the data protection legislation – for example, if they offer goods or services to EU citizens or if they have a branch somewhere within the trading bloc.
This extended reach of GDPR has led to some unexpected outcomes. One example: European internet users trying to visit some US-based news publications have found they can't view the websites – instead being met with pages explaining that the publication doesn't comply with the new legislation and has blocked European visitors.
Some publications eventually found solutions, but a year on from the legislation's introduction, others continue to show only a holding page to European visitors.
But beyond the flood of emails asking for your explicit consent to be marketed to, or the notices you see on websites warning of the presence of third-party cookies, there is a bigger shift taking place.
"To a large extent in the US, most users attribute GDPR with an influx of cookie notifications and see it as an annoyance, rather than what it is: an attempt by regulators to give the consumer a level of visibility and control over what data is being collected about them," says Tim Mackey, senior technical evangelist at Synopsys.
But soon enough, even for businesses that have no involvement with the EU, there may be no hiding from data protection legislation as countries and regions around the world look to implement their own privacy laws, including Brazil, Japan, South Korea, India and others.
One of those is the home of Silicon Valley: California is set to introduce the California Consumer Privacy Act on January 1, 2020.
The legislation appears to have taken cues from GDPR in allowing individuals a greater say over how their personal data is used, but in many ways it doesn't go nearly as far: the law doesn't set a time limit for notifying consumers of a data breach as GDPR does, nor does it carry GDPR-scale fines for non-compliance.
However, even before new data protection legislation is introduced into different parts of the world, GDPR appears to be having some sort of effect on how some of the giants of Silicon Valley operate.
Apple CEO Tim Cook has called for the US to introduce an equivalent to GDPR to prevent data being weaponised against users. Facebook CEO Mark Zuckerberg recently spoke about how privacy will be the future of Facebook – even though he himself admits some may find that hard to believe.
Google also appears to be making changes to the way it operates – and that's despite appealing a €50m fine issued by French data protection authorities after the company was found to be engaging in "forced consent" and lacking a sound legal basis for processing people's data.
The web giant recently announced a new auto-delete feature which automatically deletes location, app and web history after either a three-month or 18-month period as opposed to requiring users to delete data manually.
While only a small step towards additional privacy, it's possible that the introduction of GDPR has helped spur this change on, as companies like Google work to accommodate users becoming more aware about digital privacy.
"One of the outcomes of the Google fine was that Google had to begin making decisions around the structure of data collection and privacy management out of their Irish office and not just California," says Mackey.
If there's one thing GDPR achieved, it is raising awareness of data privacy issues – even if that awareness only emerged after web users were inundated with emails asking for consent to process their data in the run-up to May 25 last year.
But in many ways, it's still only scratching the surface of issues about personal data, privacy and consent, with some quarters now pushing for ethics around information and technology to follow a similar pattern to GDPR.
It's within the realm of artificial intelligence where this could have the most impact, as many AI-based algorithms rely on gathering and analysing vast amounts of data – and it's not always clear where that data came from or whether the individuals it involves have given their consent.
You can put up a sign saying facial recognition technology is deployed in an area, but if that area is an airport and an individual needs to travel through it, it's not clear what option they really have when it comes to refusing consent, aside from turning back and going home – by which point their face may already be on the system, captured as they entered the airport.
"This debate around ethical practices is raging with AI," says Emma Wright, commercial technology partner at law firm Kemp Little.
"AI allows the mass processing and analysis of data. In lots of areas, we're suddenly looking for general counsels to be looking at the ethics of something, not just the legalities of something. It's not how can you behave, it's how should you behave."
It's a complicated area to attempt to regulate for the benefits of consumers, but GDPR can help provide a framework for assessing the ethical implications of crunching personal data.
"We're already starting to see the addition of ethics to the privacy discussion. GDPR provided us with a lot of good approaches to really think about risk assessment and mitigation," says Enza Iannopollo, senior analyst for risk and security at Forrester.
This is especially the case when it comes to new technologies like AI and the Internet of Things, which rely on collecting vast piles of data and seeing how it can be used, rather than collecting data for a defined purpose.
"Because of that, the discussion is moving to ethics and risks around emerging technology. It's all about identifying risk and understanding how am I mitigating this risk – am I ready?" Iannopollo says. "GDPR has triggered this discussion about ethics, how do I tackle it and the values I have as an organisation. This is what we're going to see moving forward."
We've already seen the backlash that can occur when a company is found to be harnessing vast amounts of data for no particular reason. If GDPR-like legislation spreads around the world and starts to regulate the ethics of data collection in addition to consent, many organisations could find themselves forced to answer difficult questions.
"Asking for everything and mining that data needs to be a thing of the past," says Mackey. "If everyone has the data, everyone in the organisation will want it, even if it negatively affects their consumer base."
Restricting that access could be a difficult fight, given the nature of how Silicon Valley operates, its resistance to legislation and the 'move fast and break things' mentality.
"The core challenge is one where the tech companies in Silicon Valley are in a land grab for whatever the new idea might be and to entrench themselves as the dominant player. As they go through this, the velocity of innovation that is part of that entire culture is one that regulators will have a hard time keeping up with," Mackey says.
One particular area where Wright voices concerns is the ethics of personal data collection when it comes to children.
"We still haven't properly addressed how we're creating digital identities for children and how they move away from that. The consent issue and the profiles that are being built by schools, parents and all manner of others – and how we keep a sense of privacy for kids," says Wright.
"I think that's going to start coming to the front: people will start thinking about what this means for behavioural advertising and being able to process huge amounts of data. Nobody really thinks about how you can't remove these things from the web," she adds.
These next stages will come eventually, but in the meantime, governments around the world continue to examine the introduction of GDPR-like data protection legislation to get the ball rolling on data privacy and consent.
GDPR does have its issues, but if there's one thing it has done, it's raised awareness of data privacy to such an extent that countries around the world are now drafting similar laws in an effort to boost the privacy rights of citizens – at least against private companies.
"It will continue to set the gold standard and it will be seen as the practice to emulate," says Wright.