"Britain is not a state that is trying to search through everybody's emails and invade their privacy," according to Prime Minister David Cameron. "We just want to ensure that terrorists do not have a safe space in which to communicate."
Later this year the government intends to introduce legislation that will ensure that any form of communication, whether it's an email, text message, or video chat, can always be read by the police or intelligence services if they have a warrant.
Few would disagree with the idea that criminals shouldn't be allowed to plot in secret. But in reality there are huge technical, legal, and moral problems with what the British government wants to do, setting it on a collision course with both the tech industry and privacy campaigners.
"We have always been able, on the authority of the Home Secretary, to sign a warrant and intercept a phone call, a mobile phone call or other media communications, but the question we must ask ourselves is whether, as technology develops, we are content to leave a safe space -- a new means of communication -- for terrorists to communicate with each other. My answer is no, we should not be," the Prime Minister told Parliament recently.
While it might seem straightforward, there's every chance the government will not succeed in delivering such a plan.
Over the last two years, documents revealed by NSA-contractor-turned-whistleblower Edward Snowden have catalogued the scale of NSA and GCHQ snooping on our use of the internet. One small example: the intelligence agencies intercepted millions of webcam images, including sexually explicit ones, regardless of whether the people involved were intelligence targets.
In response, furious tech companies began encrypting traffic - that is, scrambling it so it could not be snooped on - as it travelled over the internet between their servers and their customers.
Such a use of encryption didn't really present a huge problem for spies and police, because companies still have to decrypt the data when it reaches their own servers. They do this in order to sift through their customers' emails and web browsing habits themselves, if only to hit them with more targeted advertising (which is why when you write an email about getting married you might start to see adverts for wedding venues). In this case, all the police have to do is apply for a warrant and they can get access to the messages they want.
But some tech companies - like Apple and the hugely popular WhatsApp messaging service - have gone further by using end-to-end encryption. It's a subtle but vital difference, because it means the company itself never sees the message or has any way of decrypting it. There is never a readable version of the messages on a server somewhere; if police or intelligence agencies demand access, there is simply nothing for the company to hand over.
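The difference can be illustrated with a toy sketch. Here a message is encrypted with a key shared only by the sender and recipient, using a simple XOR one-time pad purely for illustration (real services such as WhatsApp use the far more sophisticated Signal protocol, not this); the point is that the ciphertext is all a relaying server ever holds, so there is nothing readable for it to hand over.

```python
# Toy illustration of end-to-end encryption: only the key-holders can
# recover the message. This XOR one-time pad is NOT a real-world scheme.
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    # XOR each message byte with the corresponding key byte.
    return bytes(m ^ k for m, k in zip(message, key))

decrypt = encrypt  # XOR is its own inverse

# Alice and Bob share the key; the server relaying the message never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(message, key)          # all the server ever stores
assert decrypt(ciphertext, key) == message  # readable only with the key
```

With transport-only encryption, by contrast, the service decrypts the message on arrival at its own servers, which is exactly where a warrant can reach.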
While this sort of security used to be rare, it's now becoming commonplace, used for billions of messages sent using such services. As a result, intelligence chiefs are complaining that important sources of information are 'going dark', making it harder to track criminals and terrorists. In response, the UK government is talking about new legislation to be introduced in the autumn.
There are plenty of reasons why legislation designed to give the government access to any message it wants won't work.
It's deeply unclear how the government will deal with encryption. In the most draconian scenario, the government could ban the use of encryption completely. Encryption underpins everything we do on the internet, so such a ban would, for example, let criminals read your credit card details as you shop online and leave your digitised medical records open to all.
The UK would immediately become the least secure place to do business in the world, and a target for every hacker on the planet. Consequently, a full-on encryption ban is unlikely.
Perhaps the UK government could force tech companies to stop using end-to-end encryption?
That might work for a UK-based service, but for international tech companies, the country is just one modest market among many. Many would either ignore the demand or pull out of the UK completely. Even if the UK government managed to force the big players to conform, there would still be plenty of smaller players who would cheerfully and loudly refuse. In any case, the code for encrypted messaging is freely available online and has been for decades: blocking access to it would be impossible. And there are even more complicated technologies, like steganography, that criminals can resort to if encryption is hampered.
Equally, there would be little to stop someone buying a smartphone abroad with strong encryption switched on. Would HM Customs be required to impound the smartphones of tourists to check if they had it enabled?
All of this likely means any legislation dealing with encrypted messages will struggle to be effective. But there are more troubling questions than the tactical issues of enforcement.
In this demand, the UK is out of step with its partners: Germany has a very different experience of the dangers of state surveillance and as a result positively encourages the use of encryption to protect personal information, even if it does make it harder to catch criminals. The US - where many tech companies are based - has not banned the use of encryption either and is unlikely to do so.
The countries that do place controls on encryption are uncomfortable bedfellows for a democracy. And if tech companies agreed to UK government demands, then other countries - Russia, or China perhaps - would feel emboldened and justified in asking for the same, making dissent even harder.
And while the government may claim a moral imperative, privacy advocates make a compelling case, too.
The United Nations recently concluded that the use of encryption should be encouraged to protect freedom of expression. Speaking earlier this year, Apple CEO Tim Cook made a trenchant defence of the right to privacy: "History has shown us that sacrificing our right to privacy can have dire consequences," he said. "If those of us in positions of responsibility fail to do everything in our power to protect the right of privacy, we risk something far more valuable than money. We risk our way of life."
The use of encryption is for some a political act in itself - an act of solidarity, as security expert Bruce Schneier points out - because if only dissidents use encryption in a country, that country's authorities have an easy way of identifying them. If everyone uses encryption, it's much harder to pick out individuals: "No one can distinguish simple chatting from deeply private conversation. The government can't tell the dissidents from the rest of the population. Every time you use encryption, you're protecting someone who needs to use it to stay alive."
It's entirely understandable that the government wants to give police and intelligence services as many tools as possible to investigate crime and protect us against threats, but a crusade against encryption is almost certain to fail, and would have horrendous consequences if it succeeded.
Even FBI director James Comey admits there is no simple answer here: "It may be that, as a people, we decide the benefits here outweigh the costs and that there is no sensible, technically feasible way to optimize privacy and safety in this particular context, or that public safety folks will be able to do their job well enough in the world of universal strong encryption."
It seems to me that the UK government must realise this too. The overheated rhetoric does nobody any favours and simply makes the government look badly informed, as it so often is on technology issues.
What's more likely than a ban on encryption is the introduction of new powers for law enforcement, enabling them to gain access to encrypted devices by other means: perhaps a legal framework to allow police to hack into encrypted devices might be on the cards, or greater powers to compel individuals to unlock devices or hand over encryption keys. Neither of these will resolve the problem entirely (hacking a device is hard, compelling someone to hand over a password isn't going to work for covert surveillance), but both are more practical than a self-defeating general ban on encryption.
Each new form of communication is more intimate than the last - the Apple Watch will even let you transmit your heartbeat to your loved ones. But each new form of communication also brings with it new opportunities to snoop; pervasive use of technology is undermining our privacy at least as quickly as encryption can protect it again.
The Prime Minister may insist that the state does not wish to invade the privacy of everyone, but the behaviour of the intelligence services and their appetite for mass collection of internet data tells a somewhat different story, and new technologies will make it easier than ever before to collect data on us all. What we need is a proper debate about how and where to draw a line: how to respect privacy and protect society at the same time.