The key to cleaning up the internet is tackling the darknets, not letting censorship in by the back door

The UK government's proposals to block search terms for illegal content aren't just badly thought through; they're dangerous.
Written by Simon Bisson, Contributor

The latest proposals to lock down the UK internet in the name of preventing child pornography are at best a misunderstanding of how the dark side of the internet works, and at worst a basis for a censorship infrastructure that could make the Great Firewall of China look like a leaky sieve.

In an interview with the BBC, Prime Minister David Cameron proposed that search engines should block certain search terms and warn users of the consequences of searching for them.

While that's all very well, it's an approach that's not going to stop the real trade in illegal images — which never touches the big search engines, and hides behind encryption and custom-built networks that Peter Biddle and three other Microsoft engineers christened "darknets" in their 2002 paper. That flaw makes the proposals both misguided and dangerous, as the Open Rights Group notes in its considered response.

The problem facing anyone trying to block child pornography or online drug dealing is that it doesn't happen on the public internet. Online criminals know what they're doing is illegal, and they take elaborate precautions to hide both their locations and their services.

Even back in the late 1990s they were chaining proxy services together to hide their locations. I remember helping to track down someone who was trading very nasty material indeed through a webmail service I was consulting on at the time. We traced them as far as an anonymising proxy hosted in Colombia, and that was where the trail ended; the technology today is far more sophisticated.

Silk Road, an anything-goes marketplace, is an example of the kind of technology darknets use. It's a hidden service tucked away behind a NAT router somewhere, reachable only through the Tor anonymity network. Hidden from the rest of the internet, services like Silk Road aren't indexed by search engines, and are accessible only to those who know the secret address and how to use the technologies they're hidden behind.
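To illustrate how little infrastructure this takes: a Tor hidden service is declared with just two directives in a node's torrc configuration file (the directory and port below are illustrative, not Silk Road's). Tor then generates a .onion hostname that is resolvable only inside the Tor network:

```
# torrc fragment: a minimal sketch of a Tor hidden service.
# Paths and ports here are illustrative examples.
HiddenServiceDir /var/lib/tor/hidden_service/
HiddenServicePort 80 127.0.0.1:8080
```

When Tor starts, it writes the service's .onion hostname and keys into HiddenServiceDir; the web server behind it listens only on localhost, so it never appears in public DNS or in any search index.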

While Silk Road is a publicly known darknet site, there are many, many more that are known only to small groups of trusted individuals, bound to secrecy by the knowledge that what they are doing is illegal. It's on sites like those that illegal images and video are traded and shared, bought and sold.

You won't find them in the web space your ISP gives you, or through searches on Google or Bing. They're squirreled away at the end of a DSL line well beyond the jurisdiction of the UK government, in a country with loose regulations and looser policing. Worse still, some are hosted behind the fast-flux DNS of a botnet, distributed across the unwitting PCs of hundreds or thousands of innocent users.

Stopping the web's bad guys is not a matter of censoring the internet; that's impossible. What's needed instead is international agreement on notice and takedown for illegal content, and on shared intelligence about the servers and services criminals use, with cooperation on shutting down botnets and cybercrime syndicates.

Cooperation vs conscription

Instead of making self-righteous speeches about how ISPs and search engines aren't doing enough, governments need to work with them, and with law enforcement bodies, to put in place the much-needed international frameworks and treaties that would have allowed enquiries to go beyond that open proxy in Colombia, along with the funding those enquiries need.

Nothing beats intelligence-led policing for breaking darknets and shutting down the criminal trade in images and information. So why aren't governments investing in the specialist police units needed to handle such complex enquiries?

Blaming a search engine or an ISP isn't the right approach here — especially if, as reports from last week are anything to go by, the relationship between government and internet companies is at breaking point.

Legislating for the wrong thing in a fit of short-sightedness may seem to be a short cut to compliance, but it's a very dangerous road to start down.

Blocking search terms, even "depraved and disgusting" ones, is a distraction from the real police work that needs to be done: building intelligence networks and effective software tools to expose and shut down darknets and the financial engines behind them.

Instead it's a sop to the elements of the press clamouring for something to be done, something to protect the children, and it fails even to address the underlying issue: the existence of darknets and how they actually operate.

It's a proposal that also puts in place a mechanism for internet censorship, one that could quickly be turned on other terms a future government finds unacceptable: a censorship engine much like those we tell oppressive regimes to take down. How, then, could we continue to claim the moral high ground?

What the UK government should be concentrating on is breaking the financial ties that hold the darknets together. Finding who holds the purse strings is a complex task, but it's a technique that has been proven to work time and time again. It's also well within the capabilities of the powerful surveillance tools that government security agencies have put in place to monitor social connections and financial traffic online as part of their efforts to combat terrorism.

Perhaps then, we can make a modest proposal in the Swiftian sense, and note that this is an opportunity to use the panopticon we've built in PRISM and TEMPORA and all those other codewords for good, letting us see beyond the anonymising proxies, and finally shut down those darknets that lurk deep behind their own encrypted networks. And then we might just leave the ISPs and search engines alone to do what they do best, rather than co-opting them as unwilling engines of law enforcement.
