Why Wikipedia block went wrong

newsmaker Internet Watch Foundation chief Peter Robbins talks about self-regulation and the 'Scorpions' blocking incident that disrupted Wikipedia.
Written by Tom Espiner, Contributor

The Internet Watch Foundation (IWF), an organization set up by internet service providers to monitor child sexual abuse Web sites, caused a furore in December when it attempted to block a page on the online collaborative encyclopaedia Wikipedia.

Through a combination of technical factors, people wishing to edit Wikipedia were blocked from doing so, causing an outcry. The image the IWF tried to block was the cover of Virgin Killer, a 1976 album by the German rock band Scorpions.

Peter Robbins, chief executive of the IWF, talked to ZDNet UK about the fallout from the decision to block the page, and whether self-regulation of internet content is effective.

Q: An IWF action over a Scorpions album cover had an adverse effect on Wikipedia, effectively censoring it for some users. Does Internet self-regulation work?
Robbins: That particular incident [affected] the IWF's reputation across the world, and we have been accused of being a self-elected set of individuals set on a particular cause.

We have lots of credible people on our board to give reassurance of independent oversight of what we do. The Wikipedia matter came about after a judgment about an image of a prepubescent girl on the Web site, which we considered to be indecent. A combination of factors led to an unfair level of criticism. We accept there were a number of issues with the technology that didn't work, and those lessons will be taken forward.

What were the issues about the technology?
Issues included the way that Wikipedia was set up, so that the edit facility was frozen, and the way filtering systems were set up at ISPs. Yet all the pressure came to us. Unfortunately the debate became framed around censorship, but that's not our function. We publish a list [of content] that organizations block on a voluntary basis. But that is secondary to our main [activities], which have always been our hotline, and notice and takedown. That's what self-regulation brought about.

Is the IWF blocking effective?
We heard later on that people could get around the [Wikipedia] blocking. Other attempts to evade blocking will be dealt with by the police. Our core business is notice and takedown, but in the U.K. there are very few occasions now where we have to issue notices. It's rare for people in the U.K. to host that content. There are 33 other hotlines around the world that we work with, all doing much the same, trying to deal with child sexual abuse.

What issues are there with people's perceptions of what the IWF does? Surely it's generally agreed that blocking child sexual abuse images is a good thing?
Yes, but the suspicion is about what else is on the [block] list. In the light of the Wikipedia incident, there is a deep suspicion of what's on the list. How do people know it's indecent images of children being blocked and not, say, politically sensitive information? The last thing we want is to have our list compromised by having sites on the list outside of our remit. We don't have a problem with anyone who is legally authorized seeing the content of the list. The list is seen by independent auditors every three years.

Who performs the audit?
The last audit was performed by [LSE forensics expert] Peter Sommer, assistant chief constable Stuart Hyde of the West Midlands police, June Thorburn, professor of social work at the University of East Anglia, and Jim Warnock, head of operations at CEOP [Child Exploitation and Online Protection Centre]. They came in April or May last year and gave us a clean bill of health. The point is, they came in to oversee our systems and processes. Apart from having an independent board of governors, we periodically bring in a group to audit our processes, every three years.

Who sees the list?
The auditors see the list, the 33 hotlines around the world, plus police and law enforcement agencies can have access. The list is also available to over 70 accredited, licensed organizations. We don't give the list to people who we think couldn't properly handle the content, because it is highly sensitive.

How do you decide on the list content?
The block list is dynamic, as sites go up and down all of the time. We don't automate the list. Every site on the list has an assessed image. But we block at different levels--a site such as www.childporn.com would get blocked wholesale, whereas elsewhere a single image can be blocked. We don't want to overblock.

Why did you decide to block the Wikipedia image of the Scorpions album cover?
It's about judgments you make about images. On Wikipedia there was an image of a prepubescent girl with no clothes on posing provocatively, and that fails U.K. guidelines. However, the image has been around for a long time.

If the image was provocative, why did you unblock it?
We didn't want to be arguing about the legality of blocking it if people were proliferating the image, copying and pasting it. We wanted to get back to our core business of notice and takedown, to get to Web sites around the world. Then there were problems with the technology blocking the content of an internet archive--which is what happened with the Wayback Machine.

Yes, that block happened about a month after Wikipedia. How did that happen?
Some of the content on the Wayback Machine internet archive went onto our list, and because of a technical hitch users were denied access to it. We provided the list, but it was the interaction of technologies that broke it. We are not a technological organization. Wikipedia was similar--the site was reading the traffic coming to it, which we would not have known about when we put the page on our list.

Will you change how you assess images in the light of the Wikipedia incident?
We are going to change our systems to take account of the context and history of an image, and whether the content is available on an innocent site. We have learnt lessons from this.
