Commentary: Should businesses try to block employees' non-work-related Web access? And does it matter how accurate those filters are?
Next month the Labs will be taking a look at Web filtering -- commonly referred to as "porn blocking" -- software for businesses. I thought it would be interesting to get some perspectives from the readers beforehand.
Media articles about the filtering debate are most often written from the end user's perspective and from a civil liberties angle: employees have a right to privacy, bosses should trust their workers, and the distrust employers show by installing filters is often repaid with reduced morale and loyalty among employees.
Staff can waste just as much time on the phone to friends as they do on personal Web surfing. There are no technological solutions to prevent personal calls or water cooler conversation, but there is a relatively simple technology to prevent, or at least discourage, personal Web surfing. But just because bosses can doesn't mean they should.
On the other hand, many staff abuse the "free" Internet connection at work. Before you download that 2GB pirated copy of the Alien vs Predator movie or spend all day watching live streaming video of the Olympics, ask yourself: would you do this at home if you were paying for the connection?
I have even less sympathy for people who get caught with pornography at work. Admittedly, the cases that come to light every six months or so are the tip of a much larger iceberg, and the staff who lose their jobs are treated harshly for reasons that have more to do with PR than with the rules they broke.
But gee, you don't have to be particularly bright to work out that if you download, store, or share porn around the office and get caught out, you've only got yourself to blame. And it only takes one co-worker with a grudge, or one who doesn't get your sense of humour...
Filtering companies are often accused of having a conservative social agenda, blocking out sites containing information about left-wing politics, gay and lesbian issues, contraception, safe drug use, and so on. Sometimes this is simply a lack of intelligence on the part of the filtering mechanisms -- how can a keyword filter tell the difference between a site that promotes safe drug use and one that just talks about drugs?
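To see why, consider a minimal sketch of a keyword filter. The word list, the sample pages, and the `is_blocked` function are all invented for illustration here, not drawn from any real filtering product:

```python
# Hypothetical keyword list -- real products use far larger lists,
# but the failure mode is the same.
BLOCKED_KEYWORDS = {"drugs", "heroin", "ecstasy"}

def is_blocked(page_text: str) -> bool:
    """Block a page if it contains any banned keyword, regardless of context."""
    words = set(page_text.lower().split())
    return bool(words & BLOCKED_KEYWORDS)

harm_reduction_page = "how to reduce the risks of ecstasy and other drugs"
promotion_page = "where to buy ecstasy and other drugs tonight"

# The filter cannot tell the two apart: both pages are blocked.
print(is_blocked(harm_reduction_page))  # True
print(is_blocked(promotion_page))       # True
```

The keywords match in both cases, so a harm-reduction page is blocked just as readily as a page promoting drug use -- the filter sees words, not intent.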
"You don't have to be particularly bright to work out if you download, store, or share porn around the office and get caught out, you've only got yourself to blame."
Most reviews of these products consist of compiling a list of naughty Web sites -- as well as non-naughty ones that might still get blocked -- and running each product through a script to see how "accurate" they are. In my experience, most products are about the same: reasonably accurate with the occasional false positive or false negative.
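The test scripts those reviews run amount to something like the sketch below. The URL lists and the `filter_blocks` predicate are hypothetical stand-ins for a real product's blocking decision:

```python
# Sketch of the review methodology: run hand-labelled URL lists through a
# filter and count false positives and false negatives.

def filter_blocks(url: str) -> bool:
    # Stand-in for querying the product under test; this crude rule is
    # deliberately over-broad to show how false positives arise.
    return "adult" in url or "health" in url

naughty = ["http://adult.example/a", "http://adult.example/b"]
legitimate = ["http://health.example/sex-ed", "http://news.example/story"]

false_negatives = [u for u in naughty if not filter_blocks(u)]
false_positives = [u for u in legitimate if filter_blocks(u)]

print(f"false negatives: {len(false_negatives)}")  # naughty sites let through
print(f"false positives: {len(false_positives)}")  # innocent sites blocked
```

Run over lists of a few thousand URLs, counts like these are what reviews boil down to when they call one product more "accurate" than another.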
In a business context, does it make a difference? The philosopher Michel Foucault, in his book Discipline and Punish: The Birth of the Prison, discussed the Panopticon, a model prison designed by the utilitarian philosopher Jeremy Bentham. The Panopticon was designed so that guards could see prisoners at all times, but the prisoners could never tell whether the guards were watching them. Foucault argued that if people think they might be observed, they will act at all times as if they are being observed and police their own behaviour. In the context of Web filtering, what matters more than the filtering itself, or how accurate it is, is that staff know it's there.
Are you for or against Web filtering, or can you see both sides? Do you think it matters if the software makes mistakes occasionally? Let us know what you think at firstname.lastname@example.org.
This article was first published in Technology & Business magazine.