Custodians of the Internet, book review: Content moderation under the microscope

How do you moderate content at the scale of YouTube or Facebook? Tarleton Gillespie lays out the options and highlights the impact of 'Web 2.0' on society.
Written by Wendy M Grossman, Contributor

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media • By Tarleton Gillespie • Yale University Press • 288 pages • ISBN: 978-0-300-17313-0 • £25 / $30

Image: Yale University Press

In 2016, Facebook came under fire for, among many other things, deleting an image posted by the Norwegian writer Tom Egeland, then reposted by numerous members of Norway's cabinet and the country's leading newspaper. The image was 'The Terror of War', the historic photo of Kim Phúc, then nine years old, fleeing from a napalm attack in South Vietnam in 1972.

What people have forgotten in the nearly 50 years since the photograph's original publication, Tarleton Gillespie writes in Custodians of the Internet, is that the debates Facebook conducted, first internally and then publicly, about the appropriateness of publishing that particular photograph had already been held among the newspaper editors of 1972. Where, they asked, should one draw the line between the public interest in exposing the horrors of the Vietnam War and the fact that the picture is of a young, naked girl? The decision Facebook made was no different from the decision some editors made in 1972. The difference lies in how the decision was made, by whom, and when.

Twenty years ago, when Lawrence Lessig wanted to write about regulation and the internet, he wrote about the decisions imposed by computer code. In Custodians of the Internet, Gillespie, an affiliated associate professor at Cornell University and a principal researcher at Microsoft Research, considers the often-overlooked role of moderation in social media, whose decisions about what content to allow "shape the shape of public discourse".

As practices such as editorial review, community flagging, and automated detection become widespread, users of these platforms are internalising them as "the way to do things". Yet that leaves us in an unsatisfying situation where we rarely have any insight into why this particular platform made this particular decision. It's only in cases like 'The Terror of War' that we are alerted to what went wrong.


Gillespie explores numerous platforms and cases: Facebook and breastfeeding, Tumblr and porn. There are, of course, many others from the past, such as those discussed in Digital Countercultures. UK readers will also note that the Internet Watch Foundation, the pioneer of public content flagging, makes no appearance, nor do any of the famous pioneers of human moderation such as Slashdot, Television Without Pity, and The WELL.

That's fair enough, in that what Gillespie is trying to isolate is the effect of scale. All three of the latter communities were small enough, and homogeneous enough, to be manageable by humans. A site like YouTube, which receives uploads at a rate of an hour of video per second, cannot rely solely on humans -- especially given that those humans disagree strongly about what should be considered acceptable. Even an automated system with a 7 percent false positive rate, such as the system from the Twitter acquisition that Gillespie highlights, will, when applied to billions of uploads, flag an arguably unmanageable amount of material for manual review by humans, who report being traumatised by the content.
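To make that scale argument concrete, here is a minimal back-of-the-envelope sketch in Python. The 7 percent false positive rate is the figure cited above; the daily upload volume and per-item review time are purely illustrative assumptions, not figures from the book or the review.

    # Rough sketch of the scale argument; volumes are assumptions, not figures from the book.
    FALSE_POSITIVE_RATE = 0.07     # rate cited above: 7% of items wrongly flagged
    DAILY_UPLOADS = 1_000_000_000  # assumption: one billion items uploaded per day
    SECONDS_PER_REVIEW = 30        # assumption: average human review time per item

    # Treats nearly all uploads as benign, so false positives dominate the review queue.
    wrongly_flagged = DAILY_UPLOADS * FALSE_POSITIVE_RATE
    reviewer_hours = wrongly_flagged * SECONDS_PER_REVIEW / 3600

    print(f"Items wrongly flagged per day: {wrongly_flagged:,.0f}")
    print(f"Human review hours needed per day: {reviewer_hours:,.0f}")

On those assumed numbers, the classifier sends 70 million harmless items a day to human reviewers, roughly 583,000 hours of review work per day -- which is the sense in which the workload is 'arguably unmanageable'.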

Gillespie concludes with recommendations for change: increase transparency; distribute both the work of moderation and the agency over it to users and communities; make defences such as blocklists portable between platforms; replace popularity as a metric; and increase the diversity of the staff who design the systems.

All of these are good suggestions. Are they enough? Either way, Gillespie says, we, as well as the platforms themselves, must come up with solutions to the hard problems that 'Web 2.0' has brought.
