The Facebook Dilemma
Fifteen minutes into The Cleaners, the unsettling documentary about the thousands of anonymous “content moderators” working behind the scenes in third-world countries for Facebook, Instagram, and other social media companies, the filmmakers provide a perfect—if obvious—visual metaphor: they show a young Filipino woman walking through a garbage-strewn Manila slum as children pick through a trash heap. “My mom always told me that if I don’t study well, I’ll end up a scavenger,” she says. “All they do is pick up garbage. They rely on garbage. It’s the only livelihood they know…. I was afraid of ending up here, picking up garbage. It was one of the reasons that drove me to study well.” Instead, studying well landed her in a cubicle in an obscure office building, picking through the detritus of human behavior—the photos of child sexual exploitation, the calls to murder, the suicide videos, 25,000 items a day, with an allowance of only three errors per month before getting sacked—and deciding in an instant what should be deleted and what can stay.
“I’ve seen hundreds of beheadings in my complete career for content moderation,” a nameless young man says. “Not only pictures, even the videos. A two-minute video of the act of beheading.” Another talks of watching someone kill himself online, thinking at first it was a joke. And then there’s a young woman—they are all young—who confesses to having been sexually naive before taking the job: “The most shocking thing that I saw…was a kid sucking a dick inside a cubicle. And the kid was like really naked. It was like a girl around six years of age.” Before she worked as a content moderator, she says, she had never heard the word “dick,” let alone seen one.
To be clear, these were images that had already appeared on social media and had been flagged by users or by Facebook algorithms for possibly violating a site’s “community standards,” a nebulous term that seems to mean “stuff that could get us in trouble with someone,” like a government or a cohort of users.
Those standards, like much that has been created in Silicon Valley, grow out of a hands-off, responsibility-shunning, libertarian ethos. “We’re going to allow people to go right up to the edge [of what’s acceptable] and we’re going to allow other people to respond,” Tim Sparapani, Facebook’s director of public policy from 2009 to 2011, tells the journalist James Jacoby, whose two-part Frontline documentary, The Facebook Dilemma, offers the best background yet for everything we’ve been reading and hearing about the company’s derelictions these past few years. “We had to set up some ground rules,” Sparapani continues. “Basic decency, no nudity, and no violent or hateful speech. And after that we felt some reluctance to…