A masterful edition of Storyville exposed the awful plight of the moderators tasked with purging tech platforms of violent and sexually abusive images
One woman wanted to quit her job as a moderator for an unnamed tech company during training, after hearing descriptions of the content and images she was likely to see. Once she had started, she came across pictures of a six‑year-old girl having terrible things done to her and asked to leave. Her manager told her this was what she had signed up for and sent her back to work. Her story was preceded by footage of testimony before a committee on child abuse images and exploitation by Nicole Wong, then a legal adviser at Google. “We’re doing the best we can,” she said.
We don’t know the name of the woman haunted by images that still make her voice shake when she speaks of them. She is one of tens of thousands of moderators employed by companies in the Philippines, themselves hired by big tech firms, to purge social media platforms of the worst that humanity offers when given the chance. Like the rest of her colleagues, she could speak without risk only anonymously.