Content moderation is what a 21st-century hazardous job looks like
“There are surely some content moderators that haven’t suffered mental health problems connected to the job, but I haven’t met them,” says sociologist and computer scientist Milagros Miceli, who has studied the content moderation industry for the past six years. “I have no doubt that content moderation, like coal mining, is a hazardous job.”...
Content moderators are essentially the security guards of social media. Platforms like Facebook and TikTok task them with removing content that breaches their guidelines. The posts they filter out include hate speech; violent, graphic and pornographic content (including child sexual exploitation); material from proscribed organisations such as terror groups; bullying and harassment; and content promoting suicide and self-harm...
While content moderators do their work for the big social media platforms, they are hired almost exclusively through outsourcing firms, typically known as BPOs (short for ‘business process outsourcing’). A veil of secrecy surrounds this industry...
According to Casilli, the European content moderation industry has become highly concentrated in recent years, with a few big firms buying out rivals and dominating the sector. Teleperformance, Appen and Telus are three of the biggest players. These BPOs organise the work in call centre-style offices where surveillance of workers is intensive and secrecy is a top priority. “Their contracts are extremely strict on non-disclosure agreements; they are really NDAs disguised as work contracts,” explains Casilli...
Another typical feature of the content moderation industry is that its workforce is largely made up of migrants. Casilli is one of the authors of Who Trains the Data for European Artificial Intelligence?, a new study by the EnCOre Initiative on click workers (including content moderators), commissioned by The Left Group in the European Parliament. The researchers held focus groups with content moderators working at the BPO firms Telus and Accenture in Germany (in Berlin and Essen) and at an anonymised BPO firm in Portugal...

From the BPOs to the NDAs to the migrant visas, the big social media platforms are shielded from accountability for the working conditions of their content moderators by layers of deniability, secrecy and marginalisation. But behind these walls of opacity are real people with real stories, and some of them are determined to be heard, despite the barriers they face in speaking out...
The European Commission and Meta were also asked to respond to the EnCOre Initiative study, but did not offer a response for publication...