Article

10 Jan 2023

Authors:
Cristina Criddle and Madhumita Murgia, the Financial Times

Kenya: Meta closes east African content moderation hub run by Sama


Meta’s east African content moderation hub is shutting down as the social media giant’s third-party contractor moves away from policing harmful content, cutting around 200 staff and leaving several employees without work permits.

The owner of Facebook, WhatsApp and Instagram first contracted Sama in 2017 to assist with labelling data and training its artificial intelligence, hiring around 1,500 employees. But within two years, the Nairobi office was moderating some of the most graphic and harmful material on Meta’s platforms, including beheadings and child abuse.

Sama staff were told on Tuesday morning that the company would focus solely on labelling work — also known as “computer vision data annotation” — which includes positioning animations in augmented reality filters, such as bunny ears.

“The current economic climate requires more efficient and streamlined business operations,” Sama said in a statement encouraging employees to apply for vacancies at its offices in Kenya or Uganda. Some Sama staff rely on work permits to remain in the region.

Sama’s content moderation services will end in March, allowing for a transition period for Meta’s new third-party contractor. Meta will continue to employ 1,500 Sama staff for data labelling.

...

The Nairobi office focused on content generated in the region, including material about the civil conflict in Ethiopia, over which Meta is currently being sued amid claims that posts on its platforms incited violence. Meta’s policies ban hate speech and incitement to violence.

...

“We respect Sama’s decision to exit the content review services it provides to social media platforms. We’ll work with our partners during this transition to ensure there’s no impact on our ability to review content,” Meta added.

Sama is offering mental health support to staff affected by the cuts for 12 months after their employment ends and is paying undisclosed severance packages. Around 3 per cent of Sama staff are affected.

The cuts come as both Sama and Meta are being sued by a former employee, Daniel Motaung, who has accused the companies of failing to provide adequate mental health support for moderators or to fully inform them of the nature of the content they would be reviewing. Motaung also claims that the companies brought workers from poorer regions of Africa to Kenya, leaving them with little choice but to remain in their jobs.

Meta has previously declined to comment directly on the lawsuit.

“We’ve seen the consequences of cut-rate moderation in the Ethiopian war — and just this week in the attack on Brazil’s democracy. These crises were fuelled by social media,” said Cori Crider, a director at Foxglove, who has been supporting Sama moderators and other Facebook content moderators in legal action against the two companies.
