Kenya: Meta closes east African content moderation hub run by Sama
Summary
Date: 10 January 2023
Location: Kenya
Companies
Sama - Employer, Meta (formerly Facebook) - Client
Affected
Total number of people affected: 200
Migrants & migrant workers: (Number unknown - Location unknown, Internet companies, Gender not reported, Documented migrants)
Issues
Dismissal, Mental health
Response
Response sought: Yes, by Financial Times
Actions taken: Sama offered mental health support to the 3% of staff affected by the cuts for 12 months after their employment ends and reportedly paid undisclosed severance packages. Meta said it was working with "partners during this transition" to ensure there is no impact on its ability to review content.
Source: News outlet
Meta’s east African content moderation hub is shutting down as the social media giant’s third-party contractor moves away from policing harmful content, cutting around 200 staff and leaving several employees without work permits.
The owner of Facebook, WhatsApp and Instagram first contracted Sama in 2017 to assist with labelling data and training its artificial intelligence, hiring around 1,500 employees. But within two years, the Nairobi office was moderating some of the most graphic and harmful material on Meta’s platforms, including beheadings and child abuse.
Sama staff were told on Tuesday morning that the company would focus solely on labelling work — also known as “computer vision data annotation” — which includes positioning animations in augmented reality filters, such as bunny ears.
“The current economic climate requires more efficient and streamlined business operations,” Sama said in a statement encouraging employees to apply for vacancies at its offices in Kenya or Uganda. Some Sama staff rely on work permits to remain in the region.
Sama’s content moderation services will end in March, allowing for a transition period for Meta’s new third-party contractor. Meta will continue to use around 1,500 Sama staff for data labelling.
...
The Nairobi office focused on content generated in the region, including content about the civil conflict in Ethiopia, over which Meta is currently being sued amid claims that posts on its platforms incited violence. Meta’s policies ban hate speech and incitement to violence.
...
“We respect Sama’s decision to exit the content review services it provides to social media platforms. We’ll work with our partners during this transition to ensure there’s no impact on our ability to review content,” Meta added.
Sama is offering mental health support to staff affected by the cuts for 12 months after their employment ends and is paying undisclosed severance packages. Around 3 per cent of Sama staff are affected.
The cuts come as both Sama and Meta are being sued by a former employee, Daniel Motaung, who has accused the companies of failing to provide adequate mental health support for moderators or to fully inform them of the nature of the content they would be reviewing. Motaung also claims that the companies transported workers from poorer regions of Africa, leaving them with no choice but to stay in their employment.
Meta has previously declined to comment directly on the lawsuit.
“We’ve seen the consequences of cut-rate moderation in the Ethiopian war — and just this week in the attack on Brazil’s democracy. These crises were fuelled by social media,” said Cori Crider, a director at Foxglove, who has been supporting Sama content moderators and other Facebook moderators in legal action against the two companies.