Bosnia-Herzegovina: Meta allegedly silences Srebrenica genocide content
"Guessing Game: Facebook’s Unpredictable Algorithm Removes Content on Srebrenica Genocide", 12 July 2024
Media outlets in Bosnia that reported on the 29th anniversary of the Srebrenica genocide, committed by Bosnian Serb forces, say they have received warnings that their content violates Facebook’s community standards, and that some of their posts have been removed.
Since the outbreak of the Israel-Hamas war last October, the social media giant, owned by Meta, has reportedly relied heavily on its constantly changing algorithm to suppress voices criticising Israel’s conduct in Gaza.
But the same algorithm, according to experts, is imprecise, especially when dealing with small markets like the one in Bosnia and Herzegovina.
Fedja Kulenovic, a senior assistant at the Department of Information Sciences at Sarajevo’s Faculty of Philosophy, said the reason for the removals is “most probably use of the word ‘genocide’ in social media posts or inside the articles.
“Since they have automatised systems for content removal, my educated guess is that these cases were connected to the questions of Palestine and Israel,” Kulenovic told BIRN.
Meta had not replied to BIRN’s query by the time of publication.
‘We don’t know what was removed’
Posting content that is deemed to violate Facebook’s rules can have far-reaching consequences for small media outlets, which rely on the platform’s sheer scale to reach an audience and attract advertisers.
Repeated instances of content being flagged as false or misleading can result in an outlet’s visibility being reduced, or in it being locked out of the platform altogether.
Sabina Mesic, an editor at the Tuzlarije news site from the northern Bosnian city of Tuzla, said Facebook only sends a notification that a post has been removed from the outlet’s page, without specifying which post.
Several other outlets had a similar experience while reporting on July 11, the annual commemoration of the victims of the 1995 Srebrenica genocide, held in the eastern Bosnian town.
Journalists from NEON TV, based in the town of Kalesija, also reported that content on one of their pages was removed for violating Facebook’s community rules.
The TV station runs two separate Facebook pages, but the same article, which had a cover photo of the “Flower of Srebrenica”, was removed from only one of them. “The article stayed on the other page without any issue, and we can’t explain why,” NEON TV’s Pasalic said.
Both Pasalic and Mesic told BIRN that this was not the first time content had been removed from their pages without explanation or any logical reason.
“We had an article about the head coach of Bosnia’s national football team, and that article was removed for breaking the rules,” Pasalic told BIRN.

“The problem is that when you receive the warnings a few times, there is a possibility that you can be locked out of your page, or that the page itself can be removed,” Mesic said.
As BIRN previously reported, Facebook’s algorithm, a closely guarded secret, determines a post’s reach through a relevancy score based on numerous signals. Inconsistent content moderation, especially in Bosnia, compounds the risks small outlets already face, highlighting the need for local understanding and ethical journalism.
Another BIRN report showed that Facebook’s content moderation faces significant challenges in the Balkans. Despite using a mix of artificial intelligence and human review, Facebook’s tools often fail to assess content accurately in local languages such as Bosnian, Serbian, Montenegrin and Macedonian. While the platform proactively detects and removes a substantial amount of hate speech globally, the specific nuances and context of Balkan languages and cultural issues are frequently overlooked.
This leads to inconsistent moderation, in which harmful content sometimes remains online while benign content is mistakenly removed. The lack of transparency in Facebook’s moderation processes makes matters worse, leaving users uncertain about how the rules are enforced.
Silencing voices from Palestine
In a report from December last year, the watchdog organisation Human Rights Watch, HRW, warned that Meta’s “policies and practices have been silencing voices in support of Palestine and Palestinian human rights on Instagram and Facebook”.
In October and November 2023 alone, Human Rights Watch documented over “1,050 takedowns and other suppression of content on Instagram and Facebook that had been posted by Palestinians and their supporters, including about human rights abuses”. It also noted increased censorship of social media since October 7.
Kulenovic, who often works with media outlets on cases of social media censorship, has noticed the same increase in Bosnia, which he attributes to imprecise algorithms.
The problem with this approach, in which bots are in charge of content moderation, is that “they can tell content, but not context”, he said.
“They [algorithms] can be very useful in moderating content, but when they are not written well, such algorithms remove content even when it is related to factually proven things, such as the genocide in Srebrenica,” Kulenovic said.