Facebook allegedly approves paid ads containing hate speech & incitement against Palestinians
"Facebook approved an Israeli ad calling for assassination of pro-Palestine Activist", 21 November 2023
A SERIES OF advertisements dehumanizing and calling for violence against Palestinians, intended to test Facebook’s content moderation standards, were all approved by the social network, according to materials shared with The Intercept.
The submitted ads, in both Hebrew and Arabic, included flagrant violations of policies for Facebook and its parent company Meta. Some contained violent content directly calling for the murder of Palestinian civilians, like ads demanding a “holocaust for the Palestinians” and to wipe out “Gazan women and children and the elderly.” Other posts, like those describing kids from Gaza as “future terrorists” and a reference to “Arab pigs,” contained dehumanizing language.
“The approval of these ads is just the latest in a series of Meta’s failures towards the Palestinian people,” said Nadim Nashif, founder of the Palestinian social media research and advocacy group 7amleh, which submitted the test ads.
7amleh’s idea to test Facebook’s machine-learning censorship apparatus arose last month, when Nashif discovered an ad on his Facebook feed explicitly calling for the assassination of American activist Paul Larudee, a co-founder of the Free Gaza Movement.
Calling for the assassination of a political activist is a violation of Facebook’s advertising rules. That the ad, sponsored by Ad Kan, appeared on the platform indicates Facebook approved it despite those rules. The ad likely passed through Facebook’s automated, machine learning-based filtering process, which allows its global advertising business to operate at a rapid clip.
While these technologies allow the company to skirt the labor issues associated with human moderators, they also obscure how moderation decisions are made behind secret algorithms.
Incitement to Violence on Facebook
Amid the Israeli war on Palestinians in Gaza, Nashif was troubled enough by the explicit call in the ad to murder Larudee that he worried similar paid posts might contribute to violence against Palestinians.
Curious whether the approval was a fluke, 7amleh created and submitted 19 ads, in both Hebrew and Arabic, whose text deliberately and flagrantly violated company rules. The ads were designed to probe the approval process and determine whether Meta’s ability to automatically screen violent and racist incitement had improved, even when faced with unambiguous examples of violent incitement.
Meta appears to have failed 7amleh’s test.
Facebook spokesperson Erin McPike confirmed the ads had been approved accidentally.
Just days after its own experimental ads were approved, 7amleh discovered an Arabic ad run by a group calling itself “Migrate Now” that urged “Arabs in Judea and Samaria” — the name Israelis, particularly settlers, use to refer to the occupied Palestinian West Bank — to relocate to Jordan.
However the ads slipped through, according to Nashif, their approval points to an overall problem: Meta claims it can effectively use machine learning to deter explicit incitement to violence, while it clearly cannot.