
Article

20 April 2024

Author:
Arab News

OPT/I: WhatsApp allegedly used to target Palestinians through Israel’s Lavender AI system

"WhatsApp being used to target Palestinians through Israel’s Lavender AI system", 20 April 2024

WhatsApp is allegedly being used to target Palestinians through Israel’s contentious artificial intelligence system, Lavender, which has been linked to the deaths of Palestinian civilians in Gaza, recent reports have revealed.

Earlier this month, Israeli-Palestinian publication +972 Magazine and Hebrew-language outlet Local Call published a report..., exposing the Israeli army’s use of an AI system capable of identifying targets associated with Hamas or Palestinian Islamic Jihad.

This revelation,..., has sparked international outrage, as it suggested Lavender has been used by the military to target and eliminate suspected militants, often resulting in civilian casualties.

In a recent blog post, software engineer and activist Paul Biggar highlighted Lavender’s reliance on WhatsApp.

He pointed out how membership in a WhatsApp group containing a suspected militant can influence Lavender’s identification process, highlighting the pivotal role messaging platforms play in supporting AI targeting systems like Lavender.

“A little-discussed detail in the Lavender AI article is that Israel is killing people based on being in the same WhatsApp group as a suspected militant,” Biggar wrote. “There’s a lot wrong with this.”

He explained that users often find themselves in WhatsApp groups with strangers or mere acquaintances, meaning group membership is a weak signal of any genuine association.

Biggar also suggested that WhatsApp’s parent company, Meta, may be complicit, whether knowingly or unknowingly, in these operations.

He accused Meta of potentially violating international humanitarian law and its own commitments to human rights, raising questions about the privacy and encryption claims of WhatsApp’s messaging service.

The revelation is the latest in a series of what critics perceive as Meta’s attempts to silence pro-Palestinian voices.

Responding to requests for comment, a WhatsApp spokesperson said the company could not verify the accuracy of the report but insisted that “WhatsApp has no backdoors and does not provide bulk information to any government.”