

Article

3 Apr 2024

Author:
Bethan McKernan and Harry Davies, The Guardian

Report reveals Israel used AI to identify thousands of alleged Hamas targets in Gaza

"‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets", 3 April 2024

The Israeli military’s bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.

In addition to talking about their use of the AI system, called Lavender, the intelligence sources claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict.

Their unusually candid testimony provides a rare glimpse into the first-hand experiences of Israeli intelligence officials who have been using machine-learning systems to help identify targets during the six-month war.

“This is unparalleled, in my memory,” said one intelligence officer who used Lavender, adding that they had more faith in a “statistical mechanism” than a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

Another Lavender user questioned whether humans’ role in the selection process was meaningful. “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.”

The testimony from the six intelligence officers, all of whom have been involved in using AI systems to identify Hamas and Palestinian Islamic Jihad (PIJ) targets in the war, was given to the journalist Yuval Abraham for a report published by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call.

...All six said that Lavender had played a central role in the war, processing masses of data to rapidly identify potential “junior” operatives to target. Four of the sources said that, at one stage early in the war, Lavender listed as many as 37,000 Palestinian men who had been linked by the AI system to Hamas or PIJ.

Lavender was developed by the Israel Defense Forces’ elite intelligence division, Unit 8200, which is comparable to the US’s National Security Agency or GCHQ in the UK.

Several of the sources described how, for certain categories of targets, the IDF applied pre-authorised allowances for the estimated number of civilians who could be killed before a strike was authorised.

Two sources said that during the early weeks of the war, they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants. Attacks on such targets were typically carried out using unguided munitions known as “dumb bombs”, the sources said, destroying entire homes and killing all their occupants.

Responding to the publication of the testimonies in +972 and Local Call, the IDF said in a statement that its operations were carried out in accordance with the rules of proportionality under international law. It said dumb bombs are “standard weaponry” that are used by IDF pilots in a manner that ensures “a high level of precision”.

The statement described Lavender as a database used “to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organisations. This is not a list of confirmed military operatives eligible to attack.”


Lavender created a database of tens of thousands of individuals who were marked as predominantly low-ranking members of Hamas’s military wing, the sources added. This was used alongside another AI-based decision support system, called the Gospel, which recommended buildings and structures as targets rather than individuals.

The accounts include first-hand testimony of how intelligence officers worked with Lavender and how the reach of its dragnet could be adjusted. “At its peak, the system managed to generate 37,000 people as potential human targets,” one of the sources said. “But the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is.”

They added: “There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defence personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.”

The IDF’s targeting processes in the most intensive phase of the bombardment were also relaxed, they said. “There was a completely permissive policy regarding the casualties of [bombing] operations,” one source said. “A policy so permissive that in my opinion it had an element of revenge.”

The testimonies published by +972 and Local Call may explain how a western military with such advanced capabilities, and with weapons that can conduct highly surgical strikes, has conducted a war with such a vast human toll.

When it came to targeting low-ranking Hamas and PIJ suspects, they said, the preference was to attack when they were believed to be at home. “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one said. “It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Such a strategy risked higher numbers of civilian casualties, and the sources said the IDF imposed pre-authorised limits on the number of civilians it deemed acceptable to kill in a strike aimed at a single Hamas militant. The ratio was said to have changed over time, and varied according to the seniority of the target.

According to +972 and Local Call, the IDF judged it permissible to kill more than 100 civilians in attacks on top-ranking Hamas officials.

The IDF statement said its procedures “require conducting an individual assessment of the anticipated military advantage and collateral damage expected … The IDF does not carry out strikes when the expected collateral damage from the strike is excessive in relation to the military advantage.” It added: “The IDF outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.”

Experts in international humanitarian law who spoke to the Guardian expressed alarm at accounts of the IDF accepting and pre-authorising collateral damage ratios as high as 20 civilians, particularly for lower-ranking militants. They said militaries must assess proportionality for each individual strike.
