

According to an investigation published on Wednesday, April 3, by Israeli media, the Israeli army used artificial intelligence (AI) software to designate "up to 37,000" of the Gaza Strip's 2.3 million inhabitants as Hamas or Islamic Jihad fighters or militants. The majority of these people, along with their families, were then targeted by aerial bombardments during the first weeks of the war, which began on October 7, 2023, following the terrorist attacks on Israeli territory.
"The Israeli army has developed a program based on artificial intelligence, known as Lavender (...), to generate targets for assassination," claimed +972 magazine and news site Local Call. This software "played a central role in the unprecedented bombing of Palestinians," wrote the author of the investigation, Israeli journalist and filmmaker Yuval Abraham, who was behind the first revelations in November 2023 about the Israeli army's use of AI.
According to the investigation, Lavender was designed to identify any Hamas or Islamic Jihad fighter or militant, regardless of their rank in the military chain of command, by processing "massive quantities" of data: "Visual information, cell phones, social media connections, battlefield information, telephone contacts, photos." Until then, this identification had been carried out by humans and, for lack of resources, had been limited to high-ranking commanders.
However, several witnesses interviewed said that Lavender is only 90% reliable. In other words, up to 10% of the people identified by the software – and therefore liable to be bombed – could have been wrongly identified. "In wartime, we don't have time to incriminate every target. So we're prepared to take the margin of error of using AI, risking collateral damage and civilian deaths [...] and live with it," said a source within Israeli intelligence quoted by the journalist.
An acknowledged number of collateral victims
Even more disturbing is the fact that Israel uses another software program, cynically named "Where's Daddy?", to locate the targets identified by Lavender. With this program, the army can determine when suspected Palestinian fighters are in their homes and then bomb them while they sleep, to maximize the chances of killing them. "You enter hundreds [of targets] into the system and wait to see who you can kill," explained a source quoted in the investigation, describing it as "hunting at large."
According to the article, it was also decided at the start of the war that the deaths of 15 to 20 civilians were "acceptable" during a bombardment if a target had been located by the system. The number of collateral victims deemed acceptable by the Israeli army reportedly rose to 100, or even 300, for the highest-ranking figures within Hamas. This policy could help explain the very high Palestinian death toll – more than 32,000 in six months, according to the Hamas-run Ministry of Health.