Le Monde
15 Dec 2023



Among the horrors of the war that broke out on October 7 between Israel and Hamas, one unexpectedly added a dystopian dimension to the conflict: the Israeli army's use of artificial intelligence (AI) to maximize its attacks on the Islamist movement. AI has been presented as a key component of the Israel Defense Forces' targeting tools for its air strike campaigns on the Gaza Strip. The system is called Habsora ("Gospel" in English).

It is hard to know to what extent this unexpected revelation, made in early November, the day after the seven-day truce that led to the release of 110 hostages, was the result of a controlled communication strategy. The press had reported on the concerns of former IDF members about the use of this software, which can propose targets at unprecedented speed by drawing on diverse data. "Artificial intelligence" is often a catch-all term, encompassing a wide range of digital methods in both civilian and military contexts.

Read more: Israel is using AI to calculate bombing targets in Gaza

One thing is certain, according to experts: The scale of the destruction and the number of victims in Gaza are unprecedented, with more than 18,000 people killed, according to the Hamas-run Ministry of Health. "Specialists have been debating this subject for years without reaching a consensus. This war could accelerate some of those debates," said Julien Nocetti, an associate researcher at the French Institute of International Relations (IFRI) specializing in digital conflicts.

Automated weapons today fall into two main categories: fully automated lethal weapons systems, of which there are no real examples on the market, and lethal autonomous weapons (LAWs), which in principle leave humans in control. The vast majority of Western military powers – and Israel, with Habsora – now say they have opted for LAWs and can therefore claim to be on the more defensible side in the use of force.

Laure de Roucy-Rochegonde, also a researcher at IFRI and the author of a thesis on the regulation of autonomous weapons systems, said the specifics of the war between Israel and Hamas could render these blurred categories obsolete and reinvigorate another regulatory concept, that of "significant human control." It is a stricter standard that some human rights activists, including the NGO Article 36, have pushed for without much success. "The problem is that we don't know what kind of algorithm is being used [by the Israeli army], or how the data has been aggregated. It wouldn't be a problem if there wasn't a life-or-death decision at the end of it," said de Roucy-Rochegonde.
