United Nations (UN) Secretary-General Antonio Guterres has expressed concern over reports of Israel’s use of artificial intelligence (AI) in its attacks on Gaza.
“I am deeply concerned by reports that the Israeli military’s bombing campaign includes AI as a target identification tool, resulting in high levels of civilian casualties. AI should be used as a force for good to benefit the world; not to contribute to industrial-scale warfare, obfuscating accountability,” Guterres said in a post shared late Friday on the social media platform X.
His words come amid intensifying international scrutiny of Israel’s military campaign, after targeted airstrikes killed several foreign aid workers delivering food to the Palestinian enclave.
A recent investigation by the online news publication +972 Magazine also revealed that the Israeli military used AI to help identify bombing targets in Gaza, citing Israeli intelligence officials involved in the alleged program.
When asked about the allegations, an Israeli military spokesman did not dispute the tool’s existence but denied that AI was used to identify suspects.
Israeli sources say the army was prepared to accept some 20 “civilian casualties” for each of the roughly 37,000 “suspects” whom an artificial intelligence program called “Lavender” identified as “human targets” in attacks on the blockaded Gaza Strip.
Sources from Tel Aviv told the media outlets +972 Magazine and Local Call that “Lavender” analyzed data on about 2.3 million people in Gaza according to opaque criteria and assessed whether each person had ties to Hamas.
A total of six sources stated that the Israeli army relied almost entirely on the program, especially in the early stages of the war, and that the names flagged by “Lavender” were adopted as targets without human review and without applying any special criteria beyond the fact that they were men.
– 37,000 “suspected” Palestinians –
Sources who spoke to +972 said that the concept of a “military target”, which permits killing a target in a private home even when civilians are present in the building and its surroundings, previously covered only high-level military figures, and that after October 7 it was extended to all members of Hamas.
The enormous increase in the number of targets created the need for artificial intelligence, since examining and checking each target individually by hand was no longer possible; the sources state that the system marked close to 37,000 Palestinians as “suspects”.
The sources said that “Lavender” was highly effective at classifying Palestinians and that the process was fully automated.
“We killed thousands of people. We automated everything and did not check each target individually. We bombed the targets as soon as they entered their homes,” the source said, confirming that human oversight of targeting had been eliminated.
One source’s comment that he found it very surprising to be asked to bomb a house in order to kill an unimportant person amounts to an acknowledgment of the Israeli massacre of civilians in Gaza.
– Green light for high-level targets with up to 100 civilian casualties –
The sources stated that up to 20 “civilian casualties” were permitted in strikes against lower-ranking targets, that this number changed repeatedly during the war, and they emphasized that the principle of proportionality was not applied.
For high-level targets, on the other hand, the number of permissible collateral civilian casualties rose to 100.
The sources said they were ordered to bomb wherever they could; one of them said that hysteria gripped senior officials, whose only response was to bomb relentlessly in order to limit Hamas’s capabilities.
A senior soldier identified only by the initial B., who used the “Lavender” program, said that its margin of error is about ten percent and that there was therefore no need for humans to review targets and waste time doing so.
B. stated that there were fewer flagged targets at the beginning, but that as the definition of a Hamas member was broadened, the practice expanded and the number of targets grew. He added that members of the police and civil defense who may have helped Hamas, but who posed no threat to the Israeli army, were also targeted.
“The system has many shortcomings. If the target gave their phone to another person, that person is bombed at home with their entire family. This happened very often. It was one of the most common mistakes Lavender made,” said B.
– Most of those killed are women and children –
The same sources said that a separate piece of software, called “Where’s Daddy?”, tracks thousands of people at a time and notifies the Israeli military when they enter their homes; attacks are then carried out on the basis of this program’s data.
“Let’s say you calculate that there is one Hamas member and ten civilians in the house; usually those ten are women and children. So, absurdly, most of the people you kill are women and children,” said one of the sources.
– Unguided bombs are used to save money –
Sources also said that many civilians were killed because less important targets were struck with ordinary, cheaper unguided bombs rather than precision-guided munitions.
“We usually carried out the attacks with unguided bombs, which meant literally destroying the entire house along with everything in it. The system kept adding new targets,” one of the sources said.
– Artificial intelligence is not used to reduce civilian casualties, but to find more targets –
Speaking to Al Jazeera on the subject, Marc Owen Jones, professor of Middle East Studies and Digital Humanities at Hamad Bin Khalifa University in Qatar, said it was increasingly clear that Israel was using unproven artificial intelligence systems, which had not undergone transparent evaluation, to help make decisions about the lives of civilians.
Jones believes that Israeli officials turned to an artificial intelligence system for target selection in order to avoid moral responsibility.
He emphasized that the purpose of using the program is not to reduce civilian casualties, but to find more targets.
“Even the officials who run the system see AI as a killing machine. It is unlikely that Israel will stop using artificial intelligence in attacks if its allies do not put pressure on it. The situation in Gaza is genocide supported by artificial intelligence. A call for a moratorium on the use of artificial intelligence in warfare is needed,” Jones concluded.
– Habsora –
An earlier investigation, published on December 1, 2023, revealed that an artificial intelligence application called “Habsora” (“The Gospel”), which the Israeli military also used to identify targets in its attacks on the Gaza Strip, was used to deliberately strike civilian infrastructure and to attack automatically generated targets, and that the number of civilians expected to die along with each target was known in advance.
“Habsora” is the artificial intelligence technology Israel uses to target buildings and infrastructure, while “Lavender” is used when targeting people, AA writes.