Israel killing “unimportant people.” The IDF has been using artificial intelligence to create target lists that were used by jets, drones and ground forces to kill Palestinian families. You have to read it to believe it. It’s worse than I’m describing.
AMY GOODMAN: The Israeli publications +972 Magazine and Local Call have exposed how the Israeli military used an artificial intelligence system known as “LAVENDER” to develop a kill list in Gaza that included as many as 37,000 Palestinians, who were targeted for assassination with little human oversight.
The report is based in part on interviews with six Israeli intelligence officers who had firsthand involvement with the AI system.
+972 reports, quote, “LAVENDER has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war; in fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine as if it were a human decision.”
A second AI system known as “WHERE’S DADDY?” tracked Palestinian men on the kill list.
It was purposely designed to help Israel target individuals when they were at home at night with their families.
One intelligence officer told the publications, quote: “We were not interested in killing operatives only when they were in a military building or engaged in military activity; on the contrary, the IDF bombed them in homes without hesitation as a first option. It’s much easier to bomb a family’s home.
The system is built to look for them in these situations.”
Today we spend the hour with the Israeli investigative journalist Yuval Abraham, who broke the story for +972 and Local Call. It’s headlined “‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza.”
I spoke with Yuval Abraham yesterday and began by asking him to lay out what he found.
YUVAL ABRAHAM: “Thank you for having me again, Amy. It is a very long piece – it’s 8,000 words – and we divided it into six different steps, each step representing a stage in the highly automated way in which the military has marked targets since October 7th. And the first finding is LAVENDER.
LAVENDER was designed by the military. Its purpose, when it was being designed, was to mark the low-ranking operatives in the Hamas and Islamic Jihad military wings. That was the intention. Israel estimates that there are between 30,000 and 40,000 Hamas operatives – a very large number – and they understood that the only way for them to mark these people was by relying on artificial intelligence.
Now, what sources told me is that after October 7th the military basically made a decision that all of these tens of thousands of people are now people who could potentially be bombed inside their houses – meaning not only killing them, but everybody who was in the building: the children, the families. And they understood that in order to attempt to do that, they were going to have to rely on this AI machine called LAVENDER with very minimal human supervision.
I mean one source said that he felt he was acting as a rubber stamp on the machine’s decisions.
Now, what LAVENDER does is scan information on probably 90% of the population of Gaza – so we’re talking about more than a million people – and it gives each individual a rating between 1 and 100, a rating that is an expression of the likelihood that the machine thinks – based on a list of small features, and we can get to that later – that that individual is a member of the Hamas or Islamic Jihad military wings.
Sources told me that the military knew – because they checked: they took a random sample and checked one by one – that approximately 10% of the people the machine was marking to be killed were not Hamas militants; some of them had a loose connection to Hamas, others had no connection to Hamas at all.
One source described how the machine would bring up people who had the exact same name and nickname as a Hamas operative, or people who had similar communication profiles – these could be civil defense workers or police officers in Gaza – and, again, they implemented minimal supervision over the machine.
One source said that he spent 20 seconds per target before authorizing the bombing of the alleged low-ranking Hamas militant.
Often the target would in fact have been a civilian – and they were killing those people inside their houses.
And I think this reliance on artificial intelligence to mark those targets, and the deadly way in which the officers spoke about how they were using the machine, could very well be part of the reason why, in the first six weeks after October 7th, one of the main characteristics of the policies in place was entire Palestinian families being wiped out inside their houses.
I mean, if you look at UN statistics, more than 50% of the casualties – more than 6,000 people at that time – came from a smaller group of families; it’s an expression of, you know, the family unit being destroyed.
And I think that machine and the way it was used led to that.”
AMY GOODMAN: “You talk about the choosing of targets – the so-called high-value targets, the Hamas commanders, and then the lower-level fighters – and, as you said, many of them in the end were neither. But explain the buildings that were targeted and the bombs that were used to target them.”
YUVAL: “Yeah, it’s a good question. What sources told me is that during those first weeks after October 7th, for the low-ranking militants in Hamas – many of whom were marked by LAVENDER, so we can say alleged militants marked by the machine – there was a predetermined, what they call, ‘collateral damage degree.’ This means that the military’s international law department told these intelligence officers that for each low-ranking target that LAVENDER marks, when bombing that target they were allowed to kill up to 20 civilians – again, for any Hamas operative, regardless of rank, of importance, of age.
One source said that there were also minors being marked – not many of them, but he said that was a possibility, that there was no age limit. And another source said that the limit was up to 15 civilians for the low-ranking militants.
The sources said that for senior commanders in Hamas – commanders of brigades or divisions or battalions – the numbers were, for the first time in the IDF’s history, in the triple digits. So, for example, Ayman Nofal, who was the Hamas commander of the Central Brigade: a source who took part in the strike against that person said that the military authorized killing 300 Palestinian civilians alongside him. We’ve spoken at +972 and Local Call with Palestinians who were witnesses to that strike, and they speak about four quite large residential buildings being bombed on that day – entire apartments filled with families being bombed and killed. And that source told me that this was not some mistake – the number, those 300 civilians, was known beforehand to the Israeli military; sources described that to me, and they said that.
I mean, one source said that during those weeks at the beginning, effectively the principle of proportionality, as they call it under international law, quote, ‘did not exist.’”
AMY GOODMAN: “So there are two programs: there’s LAVENDER and there’s WHERE’S DADDY. How did they even know where these men were, innocent or not?”
YUVAL: “Yeah. So there is this concept, generally in systems of mass surveillance, called linking: when you want to automate these systems, you get, for example, an ID of a person, and you want a computer to be able to very quickly link that ID to other things. And what sources told me is that since everybody in Gaza has a home – a house, or at least that was the case in the past – the system was designed to be able to automatically link individuals to houses. In the majority of cases, these households that are linked to the individuals LAVENDER is marking as low-ranking militants are not places where active military action is taking place, according to sources. Yet programs like WHERE’S DADDY were designed to search for these low-ranking militants when they enter houses; specifically, the system sends an alert to the intelligence officers when these AI-marked suspects enter their houses.
The system was designed in a way that allowed the Israeli military to carry out massive strikes against Palestinians – sometimes militants, sometimes alleged militants, we don’t know – when they were in these spaces, in these houses. And, you know, CNN reported in December that, according to US intelligence assessments, 45% of the munitions Israel dropped on Gaza were unguided, so-called ‘dumb bombs,’ which cause greater damage to civilians.
They destroy the entire structure. And sources said that for these low-ranking operatives in Hamas, they were only using the dumb munitions – meaning they were collapsing the houses on everybody inside. And when you ask intelligence officers why, one explanation they give is that these people were, quote, ‘unimportant’ – they were not important enough from a military perspective for the Israeli army to waste expensive munitions on, meaning more precise, guided bombs that could have maybe taken out just a particular floor in the building.
To me that was very striking, because, you know, you’re dropping a bomb on a house and killing entire families, yet the target you are aiming to assassinate by doing so is not considered important enough to, quote, ‘waste an expensive bomb on.’
I think it’s a very rare reflection of sort of the way the Israeli military measures the value of Palestinian lives in relation to expected military gain, which is the principle of proportionality.
One thing that was very clear from all the sources that I spoke with, is that this was psychologically shocking, even for them.
So that’s the combination between LAVENDER and WHERE’S DADDY.
The LAVENDER lists are fed into WHERE’S DADDY, and these systems track the suspects and wait for the moments that they enter houses – usually family houses, or households where no military action takes place. These houses are bombed using unguided munitions. This was a main characteristic of Israeli policy in Gaza, at least for the first weeks.”
AMY GOODMAN: “You write that they said they didn’t have as many smart bombs – they were more expensive, so they didn’t want to waste them – so they used the dumb bombs, which kill so many more.”
YUVAL: “Yeah, exactly, that’s what they said. But then I ask: if the person is not important enough for you to waste ammunition on, why are you willing to kill 15 civilians – a family?”
AMY GOODMAN: “Yuval Abraham, I wanted to read from the Israeli military’s statement – the IDF statement – in response to your report. They say, quote: ‘The process of identifying military targets in the IDF consists of various types of tools and methods, including information management tools, which are used in order to help the intelligence analysts gather and optimally analyze the intelligence obtained from a variety of sources. Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist.
Information systems are merely tools for analysts in the target identification process.’ Again, that’s the IDF response. Yuval Abraham, your response?”
YUVAL: “I read this response to some of the sources, and they said that they’re lying, that it’s not true.
I was surprised that they were so blatant in saying something that is false.
I think this can very easily be disproven, because a senior-ranking Israeli military official – the head of Unit 8200’s AI center – gave a public lecture last year, in 2023, at Tel Aviv University.
You can Google it – anybody who’s listening to us can – where he spoke about, and I’m quoting him from that lecture, an AI system that the Israeli military used in 2021 to find terrorists.
That’s what he said on record. I have the presentation slides showing how the system rates people – and then we get a comment from the IDF spokesperson saying, ‘We do not have a system that uses AI.’
The commander of Unit 8200 wrote a book, published in 2021, titled “The Human-Machine Team,” about how synergy between artificial intelligence and human beings can revolutionize the world; and in the book he talks about how militaries should rely on artificial intelligence to ‘solve the problem of the human bottleneck in creating new targets and in the decision-making process to approve new targets.’
He said that no matter how many intelligence officers you have tasked with producing targets during the war, they still will not be able to produce enough targets per day; and he gives a guide in that book as to how to build these AI systems.
Now, I want to emphasize: he writes in the book very clearly that these systems are not supposed to replace human judgment – he calls it ‘mutual learning’ between humans and artificial intelligence.
And the IDF still maintains this – they say it is intelligence officers who look at the results and make the decisions.
From what I heard from numerous sources, after October 7th that stopped being the case – at least in some parts of the IDF, where, again, Amy, as I said before, sources were told that if they checked that the target is a male, they could accept LAVENDER’s recommendations without thoroughly looking at them, without checking why the machine made the decision that it made.
I think – and I felt this also with the previous piece that I wrote, ‘A Mass Assassination Factory,’ which spoke about another Israeli AI machine called ‘THE GOSPEL’ – they felt a need to share this information with the world, out of a sense that people are not getting it.
You know, people are hearing the military spokesperson and all of these narratives that we’ve been hearing for the past six months, and these do not reflect the reality on the ground.
I really believe there’s a looming attack now on Rafah.
These systems could be used there again to kill Palestinians in massive numbers.