Report: Israel used AI to identify bombing targets in Gaza

Israel’s military has been using artificial intelligence to help choose its bombing targets in Gaza, sacrificing accuracy in favor of speed and killing thousands of civilians in the process, according to an investigation by the Israel-based publications +972 Magazine and Local Call.


The system, called Lavender, was developed in the aftermath of Hamas’ October 7th attacks, the report claims. At its peak, Lavender marked 37,000 Palestinians in Gaza as suspected “Hamas militants” and authorized their assassinations.
Israel’s military denied the existence of such a kill list in a statement to +972 and Local Call. A spokesperson told CNN that AI was not being used to identify suspected terrorists but did not dispute the existence of the Lavender system, which the spokesperson described as “merely tools for analysts in the target identification process.” Analysts “must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in IDF directives,” the spokesperson told CNN. The Israel Defense Forces did not immediately respond to The Verge’s request for comment.
In interviews with +972 and Local Call, however, Israeli intelligence officers said they weren’t required to conduct independent examinations of the Lavender targets before bombing them but instead effectively served as “a ‘rubber stamp’ for the machine’s decisions.” In some instances, officers’ only role in the process was determining whether a target was male.
Choosing targets
To build the Lavender system, information on known Hamas and Palestinian Islamic Jihad operatives was fed into a dataset — but, according to one source who worked with the data science team that trained Lavender, so was data on people loosely affiliated with Hamas, such as employees of Gaza’s Internal Security Ministry. “I was bothered by the fact that when Lavender was trained, they used the term ‘Hamas operative’ loosely, and included people who were civil defense workers in the training dataset,” the source told +972.
Lavender was trained to identify “features” associated with Hamas operatives, including being in a WhatsApp group with a known militant, changing cellphones every few months, or changing addresses frequently. That data was then used to rank other Palestinians in Gaza on a 1–100 scale based on how similar they were to the known Hamas operatives in the initial dataset. People who reached a certain threshold were then marked as targets for strikes. That threshold was always changing “because it depends on where you set the bar of what a Hamas operative is,” one military source told +972.
The system had a 90 percent accuracy rate, sources said, meaning that about 10 percent of the people identified as Hamas operatives weren’t members of Hamas’ military wing at all. Some of the people Lavender flagged as targets just happened to have names or nicknames identical to those of known Hamas operatives; others were Hamas operatives’ relatives or people who used phones that had once belonged to a Hamas militant. “Mistakes were treated statistically,” a source who used Lavender told +972. “Because of the scope and magnitude, the protocol was that even if you don’t know for sure that the machine is right, you know statistically that it’s fine. So you go for it.”
Collateral damage
Intelligence officers were given wide latitude when it came to civilian casualties, sources told +972. During the first few weeks of the war, officers were allowed to kill up to 15 or 20 civilians for every lower-level Hamas operative targeted by Lavender; for senior Hamas officials, the military authorized “hundreds” of collateral civilian casualties, the report claims.
Suspected Hamas operatives were also targeted in their homes using a system called “Where’s Daddy?” officers told +972. That system put targets generated by Lavender under ongoing surveillance, tracking them until they reached their homes — at which point, they’d be bombed, often alongside their entire families, officers said. At times, however, officers would bomb homes without verifying that the targets were inside, wiping out scores of civilians in the process. “It happened to me many times that we attacked a house, but the person wasn’t even home,” one source told +972. “The result is that you killed a family for no reason.”
AI-driven warfare
Mona Shtaya, a non-resident fellow at the Tahrir Institute for Middle East Policy, told The Verge that the Lavender system is an extension of Israel’s use of surveillance technologies on Palestinians in both the Gaza Strip and the West Bank.
Shtaya, who is based in the West Bank, told The Verge that these tools are particularly troubling in light of reports that Israeli defense startups are hoping to export their battle-tested technology abroad.
Since Israel’s ground offensive in Gaza began, the Israeli military has relied on and developed a host of technologies to identify and target suspected Hamas operatives. In March, The New York Times reported that Israel deployed a mass facial recognition program in the Gaza Strip — creating a database of Palestinians without their knowledge or consent — which the military then used to identify suspected Hamas operatives. In one instance, the facial recognition tool identified Palestinian poet Mosab Abu Toha as a suspected Hamas operative. Abu Toha was detained for two days in an Israeli prison, where he was beaten and interrogated before being returned to Gaza.
Another AI system, called “The Gospel,” was used to mark buildings or structures that Hamas is believed to operate from. According to a +972 and Local Call report from November, The Gospel also contributed to vast numbers of civilian casualties. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target,” a military source told the publications at the time.
“We need to look at this as a continuation of the collective punishment policies that have been weaponized against Palestinians for decades now,” Shtaya said. “We need to make sure that war times are not used to justify the mass surveillance and mass killing of people, especially civilians, in places like Gaza.”