In April 2024, the left-wing Israeli news magazine +972 revealed just one of the processes by which the Israeli military has been facilitating large-scale killing in Gaza. It involves an AI program code-named “Lavender” that gives IDF officers recommendations on whom to target. After feeding endless lists of Gaza’s residents into this program (which, like other AI models, is known to produce hallucinations and outright falsehoods), officers are ordered to treat its output as unassailable fact. And what is that output? A “kill list” of Palestinians, ranked by AI.
Lavender’s database of Gazans is sorted by the program’s estimate, on a scale of 1 to 100, of how likely each person is to be a member of Hamas or Palestinian Islamic Jihad, the armed groups Israel says are the targets of its war on Gaza. These rankings, conjured up by a notoriously error-prone technology, reportedly take into account residents’ WhatsApp records, phone number changes, and how frequently they move between addresses. Lavender is nowhere near perfect: it is known to falsely identify civilians as militants at least 10 percent of the time.
Despite this, according to +972, Israeli officers take an average of only 20 seconds to review an AI-identified target before authorizing a strike on their residence. That is enough time to confirm that the target is a man, but not to do any other fact-checking. Given that 10 percent of these targets are false positives, this practice represents willful negligence of the highest order and reveals a disgusting disregard for human life. Any man with a high enough Lavender ranking is treated as a confirmed militant and deemed deserving of death by airstrike, a sentence that falls not only on him but on his family and neighbors.
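To make the scale of that error rate concrete, here is a minimal, purely hypothetical sketch in Python of a threshold-based scoring list. It is not Lavender’s actual code: the only figure drawn from the +972 reporting is the roughly 10 percent false-positive rate, while the 1-to-100 scale is as described above and the threshold, names, and list size are illustrative assumptions.

    # Purely hypothetical sketch, not Lavender's actual implementation. Only the
    # ~10 percent false-positive rate comes from the +972 reporting; the 1-100
    # scale is described in the article, and all other numbers are assumptions.

    def flag_targets(scores: dict[str, int], threshold: int = 80) -> list[str]:
        """Return everyone whose 1-100 risk score meets a hypothetical cutoff."""
        return [person for person, score in scores.items() if score >= threshold]

    def expected_false_positives(num_flagged: int, error_rate: float = 0.10) -> float:
        """Civilians expected to be wrongly flagged at the reported ~10% error rate."""
        return num_flagged * error_rate

    # Made-up scores to show the thresholding step.
    scores = {"person_a": 92, "person_b": 47, "person_c": 85}
    print(flag_targets(scores))              # -> ['person_a', 'person_c']

    # Hypothetical list size: flagging 30,000 people at a 10 percent error rate
    # implies roughly 3,000 civilians wrongly marked as militants.
    print(expected_false_positives(30_000))  # -> 3000.0

Even under these illustrative assumptions, the arithmetic is stark: a fixed error rate applied to tens of thousands of names means thousands of misidentified civilians.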
https://bpr.studentorg.berkeley.edu/2026/02/13/lavender-ai-palantir-and-the-israelification-of-homeland-security/