Anonymous ID: 6372b7 Feb. 20, 2026, 12:36 p.m. No.24284363 >>4385

>>24284014

Maybe this?

In April 2024, the left-wing Israeli news magazine +972 revealed one of the processes by which the Israeli military has been facilitating large-scale killing in Gaza. It centers on an AI program code-named "Lavender" that recommends bombing targets to IDF officers. After feeding data on vast numbers of Gaza's residents into this program (which, like other AI models, is known to produce hallucinations and outright falsehoods), officers are instructed to treat its output as unassailable fact. And what is that output? A "kill list" of Palestinians, ranked by AI.

 

Lavender's database of Gazans is sorted by the program's estimate, on a scale of 1 to 100, of each person's likelihood of being a member of Hamas or Palestinian Islamic Jihad, the armed groups Israel has said are the targets of its war on Gaza. These rankings, conjured up by a notoriously error-prone technology, reportedly take into account residents' WhatsApp records, phone number changes, and how often they move between addresses. But Lavender is nowhere near perfect: it is known to falsely identify civilians as militants in at least 10 percent of cases.
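
To make the scale problem concrete: nothing about Lavender's internals is public, but any system like this boils down to a suspicion score plus a cutoff, and at population scale even a modest error rate marks civilians in large absolute numbers. A minimal Python sketch, where the population size, base rate, threshold, and scoring function are all hypothetical assumptions of mine, not reporting:

# Hypothetical sketch of a score-and-threshold classifier at population
# scale. Nothing here reflects Lavender's actual internals; every number
# below is an illustrative assumption.
import random

random.seed(0)

POPULATION = 1_000_000      # assumed number of scored residents
MILITANT_BASE_RATE = 0.02   # assumed fraction with an actual affiliation
THRESHOLD = 90              # assumed cutoff above which a person is "marked"

def suspicion_score(is_militant: bool) -> int:
    """Toy 1-100 scorer: militants skew high, civilians low, with overlap."""
    mean = 85 if is_militant else 40
    return max(1, min(100, round(random.gauss(mean, 20))))

marked = 0
civilians_marked = 0
for _ in range(POPULATION):
    is_militant = random.random() < MILITANT_BASE_RATE
    if suspicion_score(is_militant) >= THRESHOLD:
        marked += 1
        if not is_militant:
            civilians_marked += 1

print(f"marked: {marked:,}, civilians among them: {civilians_marked:,} "
      f"({civilians_marked / marked:.0%})")

The point of the toy numbers: because civilians vastly outnumber actual militants, even a small per-person error rate can mean a large share of the marked list is civilian.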

 

Despite this, according to +972, Israeli officers take an average of only 20 seconds to review an AI-identified target before authorizing a strike on their residence. That is enough time to determine whether the target is a man, but not to do any other fact-checking. Considering that roughly 10 percent of these targets are false positives, this practice represents willful negligence of the highest order and reveals a disgusting disregard for human life. Every male with a high enough Lavender ranking is treated as a confirmed militant and thus deemed deserving of death by airstrike, a sentence that falls not only on him but on his family and neighbors.
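
The throughput arithmetic is worth spelling out. Taking the two figures +972 reports at face value, and assuming (my assumption, not the report's) one officer clearing targets back to back:

# Back-of-the-envelope arithmetic using only the two figures +972 reports:
# ~20 seconds of human review per target and a ~10% false-positive rate.
# The "per officer per hour" framing is my extrapolation.
SECONDS_PER_REVIEW = 20
FALSE_POSITIVE_RATE = 0.10

reviews_per_hour = 3600 / SECONDS_PER_REVIEW             # 180 targets
civilians_per_hour = reviews_per_hour * FALSE_POSITIVE_RATE

print(f"{reviews_per_hour:.0f} targets rubber-stamped per hour, "
      f"roughly {civilians_per_hour:.0f} of them civilians")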

https://bpr.studentorg.berkeley.edu/2026/02/13/lavender-ai-palantir-and-the-israelification-of-homeland-security/

Anonymous ID: 6372b7 Feb. 20, 2026, 12:41 p.m. No.24284385 >>4424

>>24284363

In 2024, Palantir made a deal with Israel "to harness Palantir's advanced technology in support of war-related missions." One piece of that technology we can see today is the Lavender AI system, which assigns Palestinians a numerical score from one to a hundred based on how likely they are to be an enemy. It has a 10% error rate, which both Israel and Palantir have evidently found acceptable.

 

And alongside the Lavender system there is the Habsora AI system (the name means "the Gospel"), a tool created to solve the human problem of running out of targets: Habsora generates up to one hundred targets per day.
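
A hundred targets a day compounds quickly. Carrying Lavender's reported 10% error rate over to Habsora's output stream is my assumption, not something either report states, but it shows what that rate means at this pace:

# How a 100-target-per-day pipeline compounds over time. Applying Lavender's
# reported 10% error rate to Habsora's output is an assumption of mine,
# not a claim from either report.
TARGETS_PER_DAY = 100
ASSUMED_ERROR_RATE = 0.10

for days in (30, 180, 365):
    total = TARGETS_PER_DAY * days
    print(f"{days:>3} days: {total:,} targets, "
          f"~{round(total * ASSUMED_ERROR_RATE):,} likely misidentified")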

 

Both of these AI systems rely on Palantir's advanced surveillance technology to track individuals' phone records, social media activity, and movement patterns. Palantir's tech is also used by the US military and by local police departments. It is the premier tool for mass surveillance.
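
The basic pattern behind this kind of surveillance is data fusion: joining separate feeds into one dossier keyed on a single identifier. The sketch below is the textbook pattern only; Palantir's actual architecture is not public, and every identifier and field name here is invented:

# Generic data-fusion sketch: merging separate surveillance feeds into one
# per-person dossier. This is the textbook pattern, not Palantir's actual
# (non-public) architecture; all identifiers and fields are hypothetical.
from collections import defaultdict

phone_logs   = [{"person_id": "p1", "event": "call", "to": "p7"}]
social_posts = [{"person_id": "p1", "platform": "whatsapp", "group": "g3"}]
movements    = [{"person_id": "p1", "cell_tower": "t12", "hour": 14}]

dossiers = defaultdict(lambda: {"calls": [], "posts": [], "movements": []})
for rec in phone_logs:
    dossiers[rec["person_id"]]["calls"].append(rec)
for rec in social_posts:
    dossiers[rec["person_id"]]["posts"].append(rec)
for rec in movements:
    dossiers[rec["person_id"]]["movements"].append(rec)

print(dossiers["p1"])  # one merged profile per tracked person

Once feeds are keyed on one identifier like this, a scoring model of the kind described above can consume the whole dossier at once.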

https://gregreese.substack.com/p/ai-kill-and-control-system-by-palantir