
Eugene

(62,767 posts)
Thu Apr 4, 2024, 04:35 AM

'Lavender': The AI machine directing Israel's bombing spree in Gaza

Source: +972 Magazine

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.

By Yuval Abraham | April 3, 2024

In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”

Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”

Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.

During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

-snip-

Read more: https://www.972mag.com/lavender-ai-israeli-army-gaza/

________________________________________________

Related:
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets (The Guardian)
Early on in the war, IDF gave clearance to allow 20 civilian deaths for every low-ranking Hamas suspect, intelligence sources said: report (Business Insider)


erronis

(17,174 posts)
1. Was just reading The Guardian article. Chilling
Thu Apr 4, 2024, 06:58 AM
https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai-database-hamas-airstrikes

The Israeli military’s bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.

In addition to talking about their use of the AI system, called Lavender, the intelligence sources claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict.

Their unusually candid testimony provides a rare glimpse into the first-hand experiences of Israeli intelligence officials who have been using machine-learning systems to help identify targets during the six-month war.

Israel’s use of powerful AI systems in its war on Hamas has entered uncharted territory for advanced warfare, raising a host of legal and moral questions, and transforming the relationship between military personnel and machines.

Mosby

(17,637 posts)
2. The IDF response
Thu Apr 4, 2024, 07:01 PM

Some of the claims portrayed in your questions are baseless in fact, while others reflect a flawed understanding of IDF directives and international law. Following the murderous attack by the Hamas terror organization on October 7, the IDF has been operating to dismantle Hamas’ military capabilities.

The Hamas terrorist organization places, as a method of operation, its operatives and military assets in the heart of the civilian population. It makes systematic use of the civilian population as a human shield, and conducts combat from within ostensibly civilian buildings, including residential buildings, hospitals, mosques, schools, and UN facilities. Contrary to Hamas, the IDF is committed to international law and acts accordingly. As such, the IDF directs its strikes only towards military targets and military operatives and carries out strikes in accordance with the rules of proportionality and precautions in attacks. Exceptional incidents undergo thorough examinations and investigations.

The process of identifying military targets in the IDF consists of various types of tools and methods, including information management tools, which are used in order to help the intelligence analysts to gather and optimally analyze the intelligence, obtained from a variety of sources. Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Information systems are merely tools for analysts in the target identification process. According to IDF directives, analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives.

The “system” your questions refer to is not a system, but simply a database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations. This is not a list of confirmed military operatives eligible to attack.

According to international humanitarian law, a person who is identified as a member of an organized armed group (like the Hamas’ military wing), or a person who directly participates in hostilities, is considered a lawful target. This legal rule is reflected in the policy of all law-abiding countries, including the IDF’s legal practice and policy, which did not change during the course of the war.

For each target, IDF procedures require conducting an individual assessment of the anticipated military advantage and collateral damage expected. Such assessments are not made categorically in relation to the approval of individual strikes. The assessment of the collateral damage expected from a strike is based on a variety of assessment methods and intelligence-gathering measures, in order to achieve the most accurate assessment possible, considering the relevant operational circumstances. The IDF does not carry out strikes when the expected collateral damage from the strike is excessive in relation to the military advantage. In accordance with the rules of international law, the assessment of the proportionality of a strike is conducted by the commanders on the basis of all the information available to them before the strike, and naturally not on the basis of its results in hindsight.

As for the manner of carrying out the strikes – the IDF makes various efforts to reduce harm to civilians to the extent feasible in the operational circumstances ruling at the time of the strike.

In this regard, the IDF reviews targets before strikes and chooses the proper munition in accordance with operational and humanitarian considerations, taking into account an assessment of the relevant structural and geographical features of the target, the target’s environment, possible effects on nearby civilians, critical infrastructure in the vicinity, and more. Aerial munitions without an integrated precision-guide kit are standard weaponry in developed militaries worldwide. The IDF uses such munitions while employing onboard aircraft systems to calculate a specific release point to ensure a high level of precision, used by trained pilots. In any event, the clear majority of munitions used in strikes are precision-guided munitions.

The IDF outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.


https://www.theguardian.com/world/2024/apr/03/israel-defence-forces-response-to-claims-about-use-of-lavender-ai-database-in-gaza
