‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza (www.972mag.com)
from Stopthatgirl7@lemmy.world to technology@lemmy.world on 03 Apr 2024 13:45
https://lemmy.world/post/13860634

In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”

Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”

#technology


Siegfried@lemmy.world on 03 Apr 2024 18:58 next collapse

Something as irresponsible as letting an AI decide where to drop bombs should be added to the war crimes list

Rentlar@lemmy.ca on 03 Apr 2024 20:17 next collapse

There appear to be a number of systems here.

The previously reported AI system “The Gospel” tracked buildings that presumably housed targets. “Lavender” is a newly revealed AI system that marked people as targets based on characteristics resembling those of known militants. Among other mistakes, it tended to flag non-combatant civilian workers in the Hamas-run government, yet it was understood to have roughly a 90% accuracy rate based on a manually checked sample. The threshold for what counted as a valid target changed from day to day, depending on how many targets higher command wanted. A third system, called “Where’s Daddy?”, tracked the input targets so the army could efficiently find and kill them at home, along with their families, children, and any other uninvolved families who happened to be there. This appears to have been a matter of convenience for the IDF.

Intelligence personnel tended to simply copy and paste the (90% accurate) Lavender output directly into the “Where’s Daddy?” system. Typically the only check prior to strike authorization was confirming that the target was male; this did not account for the fact that the rest of the people in the target building would mostly be women and children, and in some instances the intended target had fled or left the area by the time the strike occurred.


The Nazi Party kept better records and had more oversight of their systematic genocide campaign 80 years ago.

delirious_owl@discuss.online on 06 Apr 2024 17:46 collapse

Looks like the article is not accessible over Tor. Here’s as much of the article as I can paste before hitting Lemmy’s maximum character limit.

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.

By Yuval Abraham | April 3, 2024

In 2021, a book titled “The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World” was released in English under the pen name “Brigadier General Y.S.” In it, the author — a man who we confirmed to be the current commander of the elite Israeli intelligence unit 8200 — makes the case for designing a special machine that could rapidly process massive amounts of data to generate thousands of potential “targets” for military strikes in the heat of a war. Such technology, he writes, would resolve what he described as a “human bottleneck for both locating the new targets and decision-making to approve the targets.”

Such a machine, it turns out, actually exists. A new investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as “Lavender,” unveiled here for the first time. According to six Israeli intelligence officers, who have all served in the army during the current war on the Gaza Strip and had first-hand involvement with the use of AI to generate targets for assassination, Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war. In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine “as if it were a human decision.”

Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets. The sources told +972 and Local Call that, during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants — and their homes — for possible air strikes.

During the early stages of the war, the army gave sweeping approval for officers to adopt Lavender’s kill lists, with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based. One source stated that human personnel often served only as a “rubber stamp” for the machine’s decisions, adding that, normally, they would personally devote only about “20 seconds” to each target before authorizing a bombing — just to make sure the Lavender-marked target is male. This was despite knowing that the system makes what are regarded as “errors” in approximately 10 percent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all.

Moreover, the Israeli army systematically attacked the targeted individuals while they were in their homes — usually at night while their whole families were present — rather than during the course of military activity. According to the sources, this was because, from what they regarded as an intelligence standpoint, it was easier to locate the individuals in their private houses. Additional automated systems, including one called “Where’s Daddy?” also revealed here for the first time, were used specifically to track the targeted individuals and carry out bombings when they had entered their family’s residences.

Palestinians transport the wounded and try to put out a fire after an Israeli airstrike on a house in the Shaboura refugee camp in the city of Rafah, southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90)

The result, as the sources testified, is that thousands of Palestinians — most of them women and children or people who were not involved in the fighting — were wiped out by Israeli airstrikes, especially during