2 new recruits help Israel track Hamas activists in Gaza. They are both AI

While Lavender AI identifies human targets, Gospel marks structures and buildings for attack.

New Delhi:

Reports have emerged that the Israeli military is using advanced artificial intelligence (AI) systems in its bombing campaign in Gaza. The systems, named Lavender and Gospel, have reportedly played a central role in the IDF's targeting strategy, sparking debate over the ethical and legal implications of their deployment.

What is Lavender AI?

Lavender, developed by Israel's elite intelligence division, Unit 8200, serves as an AI-powered database designed to identify potential targets associated with Hamas and Palestinian Islamic Jihad (PIJ). It uses machine learning algorithms to process large amounts of data and pinpoint individuals deemed "junior" operatives within these armed groups.

Lavender initially identified 37,000 Palestinian men as affiliated with Hamas or PIJ, the Israeli-Palestinian publication +972 Magazine and Hebrew-language outlet Local Call reported. The use of AI to identify targets marks a significant shift from the way Israeli intelligence agencies such as the Mossad and Shin Bet have traditionally operated, relying on more labor-intensive human decision-making.

Soldiers reportedly spent as little as 20 seconds on each target identified by Lavender before deciding whether to bomb it, often only long enough to confirm that the target was male. Despite the program's error margin of up to 10 percent, soldiers frequently followed the machine's output unquestioningly; if that error rate held across the roughly 37,000 people initially flagged, thousands could have been wrongly marked. According to the reports, the program at times targeted individuals with minimal or no affiliation with Hamas.

What is Gospel AI?

Gospel is another AI system that automatically generates targets based on algorithmic recommendations. Unlike Lavender, which identifies human targets, Gospel reportedly marks structures and buildings as targets.

"It is a system that allows the preparation of targets at a faster pace through the use of automated tools, and works by improving accurate and high-quality intelligence material according to need. With the help of artificial intelligence, and through the fast and automatic extraction of updated intelligence, it generates a recommendation for the researcher, with the goal that there will be a complete match between the machine's recommendation and the identification made by a person," the IDF said in a statement.

The specific data sources fed into Gospel are not known. However, experts suggest that AI-powered targeting systems typically analyze diverse data sets, including drone imagery, intercepted communications, surveillance data, and behavioral patterns of individuals and groups.

Ethical and legal concerns

The use of Lavender and Gospel in Israel's bombing campaign represents a significant development in the intersection of AI and modern warfare. While these technologies may offer speed and operational efficiency in target identification, their deployment raises serious ethical and legal dilemmas.