Israel officer on AI use in 2021 Gaza offensive: 'Helped break human barrier'


Amid recent reports that the Israeli military used an artificial intelligence-powered tool called Lavender to identify bombing targets in Gaza, a year-old video has surfaced on social media in which an official describes how the country has been using machine learning to identify targets.

Even though the Israel Defence Forces (IDF) have denied that AI is being used to identify suspected terrorists, the official at Israel's cyber intelligence agency detailed how machine learning techniques were used during the 2021 offensive in Gaza, The Guardian reported.

Citing an example of "one of the tools", the official, named 'Colonel Yoav', said, "Let's say we have some terrorists that form a group, and we know only some of them... By practising our data science magic powder we are able to find the rest of them."

The video was shot at a conference at Tel Aviv University in February 2023. Notably, attendees were instructed not to take any photos of the official or record his presentation.

The official, part of Unit 8200, said his unit had used machine learning to find Hamas squad missile commanders and anti-tank missile terrorists in Gaza during the IDF's military operation in May 2021.

"We take the original sub-group, we calculate their close circles, we then calculate relevant features, and at last we rank the results and determine the threshold," The Guardian quoted the intel official as saying.

The colonel said feedback from intelligence officers was used to enrich and improve the algorithm. He, however, underscored that "people of flesh and blood" make the decisions. "These tools are meant to help break the human barrier," the Israeli official further said.

The intel officer said his unit managed to produce more than 200 new targets. Speaking on the benefits of the AI tool, he said, "Suddenly you can react during battle with applied data-science-driven solutions".

The colonel's description bears similarities to the recent revelations made by six Israeli intelligence officials to +972 Magazine and a Hebrew-language media outlet.

The six IDF officials said an AI-based tool called "Lavender", which had a 10% error rate, was used to assist intelligence officers involved in the bombing campaign in Gaza and identify tens of thousands of potential human targets.

The IDF, however, said some of the accounts were "baseless". The IDF denied AI was being used to identify suspected terrorists even though it did not dispute the existence of the tool.

Published By: Abhishek De

Published On: Apr 12, 2024
