Israel’s lethal AI

The Israeli military is using artificial intelligence to determine bombing targets with cursory oversight from humans, according to reports from The Guardian and +972 Magazine last week.

The reports cite anonymous Israeli intelligence officials, who say an AI program called Lavender is trained to identify Hamas and Palestinian Islamic Jihad militants as potential bombing targets. The government has reportedly given Israel Defense Forces officers approval to take out anyone identified as a target by Lavender. The tool has been used to order strikes on “thousands” of Palestinian targets, even though Lavender is known to have a roughly 10% error rate. According to The Guardian, the program identified some 37,000 potential targets.
