Israel’s A.I. Experiments in Gaza War Raise Ethical Concerns
https://www.nytimes.com/2025/04/25/technology/israel-gaza-ai.html
Posted by adasiukevich
4 comments
This is a continuation of Israel treating the occupied Palestinian Territories as their personal war lab. The book “The Palestine Laboratory” documented how Israel developed and tested weapons and surveillance technologies on Palestinians and marketed the results to the global arms industry. They also have no qualms about selling this tech to dictators and drug cartels.
From what I can glean (and I’m not an AI expert), most of the results Lavender produced were hallucinations, gibberish in other words. They widened the net enough to target pretty much whomever they wanted.
They generated lists of potential Hamas members based on contact with known Hamas members: if one went to a certain bakery or cafe, every patron of that bakery made it onto the list with a low probability of being Hamas, and that probability increased with repeated contact. So, bad luck, you went to your local bakery at the same time as a Hamas member once too often? You’re dead, your family is dead, your relatives are dead.
It’s also useful for the IDF in other ways. A Hamas member is having an affair and visiting another building often? Declare it a Hamas HQ, add every male 13 and older who visits there to the database as a potential Hamas member, then bomb it and search for more correlations.
After a while it’s just noise feeding into more noise, while the IDF pretends to be targeting only Hamas and ex-Unit 8200 soldiers put “Responsible AI” on their resumes…
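To make the spiral concrete, here’s a toy sketch in Python of the guilt-by-association loop described above. Everything in it (the data, the names, the single-contact threshold, the scoring) is hypothetical illustration of the general technique, not anything known about Lavender itself: start with one genuinely known member, flag anyone co-located with a listed person, then feed the new list back in as if it were ground truth.

```python
import itertools
from collections import defaultdict

# Toy data: (person, place, day) sightings. Entirely made up;
# "A" plays the one genuinely known member.
sightings = [
    ("A", "bakery", 1), ("B", "bakery", 1),
    ("B", "bakery", 2), ("C", "bakery", 2),
    ("C", "cafe", 3),   ("D", "cafe", 3),
    ("D", "cafe", 4),   ("E", "cafe", 4),
]

def co_occurrences(events):
    """Count how often each pair of people was seen at the same place on the same day."""
    by_event = defaultdict(set)
    for person, place, day in events:
        by_event[(place, day)].add(person)
    counts = defaultdict(int)
    for people in by_event.values():
        for a, b in itertools.combinations(sorted(people), 2):
            counts[(a, b)] += 1
    return counts

def expand(known, counts, threshold=1):
    """One pass of the loop: anyone co-seen with a listed person at least
    `threshold` times gets listed too. Note it reads `known`, the previous
    round's output, as if it were ground truth."""
    flagged = set(known)
    for (a, b), n in counts.items():
        if n >= threshold:
            if a in known:
                flagged.add(b)
            if b in known:
                flagged.add(a)
    return flagged

known = {"A"}
counts = co_occurrences(sightings)
for round_no in range(1, 4):
    known = expand(known, counts)
    print(f"after round {round_no}: {sorted(known)}")
# after round 1: ['A', 'B']
# after round 2: ['A', 'B', 'C']
# after round 3: ['A', 'B', 'C', 'D']
```

Raising the threshold only slows the spread; the structural problem is that each round launders the previous round’s guesses into “known members”, so the list can only grow and no actual evidence ever enters the loop.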
Hmmm, how would a strike on a top Hamas commander kill 125 civilians who wouldn’t be allowed in a sensitive military area?
And how would a strike on a top military commander kill no other militants? Most top military commanders are under heavy armed guard.
The only options I see here are that the top military commander was alone amongst civilians, disguised as a civilian, or that most of the people killed in the strike were not actually civilians.
> Shortly thereafter, Israel listened to Mr. Biari’s calls and tested the A.I. audio tool, which gave an approximate location for where he was making his calls. Using that information, Israel ordered airstrikes to target the area on Oct. 31, 2023, killing Mr. Biari. More than 125 civilians also died in the attack
So,

- the AI tool works (not surprising)
- Israel went ahead even though the collateral damage was VERY high, since he was one of the planners of Oct. 7