Gaza war: artificial intelligence is changing the speed of targeting and scale of civilian harm in unprecedented ways
Experts describe the Israeli campaign in the Gaza Strip as one of the most "relentless and deadly" in recent history, and it is being directed in part by artificial intelligence (AI).
AI is being used across the entire targeting process, from identifying and prioritizing targets to assigning the weapons to be used against them.
The AI systems Israel employs are trained to recognize traits believed to characterize people associated with Hamas's military wing, such as belonging to the same WhatsApp group as a known militant, or changing phone number and address every few months.
The systems analyze data on virtually all Gazans, allowing the algorithms to "predict" the probability that a given person is a member of Hamas, that a building houses that person, or that the person has entered their home.
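To make the mechanism concrete, here is a minimal sketch of trait-based suspicion scoring in Python. The real system's features, weights, and threshold are classified, so every name and number below is an illustrative assumption, not a description of Lavender itself.

```python
# Hypothetical sketch of trait-based suspect scoring.
# Feature names, weights, and threshold are illustrative assumptions;
# the real system's criteria are classified.
from dataclasses import dataclass

@dataclass
class PersonProfile:
    shares_whatsapp_group_with_militant: bool
    changed_phone_number_recently: bool
    changed_address_recently: bool

# Assumed weights: each observed trait adds to a suspicion score.
WEIGHTS = {
    "shares_whatsapp_group_with_militant": 0.5,
    "changed_phone_number_recently": 0.3,
    "changed_address_recently": 0.3,
}
THRESHOLD = 0.7  # hypothetical cut-off for flagging someone as a target

def suspicion_score(p: PersonProfile) -> float:
    score = 0.0
    for trait, weight in WEIGHTS.items():
        if getattr(p, trait):
            score += weight
    return min(score, 1.0)

person = PersonProfile(True, True, False)
score = suspicion_score(person)
print(score, score >= THRESHOLD)  # 0.8 True -> flagged under these assumptions
```

The point of the sketch is that the score is built entirely from circumstantial behavioral signals, none of which individually proves anything about the person.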
In this way, the software identifies far more targets than human analysts ever could, reportedly going from about 50 per year to roughly 100 in a single day, while drastically shortening deliberation time.
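Taking the reported figures at face value, the jump in throughput is easy to quantify (the 365-day extrapolation is mine, added for scale):

```python
# Comparing the target-generation rates reported above.
human_rate_per_year = 50   # targets identified per year by human analysts
ai_rate_per_day = 100      # targets identified per day with AI assistance

ai_rate_per_year = ai_rate_per_day * 365
print(f"{ai_rate_per_year:,} targets/year with AI")            # 36,500
print(f"{ai_rate_per_year / human_rate_per_year:.0f}x faster")  # 730x
```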
The Lavender AI decision system identified some 37,000 people as potential human targets. According to the IDF, the system's accuracy rate reached 90%, although the details of how its output was manually verified remain classified.
That 10% error rate matters enormously: this is, in the end, a system used to decide whether those 37,000 people live or die, so the consequences for civilians are potentially devastating.
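The human cost implied by that error rate is simple arithmetic, using only the figures cited above:

```python
# Back-of-the-envelope calculation from the reported figures.
flagged_people = 37_000
reported_accuracy = 0.90  # IDF's own figure; verification details are classified

misidentified = flagged_people * (1 - reported_accuracy)
print(f"~{misidentified:,.0f} people potentially misidentified")  # ~3,700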
One Israeli army officer described how seamlessly decision-making trust was handed over to the AI:
“Because of the scope and magnitude, the protocol was that even if you weren't sure the machine was okay, you knew it was statistically okay. So do it,” he declared.
It should be noted that the commanders who treat the Palestinians of Gaza as targets do not take into account that when homes are destroyed, the population inevitably changes address or passes phones on to loved ones (two of the very conditions the AI treats as signs of Hamas membership). In this way, any civilian who has lived through these situations becomes a suspect in the eyes of the AI program.
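This creates a feedback loop that can be sketched in a few lines. The rates below are invented purely to illustrate the dynamic; no real figures for displacement or re-flagging are public:

```python
# Hypothetical simulation of the displacement feedback loop:
# strikes displace civilians, displaced civilians change address or
# hand off phones, and those behaviors trigger the same flags the
# system scores. Every number here is an assumption for illustration.
displaced_per_strike = 10           # assumed civilians displaced per strike
flag_rate_after_displacement = 0.5  # assumed share who change phone/address

suspects = 100                      # assumed initial pool of flagged targets
for wave in range(1, 4):
    strikes = suspects
    newly_displaced = strikes * displaced_per_strike
    suspects = int(newly_displaced * flag_rate_after_displacement)
    print(f"wave {wave}: {strikes} strikes -> {suspects} new suspects")
# 100 -> 500 -> 2,500 -> 12,500: under these assumptions,
# each wave of strikes widens the pool of "suspects".
```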
The article concludes that AI used in these settings "alters human-machine-human interactions, where those who carry out algorithmic violence simply approve the output generated by the artificial intelligence system, and those who suffer the violence are dehumanized in unprecedented ways."