
AI and Drones Have Made Killing Easier, Not Less Sinful

Video footage from a drone, released on May 8, 2024, by the Israel Defense Forces, showing the area of Ramyeh in southern Lebanon, where IDF artillery and fighter jets struck more than 20 Hezbollah terror targets. The IDF claims it struck "military structures, and terrorist infrastructure," adding: "During the strikes, secondary explosions were identified, indicating the presence of weapon storage facilities in the area." Via Reuters.

A group of us stood on a hill overlooking northern Gaza this spring, not far from the border fence. We were close enough to see the buildings of Beit Hanoun and Jabalia. After a few minutes of description by our guide, we surveyed the scene with binoculars. On closer inspection, what had appeared to be buildings turned out to be rows of rubble. For months we had viewed such images on screens; seeing the destruction in person, through plumes of smoke and dust, was surreal.

The Ministry of Health in Gaza has reported that nearly 38,000 Palestinians have been killed, and the UN reports that at least 370,000 housing units in Gaza have been damaged, 79,000 of them destroyed completely; the damage accounts for more than 70 percent of the buildings in the Gaza Strip. The level of destruction is unprecedented in recent history.

At the Christ at the Checkpoint conference in Bethlehem in late May, Rev. Munther Isaac, pastor of Christmas Lutheran Church in Bethlehem, said, “In Gaza, they have taken almost everything. But they cannot get inside and take our faith in a just and good God.”

While the mass destruction, loss of life, and use of U.S. weapons in Gaza have received significant (though insufficient) attention, there has been limited scrutiny of the use of artificial intelligence in lethal mass targeting. These developments are ethically and theologically abhorrent, and they set dangerous precedents that will do damage long after the current war on Gaza stops. We must engage in robust ethical and theological reflection on AI and work to develop legal and policy restraints on such technology. While such constraints will not match the call of the gospel of peace, they are critical and necessary practical measures to reduce harm.

Lectionary readings this spring included the meeting of Philip and the Ethiopian eunuch on the road to Gaza and the meeting of the Apostle Peter and Cornelius. In both cases, the good — nonviolent — news of Jesus spread across former divisions and led to restoration and healing. The trajectory of the gospel of peace is one of greater and expanding inclusion and healing, which requires justice and repair. On an opposite trajectory, recent developments in AI targeting rapidly expand identification for destruction.

In Gaza, an Israeli AI program named "Lavender" has identified thousands of targets with little or no meaningful human oversight. Vast lists of possible Hamas-associated individuals have been identified and targeted, along with their families in their homes, "with no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based," according to +972 Magazine. A November report from the same magazine highlighted AI targeting of infrastructure and homes through a program called Habsora ("The Gospel"). More recently, the magazine published a detailed account of the use of both Lavender and Habsora, technology that one former intelligence officer told +972 allows the Israeli army to operate something like a "mass assassination factory."

The technology led to the rapid expansion of targeting, according to the +972 investigation, in which "the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants – and their homes – for possible air strikes." The program uses opaque identity markers to flag possible Hamas-associated individuals and then target them at night, when they and their families are most likely to be home; this included a companion system called "Where's Daddy?," which tracked when targets entered their family homes. Striking people in their homes was not an accident or aberration but calculated and part of the design.

The development of lethal drone and AI targeting capacity, along with the expansion of legal and policy parameters to permit their use with little restraint, has been pioneered by the U.S. and Israel. Constraints were particularly reduced in the wake of the 9/11 attacks on the U.S., according to a recent Foreign Affairs article. In his book Targeted Killing: A Legal and Political History, Markus Gunneflo wrote, "In both Israel and the United States – the two states that have pioneered this practice – targeted killing did not emerge despite, or even necessarily in opposition to, law. In any case, it emerged through extensive legal work."

Under President Barack Obama, "signature strikes" were used as part of the vast expansion of the U.S. lethal drone program: assassinations were authorized based on certain traits, characteristics, or actions rather than on the confirmed identity of the target. In 2021, The New York Times made public a series of hidden Pentagon records that revealed devastating details of this phenomenon.

U.S. and Israeli targeting policy and practice have developed in tandem with the legal justification in troubling ways. In Justice for Some, a legal history of Palestine, Noura Erakat observes, “[B]ecause of diminishing U.S. protest, which culminated in the U.S. adoption of the assassination policy, Israel’s violations steadily escaped the zone of brazen violations and moved into the scope of legitimacy.”

The use of AI in lethal operations continues to develop in similar patterns. Last August, the Pentagon announced the Replicator initiative to "leverage autonomous systems, such as drones and other unmanned vehicles, in all domains." But despite new technology and permissive policies, John Chappel of the Center for Civilians in Conflict told me in an email that the "moral imperative to protect civilians and the legal requirement to comply with international humanitarian law remain unchanged by the deployment of artificial intelligence in conflict settings. In fact, the speed and scale at which AI systems can operate and multiply harm heightens the urgency of abiding by these principles."

The Church of the Brethren’s 2013 Resolution Against Drone Warfare, to which I contributed, proclaimed, “All killing mocks the God who creates and gives life. Jesus, as the Word incarnate, came to dwell among us (John 1:14) in order to reconcile humanity to God and bring about peace and healing.” This opposition to violence moves from the teaching of Jesus to the very core of the healing work of the Incarnation. My church has taken a clear and robust stance against the use of violence, and this concern is held by many beyond the historic peace church tradition.

Even for Christian traditions that assert that lethal force sometimes may be justified, there is agreement that it is never merely a matter of efficiency or a technical issue. Such violence does not align with Jesus’s call to love neighbor and enemy; it also fundamentally reorients our trust, misdirecting it away from God and toward the alleged omniscience of this eye of death.

While visiting the site of one of the Oct. 7 attacks in southern Israel, we could hear Israeli artillery, bombs, drones, and machine gun fire from helicopters — and at one point we needed to take cover when a “red alert” warned of incoming rocket fire from Hamas. After visiting the homes destroyed by Hamas and hearing of those killed, our host said that Hamas keeps developing weapons and Israel keeps developing weapons, “and where are we? I know that my safety and well-being, and for my children, needs [Palestinians] to also have safety and well-being as well.”

While the use of AI in targeting in Gaza is a new and dangerous development, it comes in the context of the ongoing occupation, surveillance, and militarized control that extends to the West Bank, East Jerusalem, and Palestinian residents of Israel. The broader context and history must not be ignored.

“Hope gives us the power to imagine,” Lamma Mansour, a Palestinian Christian from Nazareth, said at the Bethlehem conference. “We need to do this work of imagination,” Mansour said, “because if we fail to do so, others will fill the gap.”

We must follow the lead of Palestinian Christians and not only question the mechanisms of control and violence but work together to imagine and build a different future.
