A destroyed building on the corniche along the Tigris in Mosul, following battles with ISIS (Photo by Levi Meir Clancy via Unsplash)

Deep Dive: The Acts of Killing

In a new article, Neil Renic and Elke Schwarz argue that autonomous weapons systems reproduce and even intensify past moral challenges.

Words: Emily Tamkin
Pictures: Levi Meir Clancy

Systematic killing “has long been associated with some of the darkest episodes in human history,” write Neil Renic and Elke Schwarz at the opening of their article, “Crimes of Dispassion: Autonomous Weapons and the Moral Challenge of Systematic Killing,” out this month in Ethics and International Affairs. The authors do not dispute that systematic killing should be associated with dark episodes. Rather, they note that defenders of autonomous weapons systems argue that such systems “will surpass humans not only militarily but also morally, enabling a more precise and dispassionate mode of violence, free of the emotion and uncertainty that too often weaken compliance with the rules and standards of war.”

The authors argue that, on the contrary, lethal autonomous weapons systems reproduce and even intensify past moral challenges, imperiling “essential restraints on the use of military force.”

To demonstrate that they understand the argument they are debunking, the authors open with a section on “the allure of autonomous violence.” Some argue that such systems ultimately reflect the intent of those employing them; others argue that, depending on how they are designed and used, such systems could improve compliance with international law.


“This is a compelling narrative,” the authors admit, “but it rests on a speculative and superficial understanding of the logical implications of this technology specifically, and systematic violence more generally. It also decontextualizes these weapons to a problematic degree.” 

“We ask, what if instead of preserving or improving upon the ‘goodness’ of human military personnel, an intensified system logic facilitates a worsening of battlefield conduct?”

The authors briefly take their readers through the history of systematic killing. They also reiterate that the issue with lethal autonomous weapons systems is not that of “inhumanity.” Rather, “at issue is the type of humanity this technology makes less and more likely. Autonomous weapons, in delivering us from the passionate, volatile misconduct of human individuals, risk plunging us ever further into the cold, dispassionate misconduct of human systems.”

They point in particular to three dangers: seeing like a computer (for example, finding patterns and drawing inferences in enemy identification where none exist); reducing or overwhelming the agency of the human participant; and truncating the space for commanders’ and operators’ own moral agency. “We should prefer conditions where those charged with doing violence understand the context and consequences of their actions, are able to recognize when they should relent from violence, and have the ability to act upon this impulse rather than becoming removed from the process,” the authors write. The issue is not only that these systems will often fall short of standards, as humans do, too, but that they will lack the very capacity to meet them.

The authors conclude that these systems can be thought of as ethical only in a vacuum. But war is not fought in a vacuum. “War is riven by a complexity that precludes certainty; and by extension, the smooth and reliable application of systematic violence to target objects. To proceed as if this is not the reality, to impose systematic violence upon environments structurally unsuited to such an approach, is to court foreseeable and ruinous moral harm.” Systematic killing should be considered a dark episode in the present, too.

