Prohibiting Autonomous Weapons: Put Human Dignity First

Authors: Elvira Rosert (Universität Hamburg and Institute for Peace Research and Security Policy (IFSH)) and Frank Sauer (Bundeswehr University Munich)
Published: 1 September 2019
DOI: 10.1111/1758-5899.12691
Abstract

In addition to its successful mobilization in stigmatization and norm-setting processes on anti-personnel landmines and cluster munitions, the principle of distinction as enshrined in International Humanitarian Law also figures prominently in the debate on lethal autonomous weapons systems (LAWS). Proponents of a ban on LAWS frame these as indiscriminate, that is, unable to distinguish between civilians and combatants, and thus as inherently unlawful. The flip side of this particular legal argument is, however, that LAWS become acceptable when considered capable of distinguishing between combatants and civilians. We thus argue, first, that this particular legal basis for the call for a ban on LAWS might be rendered obsolete by technological progress increasing discriminatory weapon capabilities. Second, we argue that the argument is normatively troubling as it suggests that, as long as civilians remain unharmed, attacking combatants with LAWS is acceptable. Consequently, we find that the legal principle of distinction is not the overall strongest argument to mobilize when trying to stigmatize and ban LAWS. A more fundamental, ethical argument within the debate about LAWS, and one less susceptible to technological 'fixes', should be emphasized instead: namely, that life and death decisions on the battlefield should always and in principle be made by humans only.
Lethal autonomous weapons systems: a threat to human dignity
Numerous arguments motivate the current call for an international, legally binding ban on so-called lethal autonomous weapons systems (LAWS). Strategic concerns include proliferation, arms races and escalation risks (Altmann and Sauer, 2017; Rickli, 2018). Military concerns include the incompatibility of LAWS with a traditional chain of command or the potential for operational failures cascading at machine speed (Bode and Huelss, 2018; Scharre, 2016). Ethical concerns include the fear that LAWS might further increase the dehumanization and abstractness of war (and thus its propensity), as well as its cruelty if warfare is delegated to machines incapable of empathy or of navigating in dilemmatic situations (Krishnan, 2009; Sauer and Schörnig, 2012; Sparrow, 2015; Sparrow et al., 2019; Wagner, 2014). Legal concerns include difficulties of attribution, accountability gaps, and limits to the fulfillment of obligatory precautionary measures (Brehm, 2017; Chengeta, 2017; Docherty, 2015). But the most prominent concern, focalizing some elements of the concerns just mentioned, is the danger these weapons pose to civilians. This argument's legal underpinning is the principle of distinction, undoubtedly one of the central principles of International Humanitarian Law (IHL), if not the central principle (Dill, 2015).
As multifaceted and complex as the debate on military applications of autonomy is now, what has been articulated at its very beginning (Altmann and Gubrud, 2004; Sharkey, 2007) and consistently since then is that LAWS would violate IHL due to their inability to distinguish between combatants and civilians. This image of LAWS as a threat to civilians is echoed routinely and placed first by all major ban supporters (we substantiate this claim in the following section). That LAWS would be incapable of making this crucial distinction, and thus have to be considered indiscriminate, is assumed because 'civilian-ness' is an under-defined, complex and heavily context-dependent concept that is not translatable into software (regardless of whether the software is based on rules or on machine learning). Recognizing and applying this concept on the battlefield not only requires value-based judgments but also a degree of situational awareness as well as an understanding of social context that current and foreseeable computing technology does not possess.
We unequivocally share this view as well as these concerns. And yet, in this article, we propose to de-emphasize the indiscriminateness frame in favor of a deeper ethical assertion, namely that the use of LAWS would infringe on human dignity. The minimum requirement for upholding human dignity, even in conflicts, is that life and death decisions on the battlefield should always and in principle be made by humans (Asaro, 2012; Gubrud, 2012). Not the risk of (potential) civilian harm, but rather retaining meaningful human control to preserve human dignity should be at the core of the message against LAWS.
©2019 University of Durham and John Wiley & Sons, Ltd. Global Policy, Volume 10, Issue 3, September 2019. Special Section Article. doi: 10.1111/1758-5899.12691
