Issues in robot ethics seen through the lens of a moral Turing test

Published date: 11 May 2015
DOI: https://doi.org/10.1108/JICES-09-2014-0038
Pages: 98-109
Authors: Anne Gerdes, Peter Øhrstrøm
Subject Matter: Information & knowledge management, Information management & governance
Anne Gerdes
Department of Design and Communication,
University of Southern Denmark, Kolding, Denmark, and
Peter Øhrstrøm
Department of Communication and Psychology, Aalborg University,
Aalborg, Denmark
Abstract
Purpose – The purpose of this paper is to explore artificial moral agency by reflecting upon the
possibility of a Moral Turing Test (MTT) and whether its lack of focus on interiority, i.e. its
behaviouristic foundation, counts as an obstacle to establishing such a test to judge the performance of
an Artificial Moral Agent (AMA). Subsequently, to investigate whether an MTT could serve as a useful
framework for the understanding, designing and engineering of AMAs, we set out to address
fundamental challenges within the field of robot ethics regarding the formal representation of moral
theories and standards. Here, typically three design approaches to AMAs are available: top-down
theory-driven models; bottom-up approaches, which set out to model moral behaviour by means of
models for adaptive learning, such as neural networks; and finally, hybrid models, which involve
components from both top-down and bottom-up approaches to the modelling of moral agency. With
inspiration from Allen and Wallach (2009, 2000) as well as Prior (1949, 2003), we elaborate on
theoretically driven approaches to machine ethics by introducing deontic tense logic. Finally, within
this framework, we explore the character of human interaction with a robot which has successfully
passed an MTT.
Design/methodology/approach – The ideas in this paper reflect preliminary theoretical
considerations regarding the possibility of establishing an MTT based on the evaluation of moral
behaviour, which focusses on moral reasoning regarding possible actions. The thoughts reflected fall
within the field of normative ethics and apply deontic tense logic to discuss the possibilities and
limitations of artificial moral agency.
Findings – The authors stipulate a formalisation of a logic of obligation, time and modality, which may
serve as a candidate for implementing a system corresponding to an MTT in a restricted sense. Hence,
the authors argue that to establish a present moral obligation, we need to be able to make a description
of the actual situation and the relevant general moral rules. Such a description can never be complete, as
the combination of exhaustive knowledge about both situations and rules would involve a God's-eye
view, enabling one to know all there is to know and take everything relevant into consideration before
making a perfect moral decision to act upon. Consequently, due to this frame problem, from an
engineering point of view, we can only strive to design a robot supposed to operate within a
restricted domain and within a limited space-time region. Given such a setup, the robot has to be able to
perform moral reasoning based on a formal description of the situation and any possible future
developments. Although a system of this kind may be useful, it is clearly also limited to a particular
context. It seems that it will always be possible to find special cases (outside the context for which it was
designed) in which a given system does not pass the MTT. This calls for a new design of moral systems
with trust-related components which will make it possible for the system to learn from experience.
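To illustrate the kind of formalisation gestured at above, one can combine Prior's tense operators with a standard deontic obligation operator. The following is a minimal illustrative sketch in that spirit, not the authors' exact system; the atomic symbols d and q are placeholders introduced here for illustration:

```latex
% Priorean tense operators:
%   F\varphi : it will at some time be the case that \varphi
%   P\varphi : it was at some time the case that \varphi
% with duals
%   G\varphi \equiv \neg F\neg\varphi \qquad H\varphi \equiv \neg P\neg\varphi
%
% Deontic operator:
%   O\varphi : it is obligatory that \varphi
%
% A present obligation, derived from a (necessarily partial)
% description d of the situation together with the relevant moral
% rules, to eventually bring about q could then be written as:
\[
  d \rightarrow O\,F\,q
\]
% read: given that the situation is as described by d, it is
% obligatory that q holds at some point in every deontically
% acceptable future course of events.
```

On a branching-time reading, O quantifies over the acceptable future branches, which is why the incompleteness of d (the frame problem noted above) restricts such a system to a limited domain and space-time region.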
Received 1 September 2014
Revised 1 September 2014
Accepted 9 October 2014
Journal of Information, Communication and Ethics in Society
Vol. 13 No. 2, 2015
pp. 98-109
© Emerald Group Publishing Limited
1477-996X
DOI 10.1108/JICES-09-2014-0038
