Bellaby, R.W. orcid.org/0000-0002-6975-0681 (2021) Can AI weapons make ethical decisions? Criminal Justice Ethics, 40 (2). pp. 86-107. ISSN 0731-129X
Abstract
The ability of machines to make truly independent and autonomous decisions is a goal of many, not least of military leaders who wish to take the human out of the loop as much as possible, claiming that autonomous military weaponry (most notably drones) can make decisions more quickly and with greater accuracy. However, there is no clear understanding of how autonomous weapons should be conceptualized, nor of the implications that their "autonomous" nature has for them as ethical agents. It is argued that autonomous weapons are not full ethical agents because of the restrictions of their coding. However, their highly complex machine-learning nature gives the impression that they are making their own decisions, creating the illusion that their human operators are shielded from responsibility for the harm they cause. It is therefore important to distinguish between autonomous AI weapons and an AI with autonomy, a distinction that raises two different ethical problems for their use. For autonomous weapons, their limited agency combined with machine learning means their human counterparts remain responsible for their actions while having no ability to control or intercede in the actual decisions made. If, on the other hand, an AI could reach the point of autonomy, its level of critical reflection would make its decisions unpredictable and dangerous in a weapon.
Metadata
| Field | Value |
|---|---|
| Item Type | Article |
| Authors/Creators | Bellaby, R.W. |
| Copyright, Publisher and Additional Information | © 2021 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group on behalf of John Jay College of Criminal Justice of The City University of New York. This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way. |
| Keywords | artificial intelligence; ethics; autonomous; autonomy; weapons |
| Dates | Published: 2021 |
| Institution | The University of Sheffield |
| Academic Units | The University of Sheffield > Faculty of Social Sciences (Sheffield) > Department of Politics and International Relations (Sheffield) |
| Depositing User | Symplectic Sheffield |
| Date Deposited | 30 Jul 2021 09:53 |
| Last Modified | 09 Mar 2022 11:48 |
| Status | Published |
| Publisher | Taylor & Francis |
| Refereed | Yes |
| Identification Number (DOI) | 10.1080/0731129X.2021.1951459 |
| Open Archives Initiative ID (OAI ID) | oai:eprints.whiterose.ac.uk:176676 |
Download
Filename: Can AI Weapons Make Ethical Decisions.pdf
Licence: CC-BY-NC-ND 4.0