| Title: | Who Should Decide How Machines Make Morally Laden Decisions? |
| Authors: | Dominic Martin |
| Link: | https://pubmed.ncbi.nlm.nih.gov/27905083/ |
| DOI: | 10.1007/s11948-016-9833-7 |
| Category: | |
| PMID: | 27905083 |
| Dept Affiliation: | JMSB; 1. John Molson School of Business, Concordia University, Montréal, Canada; 2. Department of Philosophy, McGill University, Montréal, Canada. dominic.martin@concordia.ca |
| Description: | Who should decide how a machine will decide what to do when it is driving a car, performing a medical procedure, or, more generally, when it is facing any kind of morally laden decision? More and more, machines are making complex decisions with a considerable level of autonomy. We should be much more preoccupied by this problem than we currently are. After a series of preliminary remarks, this paper will go over four possible answers to the question raised above. First, we may claim that it is the maker of a machine that gets to decide how it will behave in morally laden scenarios. Second, we may claim that the users of a machine should decide. Third, that decision may have to be made collectively or, fourth, by other machines built for this special purpose. The paper argues that each of these approaches suffers from its own shortcomings, and it concludes by showing, among other things, which approaches should be emphasized for different types of machines, situations, and/or morally laden decisions. |