The responsibility gap: ascribing responsibility for the actions of learning automata

Research output: Journal Publications › Journal Article (refereed)

171 Citations (Scopus)

Abstract

Traditionally, the manufacturer or operator of a machine is held responsible, morally and legally, for the consequences of its operation. Autonomous learning machines, based on neural networks, genetic algorithms and agent architectures, create a new situation: the manufacturer or operator is in principle no longer capable of predicting the machine's future behaviour, and thus cannot be held morally responsible or liable for it. Society must therefore choose between no longer using this kind of machine (which is not a realistic option) and facing a responsibility gap that cannot be bridged by traditional concepts of responsibility ascription.
Original language: English
Pages (from-to): 175-183
Number of pages: 9
Journal: Ethics and Information Technology
Volume: 6
Issue number: 3
DOIs
Publication status: Published - 1 Sep 2004
Externally published: Yes

Keywords

  • artificial intelligence
  • autonomous robots
  • learning machines
  • liability
  • moral responsibility
