The responsibility gap: ascribing responsibility for the actions of learning automata

Research output: Journal Article (refereed)

151 Citations (Scopus)


Traditionally, the manufacturer or operator of a machine is held (morally and legally) responsible for the consequences of its operation. Autonomous, learning machines, based on neural networks, genetic algorithms and agent architectures, create a new situation in which the manufacturer or operator is in principle no longer capable of predicting the machine's future behaviour, and thus cannot be held morally responsible or liable for it. Society must decide between no longer using this kind of machine (which is not a realistic option) and facing a responsibility gap, which cannot be bridged by traditional concepts of responsibility ascription.
Original language: English
Pages (from-to): 175-183
Number of pages: 9
Journal: Ethics and Information Technology
Issue number: 3
Publication status: Published - 1 Sep 2004
Externally published: Yes



  • artificial intelligence
  • autonomous robots
  • learning machines
  • liability
  • moral responsibility
