In an open letter, 156 artificial-intelligence experts from 14 European countries (go.nature.com/2t5mgov) have rejected the European Parliament’s recommendation that robots should have legal status as electronic persons. This would make robots responsible for repairing any damage they might cause (go.nature.com/2wxlwg6). We are not signatories to the open letter, but endorse it nonetheless.
In our view, the parliament’s recommendation is flawed. Its rationale seems to be that robots can be electronic juridical persons in the same way that companies are. But companies are constituted and run by real people. That is why intentions, plans, goals, and legal rights and duties can be meaningfully attributed to them, and why they can be taught, praised or punished. Hence, they are considered responsible, accountable or liable for their actions.
Attributing electronic personhood to robots risks misplacing moral responsibility, causal accountability and legal liability for their mistakes and misuses. Robots could be blamed and punished instead of humans, and irresponsible people could dismiss the need for care in the engineering, marketing and use of robots. Even the Romans knew better: the owner of an enslaved person was fully responsible for any damage caused by that person (known as vicarious liability).
Nature 557, 309 (2018)