Killer robots: should algorithms decide who lives or dies?

Published: Sept. 21, 2021, 8 a.m.


In Geneva, complex negotiations are underway to decide if a treaty is needed to control, or even ban, lethal autonomous weapons – or killer robots. Imogen Foulkes talks to experts, lawyers, and campaigners.

"It\\u2019s about the risk of leaving life and death decisions to a machine process. An algorithm shouldn\\u2019t decide who lives or dies," says Neil Davison, Senior Policy Adviser at the International Committee of the Red Cross.\\xa0

"Do you hold the commander responsible, who activated the weapons system?" asks Mary Wareham of Human Rights Watch.\\xa0

"What if a weapon is used and developed without meaningful human control, what are the consequences of it? How do you ascribe responsibility?" ponders Paola Gaeta, an international law expert at Geneva's Graduate Institute.

"If we don\\u2019t have a treaty within two years we will be too late. Technology is progressing at a much faster pace than diplomacy is doing, and I fear the worst," warns Frank Slijper of Pax, a Dutch peace organization.


Get in touch!

Thank you for listening! If you like what we do, please leave a review.
