Killer Robots Campaign Launched in London

On Tuesday 23rd April 2013, Human Rights Watch, Article 36, the International Committee for Robot Arms Control (ICRAC) and the Nobel Women’s Initiative launched the Campaign to Stop Killer Robots in London. The Campaign calls for a pre-emptive ban on the development and testing of autonomous weapons systems, to be achieved by a “new international law (a treaty), as well as through national laws and other measures”. For the uninitiated, we refer to Human Rights Watch’s excellent report on this issue, Losing Humanity, which sets out three categories of weapons platforms. “Human in the loop” weapons require a human to select the targets and decide when to use lethal force. “Human on the loop” weapons allow the machine to select a target and deliver force under the oversight of a human, meaning that the action can be overridden. The final category, and the target of the campaign, is “Human out of the loop” weapons, which are capable of selecting targets and delivering lethal force without any human input.

The concern expressed by the campaign is that a number of states, led by the US, are moving closer to the development and use of fully autonomous weapons. These “killer robots” would be unable to comply with the principles of international humanitarian law, such as the distinction between civilians and combatants. There is also an argument to be made about the dangers of their use in situations where international humanitarian law does not apply, and the challenge this would pose to international human rights law. The campaign highlights the fact that fully autonomous robots give rise to serious questions about accountability, for example, who would bear legal responsibility if an attack went wrong, as well as concerns that this technology increases the likelihood of conflict, as states would be less inhibited by concerns about military casualties.

In the campaign’s launch statement, Professor Noel Sharkey of ICRAC is quoted as saying:

Killer robots are not self-willed ‘Terminator’-style robots, but computer-directed weapons systems that, once launched, can identify targets and attack them without further human involvement. Using such weapons against an adaptive enemy in unanticipated circumstances and in an unstructured environment would be a grave military error. Computer-controlled devices can be hacked, jammed, spoofed, or can be simply fooled and misdirected by humans.

Beginning with an NGO conference, held at the Amnesty Human Rights Centre, the campaign was officially launched with a press conference and a subsequent briefing for Parliamentarians in Westminster.  The campaign also brought a robot to Parliament Square and delivered a letter to the UK government requesting that they elaborate on their policies concerning autonomous weapons and support the ban.  A paper, written by Article 36 and released in tandem with the campaign, highlights some of the shortcomings in the UK’s position on this issue.

Addressing the Westminster event was Steve Goose, Director of the Arms Division of Human Rights Watch. He was accompanied by campaign partners Jody Williams, 1997 Nobel Peace Laureate and Chair of the Nobel Women’s Initiative, Professor Noel Sharkey of ICRAC and Richard Moyes of Article 36. The speakers raised concerns about the perceived rush within the US military toward the development of fully autonomous weapons. The subsequent discussion focused on the legitimacy of current automated maritime protection systems, as well as the vulnerability of target recognition systems to being confused by simple disguise techniques and hacking.

This campaign seeks to prevent the world from sleepwalking into an age defined by autonomous warfare. Weapons bans have, for the most part, focused on weapons already in existence. In contrast, this campaign, recognising the success of the pre-emptive ban on blinding laser weapons, seeks to halt not only the use of this technology but also its development. By raising public awareness of this issue and highlighting the grave moral, legal and ethical challenges posed by “killer robots” in 2013, rather than 2023, the campaign can potentially create a global environment in which such technology cannot and will not be tolerated.
