The next generation: the UN considers the potential of lethal autonomous robots

In late May, the UN Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Christof Heyns, issued his report to the Human Rights Council on lethal autonomous robots (LARs).  He defined these as “weapon systems that, once activated, can select and engage targets without human intervention”.  His concern at the development of this technology was that the legal frameworks of both international humanitarian law and international human rights law are unable to adequately govern the use of such weapons.  The report highlighted the impact of this technology, particularly the psychological and physical distance from war that it allows, which reduces the concerns normally associated with armed conflict.  In this respect, lethal autonomous robots “seem to take problems that are present with drones and high-altitude airstrikes to their factual and legal extreme”.

Among his recommendations, Heyns called for the Human Rights Council to ask States to declare and implement national moratoria on “at least the testing, production, assembly, transfer, acquisition, deployment and use of LARs until such time as an internationally agreed upon framework on the future of LARs has been established”.  He stopped short, however, of the outright ban called for by the Campaign to Stop Killer Robots, which launched in April.

At the 23rd Session of the Human Rights Council, the Pakistan Government, speaking on behalf of the Organisation of Islamic Cooperation, concurred with Heyns on the need to take action, though it questioned whether there was a need to go beyond a moratorium and “initiate an international process with a view to ban the use of LARs”.  The United States agreed that LARs presented “important legal, policy and ethical issues” and called on states to “proceed in a lawful, prudent and responsible manner when considering whether to incorporate automated and autonomous capabilities in weapons systems.”  Like the EU, the US emphasised that discussion of this issue went beyond the Human Rights Council’s core expertise and argued that such a discussion should take place in a forum focused on international humanitarian law.

We have already seen the strain placed on international humanitarian and human rights law by the use of semi-autonomous drones, and the challenges posed by states’ interpretations of terms such as “direct participation in hostilities”.  These challenges have not been adequately resolved, despite Obama’s recent attempt to clarify his Administration’s position on these issues.  This failure gives rise to the fear that progress towards greater weapons autonomy will only further erode the relevant legal frameworks.

The “killer robots” discussed by the Special Rapporteur and the Campaign to Stop Killer Robots are significant in that they sit at the end of the drones continuum: the result of an ever more autonomised approach to warfare.  Central to the arguments put forward by Heyns and the Campaign is the assertion that “taking humans out of the loop also risks taking humanity out of the loop”, and with it the demise of human decision-making.  Without that, the moral and legal codes which govern the right to life are similarly devalued.  Yet while principles such as proportionality may only be achievable with human involvement, such a principle is only as good as the criteria set out in the rules of engagement.  It would be a grave mistake to think that the presence of a human somehow ensures that the use of any weapon is failsafe.  In this respect, it is as much about the policy surrounding a weapon as it is about the weapon itself.  For example, increased civilian deaths by drones are arguably less the result of a new weapon and more the result of states’ mandate that this weapon can be used in specific circumstances where civilians are at risk, combined with an expansion of the criteria for who can be targeted (see, for example, the US’s decision to allow signature strikes).

Christof Heyns emphasised the need for states to be transparent about their weapons review processes.  A recent paper by Article 36, a partner in the Campaign to Stop Killer Robots, examined the UK’s policy position on fully autonomous weapons and noted that while the UK Government does undertake the relevant reviews, these are not made public.  Andrew Robathan’s response to a question from Tom Watson, Chair of the APPG, on the UK’s compliance with Article 36 of Additional Protocol I to the Geneva Conventions stated that:

The Reaper Unmanned Air System has been the subject of legal reviews during the acquisition process, in accordance with the UK’s responsibilities under article 36 of Protocol I Additional to the Geneva conventions of 1949. The reviews concluded that Reaper is capable of being used lawfully and in accordance with all relevant international and domestic law. The reviews are subject to legal professional privilege and I am unwilling, therefore, to place copies in the Library of the House.

The ability of Parliamentarians, and the wider public, to evaluate the Government’s assessment that this weapon complies with the relevant frameworks is thus stymied.  While Article 36 notes that, to date, the UK Government has committed in Parliament that such semi-autonomous weapons should remain just that, under human control, the Ministry of Defence’s Joint Doctrine on the UK’s approach to unmanned systems appears to challenge this commitment.  Further, the UK’s response to Christof Heyns noted that the UK considered the “existing provisions of international law are sufficient to regulate the use of such systems and therefore has no plans to call for or to support an international ban on them”, and reiterated its commitment to the Geneva Conventions.  A clear case can thus be made for the UK Government to articulate more fully, and publicly, its consideration of the development of LARs.
