Why it's so hard to reach an international agreement on killer robots

For several years, civil society groups have been calling for a ban on what they call “killer robots”. Scores of technologists have lent their voice to the cause. Some two dozen governments now support a ban and several others would like to see some kind of international regulation.

Yet the latest talks on "lethal autonomous weapons systems" wrapped up last month with no agreement on a ban. The Group of Governmental Experts meeting, convened in Geneva under the auspices of the United Nations Convention on Certain Conventional Weapons, did not even make clear progress towards one. The outcome was a decision to continue discussions next year.

Those supporting a ban are not impressed. But the reasons for the failure to reach agreement on the way forward are complex.

Read more: Lack of technical knowledge in leadership is a key reason why so many IT projects fail

What to ban?

The immediate difficulty concerns articulating what technology is objectionable. The related, deeper question is about whether increased autonomy of weapons is always bad.

Many governments, including Germany, Spain and the United Kingdom, have said they do not have, and do not want, weapons wholly uncontrolled by humans. At the same time, militaries already own weapons that, to some degree, function without someone pulling the trigger.

Since the 1970s, navies have used so-called close-in weapon systems (CIWS). Once switched on, these weapons can automatically shoot down incoming rockets and missiles as the warship's final line of defence. Phalanx, with its distinctively shaped radar dome, is probably the best-known weapon system of this kind.

Armies now deploy land-based variants of CIWS, generally known as C-RAM (short for counter-rocket, artillery and mortar), to protect military bases.

Other types of weapons also have autonomous functionality. For example, sensor-fuzed weapons, fired in the general direction of their targets, rely on sensors and preset targeting parameters to launch themselves at individual targets.

None of these weapons has stirred significant controversy.

The acceptable vs the unacceptable

What exactly is the dreaded “fully autonomous” weapon system that no-one has much appetite for? Attempts to answer this question over the past few years have not enjoyed success.

The supporters of a ban note – correctly – that the lack of a precise definition has not stopped arms control negotiations before. They point to the Convention on Cluster Munitions, signed in 2008, as an example.

The notion of a cluster munition – a large bomb that disperses small unguided bomblets – was clear enough from the outset. Yet the precise properties of the banned munition were agreed upon only later in the process.

Unfortunately, the comparison between cluster munitions and autonomous weapons does not quite work. Though cluster munitions were a loose category to start with, it was clear they could be defined by technical criteria.

In the end, the Convention on Cluster Munitions draws a line between permissible and prohibited munitions by reference to things such as the number, weight and self-destruction capability of submunitions.

With regard to any similar rules on autonomous weapon systems, it is not only unclear where the line should be drawn between what is and isn't permissible; it is also unclear what criteria should be used to draw it.

How much human control?

One way out of this thicket of definitions is to shift the focus from the weapon itself to the way the human interacts with the weapon. Rather than debate what to ban, governments should agree on the necessary degree of control humans should exercise. Austria, Brazil and Chile have suggested starting treaty negotiations precisely along those lines.

This change of perspective may well prove to be helpful. But the key problem is thereby transformed rather than resolved. The question now becomes: what kind of human involvement is needed and when must it occur?

A strict idea of human control would entail a human making a conscious decision about each individual target in real time. This approach would cast a shadow on the existing weapon systems mentioned earlier.

A strict reading of human control might also require the operator to have the ability to abort a weapon up to the moment it hits a target. This would raise questions about even the simplest of weapons – rocks, spears, bullets or gravity bombs – which leave human hands at some point.

An alternative understanding of human control would consider the weapon’s broader design, testing, acquisition and deployment processes. It would admit, for example, that a weapon preprogrammed by a human is in fact controlled by a human. But some would consider programming to be a poor and unpalatable substitute for a human acting at the critical time.

In short, the furious agreement about the need to maintain human involvement hides a deep disagreement about what that means. This is not a mere semantic dispute. It is an important and substantive disagreement that defies an easy resolution.

The benefits of autonomy

Some governments, such as the United States, argue that autonomous functions in weapons can yield military and humanitarian benefits.

Read more: Three ways robots can save lives in war

They suggest, for example, that reducing the manual control a human has over a weapon might increase its accuracy. This, in turn, could help avoid unintended harm to civilians.

Others find even the notion of benefits in this context to be too much. During the last Group of Governmental Experts meeting, several Latin American governments, most prominently Costa Rica and Cuba, opposed any reference to potential benefits. In their view, autonomy in weapon systems only poses risks and challenges, which need to be mitigated through further regulation.

This divide reveals an underlying uncertainty about the aims of international law in armed conflict. For some, desirable outcomes – surgical use of force, reduced collateral damage, and so on – prevail. For others, the instruments of warfare must (sometimes) be restricted no matter the outcomes.

The next step

Supporters of the ban suggest that a handful of powerful states, particularly the US and Russia, are blocking further negotiations.

This does not seem entirely accurate. Disagreements about the most appropriate way forward are much broader and quite fundamental.

Addressing the challenges of autonomous weapons is therefore not just a matter of getting a few recalcitrant governments to fall in line. Much less is it about verbally abusing them into submission.

If there is to be further regulation, and if that regulation is to be effective, the different viewpoints must be taken seriously – even if one disagrees with them. A quick fix is unlikely and, in the long term, probably counterproductive.

Authors: Rain Liivoja, Associate Professor, TC Beirne School of Law, The University of Queensland

Read more http://theconversation.com/why-its-so-hard-to-reach-an-international-agreement-on-killer-robots-102637
