Daily Bulletin

The Conversation

  • Written by The Conversation
The military robots in Marvel's Iron Man 2 might not be so far from reality. Marvel Studios/Paramount Pictures

In an open letter I helped publish on July 28 – which has now been signed by more than 2,700 artificial intelligence (AI) and robotics researchers from around the world – we stated that “starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control”.

A few days later, philosopher Jai Galliott challenged the notion of a ban, recommending instead that we welcome offensive autonomous weapons – often called “killer robots” – rather than ban them.

I was pleased to read Jai’s recommendation, even if he calls the open letter I helped instigate “misguided” and “reckless”, and even if I disagree with him profoundly.

This is a complex and multi-faceted problem, and it is worth considering his arguments in detail as they bring several important issues into focus.

Four points

Jai puts forward four arguments why a ban is not needed:

  1. No robot can really kill without human intervention

  2. We already have weapons of the kind for which a ban is sought

  3. The real worry is the development of sentient robots, and

  4. UN bans are virtually useless.

Let’s consider the claims in turn.

The first argument is that robots cannot kill without human intervention. This is false. The Samsung SGR-A1 sentry robot being used today in the Korean DMZ has an automatic mode. When in this mode, it will identify and kill targets up to four kilometres away without human intervention. If you are in the DMZ, it will track you and – unless you unambiguously raise your hands in surrender – it will kill you.

The Samsung SGR-A1 can run in autonomous mode, and has already been deployed in South Korea.

The second argument is that we already have weapons of the kind for which a ban is sought. To illustrate this, he mentions the Phalanx close-in weapon system used by the Australian Navy. This completely misses the point, as the Phalanx is a defensive weapon system. Our open letter specifically called only for a ban on offensive weapon systems. We have nothing against defensive weapons.

However, whether the weapons we seek to ban exist or not is irrelevant to our core argument that they ought to be banned. Anti-personnel mines existed before a ban was put in place with the Ottawa Treaty. And 46 million such mines have since been destroyed.

Blinding lasers had been developed by both China and the US before the UN ban was put in place in 1998. And blinding lasers are not in use in Syria or on any other battlefield around the world today.

So whether or not you believe offensive autonomous weapons already exist, it doesn’t undermine our call for a ban.

The third argument is that the real worry is the development of sentient robots. This is also false. We do not discuss sentient weapons at all. Our call for a ban is independent of whether robots ever gain sentience.

Sentient robots like Hollywood’s Terminator would be a very bad thing. Even stupid AI in killer robots that are non-sentient would be a very bad thing. We need a ban today to protect mankind from swarms of armed quadcopters, technology that is practically on the shelves of hardware stores today.

The final argument claims UN bans are virtually useless. This also is false. The UN has very successfully banned biological weapons, space-based nuclear weapons, and blinding laser weapons. And even for arms such as chemical weapons, land mines, and cluster munitions, where UN bans have been breached or not universally ratified, severe stigmatisation has limited their use. UN bans are thus definitely worth having.

What’s the endpoint?

What I view as the central weakness of the arguments advanced in Jai’s article is that they never address the main argument of the open letter: that the endpoint of an AI arms race will be disastrous for humanity.

The open letter proposes a solution: attempting to stop the arms race with an arms control agreement.

The position Jai takes, on the other hand, suggests we should welcome the development of offensive autonomous weapons. Yet it fails to describe what endpoint this will lead to.

It also never attempts to explain why a ban is supported by thousands of AI and robotics experts, by the ambassadors of Germany and Japan, by the International Committee of the Red Cross, by the editorial pages of the Financial Times, and indeed (for the time being) by the US Department of Defense, other than with a dismissive remark about “scaremongering”.

Anybody criticising an arms-control proposal endorsed by such a diverse and serious-minded collection of people and organisations needs to explain clearly what endpoint they are proposing instead, and should not advance arguments against a ban that are either false or irrelevant to the issue.

Toby Walsh receives funding from the Australian Research Council, the Department of Communications, the Asian Office of Aerospace Research and Development and the Humboldt Foundation.


Read more http://theconversation.com/we-should-not-dismiss-the-dangers-of-killer-robots-so-quickly-45935
