
Can autonomous killer robots be stopped?

George Nott | Aug. 28, 2017
Advances in AI could make lethal weapon technology devastatingly effective.

 

Do they want a ban?

Some of the signatories have gone a step further than the language of the letter and have called for an outright ban on autonomous weapons “similar to bans on chemical and other weapons”.

A separate group, the Campaign to Stop Killer Robots, has long called for an international treaty to stop the technology.

“A comprehensive, pre-emptive prohibition on the development, production and use of fully autonomous weapons – weapons that operate on their own without human intervention – is urgently needed,” the group says.

 

But surely it's better to lose bots in battle than human soldiers?

“Lethal autonomous robots might well reduce collateral damage – just as the current arsenal of fire-and-forget weapons have. This negates the notion that lethal autonomous robots should be declared unlawful per se,” argues Professor Anthony Finn, director of the Defence and Systems Institute at the University of South Australia.

On the other hand, some say replacing human troops with machines removes the disincentive of losing human life, which could make the decision to go to war much easier, increasing the likelihood of conflict.

 

Is a ban possible? Would it be effective anyway?

Nearly 200 states worldwide are bound by the Chemical Weapons Convention, which, since the early '90s, has led to the destruction of 93 per cent of known stockpiles and the deactivation of 97 declared production facilities. The convention has massively reduced the use of chemical weapons, although some countries continue to use them.

A ban on lethal autonomous weapons might falter, however, due to the ‘me-tooism’ of opposing states, as efforts to ban nuclear weapons have.

Last month 122 countries endorsed a UN treaty to ban the use of nuclear weapons, heralded by its supporters as a big step towards the elimination of all nuclear arms. However, key nuclear-armed states and their allies were absent from the treaty, and some said recent posturing by North Korea about its nuclear missile capabilities was good reason to keep their own arsenals.

“Enforcing such a ban is highly problematic and it might create other problems; stopping countries such as Australia from developing defensive killer robots would leave us vulnerable to other countries and groups that ignore the ban,” says Williams. “A ban on killer robots cannot be the only strategy. Society and nations need much more than a killer robot ban.”

 

What about banning further research?

Many researchers are guided by ethical norms. However, research ethics are not bound by law and vary between institutions and countries.

"It is an excellent idea to consider the positives and the negatives of autonomous systems research and to ban research that is unethical,” says Dr Michael Harre, lecturer with the Faculty of Engineering and Information Technologies at the University of Sydney. However, he adds, there needs to be a “closer examination of what constitutes 'ethical' research”.

 
