Earlier this week robotics and artificial intelligence experts signed an open letter (PDF) calling on the United Nations to help prevent the “third revolution in warfare”: Lethal autonomous weapons.
“Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend,” the letter states. “These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”
What is a lethal autonomous weapon?
Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can be sent to search for and shoot anyone in a pre-determined area. They don’t include remotely piloted drones with humans in-the-loop to ‘pull the trigger’ or active protection systems, such as fixed sentry guns which fire at targets detected by sensors to defend an area.
How likely are they?
According to Ryan Gariepy, founder and CTO of Clearpath Robotics, very likely.
“Unlike other potential manifestations of AI which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability,” he said.
“This is not a hypothetical scenario, but a very real, very pressing concern which needs immediate action.”
Rapid technological advances mean fully autonomous weapons are possible, and many companies are pursuing that goal. The Australian Defence Force in July announced a $50 million research centre to develop ‘Trusted Autonomous Systems’.
UTS Professor Mary-Anne Williams imagines a rather terrifying near-future: “Weaponised robots could be like the velociraptors in Jurassic Park, with agile mobility and lightning-fast reactions, able to hunt humans with high-precision sensors augmented with information from computer networks,” she said.
What do the experts want?
The latest open letter (a similar one was issued in 2015) calls on the United Nations to take action “to find a way to protect us all from these dangers”.
A UN Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems was due to meet for the first time this week, but the event was cancelled and rescheduled for November.
“We entreat the high contracting parties participating in the GGE to work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilising effects of these technologies,” the letter continues.