Beware the Killer Robots
March 10, 2017 | Professor Toby Walsh, UNSW Sydney
Estimated reading time: 4 minutes

Autonomous weapons have moved from science fiction to become a clear and present danger. But there is still time to stop them.
In July 2015, thousands of researchers working in artificial intelligence (AI) and robotics united to issue an open letter calling for a pre-emptive ban on such weapons. I was one of the organisers of the letter, and I have spoken several times at the United Nations to reinforce our call for a ban.
The reason I have been motivated to do this is simple. If we don’t get a ban in place, there will be an arms race. And the end point of this race will look much like the dystopian future painted by Hollywood movies like The Terminator.
In fact, the arms race is already underway, although it is largely undeclared. For example, the US Department of Defense has US$18 billion of weapons programs in development, and many of them are autonomous.
However, there is now considerable international political pressure for such a ban. At least 19 governments, including those of Pakistan, Mexico, Zimbabwe, Cuba and the Vatican, have formally called for a ban, and Human Rights Watch is leading a group of non-government organisations in a “Campaign to Stop Killer Robots.”
In December, nine members of the US House of Representatives wrote to Secretary of State John Kerry and Defense Secretary Ash Carter calling for the US to vote for a ban at a UN conference on disarmament in Geneva that month. In their letter they said lethal robots "would not simply be another weapon in the world's arsenals, but would constitute a new method of warfare." To put their claim in historical context, it is important to understand that we are contemplating a third revolution in warfare. The first was the invention of gunpowder by the Chinese; the second was the invention of the nuclear bomb; and the third – if we let it happen – will be autonomous weapons. Each is a step change in the speed and efficiency with which we can kill the other side.
There are many problems. One is we don’t know how to build ethical robots. Another is that we don’t know how to build robots that can’t be hacked. That means such weapons can easily fall into the hands of terrorists and rogue nations. These people will have no qualms about removing any safeguards, or using them against us.
And it won't simply be robots fighting robots. Conflicts today are asymmetric, so it will mostly be robots against humans. Contrary to what some proponents might claim, many of those humans will be innocent civilians.
But governments still have time to choose a different future. The world has decided collectively not to weaponise other technologies. We have bans on biological and chemical weapons. Most recently, we banned certain types of blinding lasers and anti-personnel mines.
These bans have not prevented related technologies from being developed.
If you go into a hospital today, a ‘blinding’ laser will actually be used to fix your eyes. But arms companies will not sell you one. And you will not find them on any battlefield.
The same should be true for autonomous weapons. A ban would not stop the development of the broader technology, which has many other positive uses, such as autonomous vehicles.
But if we get a UN ban in place, autonomous weapons will have no place on the battlefield.
Last December in Geneva, 123 nations met for the Fifth Review Conference of the UN Convention on Certain Conventional Weapons and agreed to begin formal discussions on a possible ban of lethal, autonomous weapons. Those talks will begin in April or August, and 88 countries have agreed to attend.
Australia has led the way in many arms control negotiations – the nuclear non-proliferation treaty, and those around biological and chemical weapons. But Australian diplomats are some of the most resistant in the discussions about autonomous weapons. And we don’t have long. If these technologies get a foothold in our militaries, a Pandora’s box will be opened and we won’t be able to close it.
Our future is full of robots and intelligent machines. We can choose a good path, where these machines will take the sweat and we will be healthier, wealthier and happier. But if we choose another path that allows computers to make decisions that only humans should make, we risk giving up an important part of our humanity.
In his famous novel 2001: A Space Odyssey, Arthur C. Clarke delivered one of science fiction's most prescient quotes. When the astronaut Dave orders the onboard computer HAL to open the pod bay doors, HAL replies: "I'm sorry, Dave. I'm afraid I can't do that." The time has come for humans to assert themselves and say to the computers: "Sorry, I can't let you do that."
To read the full article visit: unsw.edu.au/news/
About Professor Walsh
Toby Walsh is a leading researcher in Artificial Intelligence. He was recently named in the inaugural Knowledge Nation 100, the one hundred "rock stars" of Australia's digital revolution. He is Guest Professor at TU Berlin, Scientia Professor of Artificial Intelligence at UNSW and leads the Algorithmic Decision Theory group at Data61, Australia's Centre of Excellence for ICT Research. He has been elected a fellow of the Australian Academy of Science, and has won the prestigious Humboldt research award as well as the 2016 NSW Premier's Prize for Excellence in Engineering and ICT. He has previously held research positions in England, Scotland, France, Germany, Italy, Ireland and Sweden. He regularly appears in the media talking about the impact of AI and robotics.