Regarding fear and Artificial Intelligence (AI), one question often comes up: 'Will we be killed by a Terminator doppelganger?' I don't know whether this will ever happen, but I do know that we already have robots fighting our wars. This century is therefore the first time in human history that we engage in unmanned warfare. What is the current status of this 'unmanned warfare'? What do people think about drone strikes, and will Terminators be the next step?
Unmanned Warfare
DARPA (Defense Advanced Research Projects Agency) has been funding AI research since the 1960s. This research has resulted in drones, tactical ground robots, Unmanned Combat Ground Vehicles (UCGVs) and many other capabilities. These robots use AI to navigate, to collaborate and to help identify targets (to kill, follow, etc.). An example: the Uran-9.

![Unmanned Warfare: Uran-9; Unmanned Combat Ground Vehicle (UCGV). Photo Source: https://www.youtube.com/watch?v=VBC9BM4-3Ek](https://labs.sogeti.com/wp-content/uploads/sites/2/2016/11/Uran-9-300x157.png)
Drone Use in Wars
![](https://labs.sogeti.com/wp-content/uploads/sites/2/2016/12/iStock-506756273-1024x768.jpg)
![Unmanned Warfare: Wing Loong Drone. Photo Source: www.drone-report.com/News/2014/Apr/26/images/img0125.jpg](https://labs.sogeti.com/wp-content/uploads/sites/2/2016/11/Wing-Loong-300x122.jpg)
Public Opinion on Drone Strikes
What do civilians think about sending robots to war? A 2012 survey showed that 90% of Pakistanis were against drone strikes and that 74% saw the U.S. as the enemy, despite Pakistan receiving the second-largest amount of U.S. aid. A 2014 poll from the Pew Research Center showed immense disapproval of drone strikes around the world: in 39 of 44 surveyed countries there was strong disapproval. Only in three countries, Kenya, Israel and the U.S., did a majority of respondents approve of drone strikes.

![Unmanned Warfare: Drone Use Disapproval](https://labs.sogeti.com/wp-content/uploads/sites/2/2016/11/drone-use-disapproval-pewglobal-113x300.png)
Current Armed Robots vs Terminators
So we know that drones have been in use for quite a few years now, and that armed robots on the ground are on the move. What is the difference with the Terminators? The biggest difference between these combat drones and the Terminator scenario is that there is still a person, not the drone, who decides to attack.** There are no registered cases of Lethal Autonomous Robots (LARs) or autonomous weapons where the intelligent system selects and engages targets without human intervention.

Stopping an Autonomous Weapons Arms Race
On the 28th of July 2015 the Future of Life Institute published an open letter to AI researchers, asking them to take a stand against an autonomous weapons arms race. In contrast to nuclear weapons, autonomous weapons require no costly or hard-to-obtain raw materials; once such a weapon is invented, it will be relatively cheap to make and use. To date, the open letter has been signed by 20,806 people, including prestigious researchers from IBM, Microsoft, Berkeley and Oxford University. Other endorsers include Elon Musk, Stephen Hawking, Steve Wozniak, Noam Chomsky and more. The letter concludes:

'In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.' http://futureoflife.org/open-letter-autonomous-weapons/
Pros and Cons of LARs
Professor Ronald Craig Arkin is a roboticist and roboethicist at the Georgia Institute of Technology. In his paper 'Ethical Robots in Warfare', Professor Arkin lists the pros and cons of Lethal Autonomous Robots. This is a selection:

Pro LAR:
1) LARs can sacrifice themselves, for example when the identification of a target is uncertain.
2) LARs can be designed without emotions. No fear and anger on the battlefield; this could lead to less criminal behaviour.
3) LARs can process more information, and faster, than humans. In split-second decisions this can make the difference between shooting and not shooting.
4) When working in combined teams (LARs and humans), the LARs can give objective reports to superiors. This can lead to better ethical behaviour.

Con LAR:
1) Who has the responsibility when something goes wrong?
2) The threshold of entry into warfare may be lowered, as we will be risking machines rather than human soldiers.
3) LARs could be hacked or fall into the wrong hands.
4) A robot refusing an order raises the question of whether ultimate authority should vest in humans.

What do you think? Do the pros outweigh the cons?

End note
As with a lot of technologies these days, I think it's safe to say: 'You ain't seen nothing yet.' We will see many developments in warfare, and these developments will raise questions and shift balances. For myself, I can say that the open letter against autonomous weapons convinced me that we should never start building autonomous weapons. But on the other hand, two questions that Professor Arkin asks still resonate in my head:

"Should soldiers be robots? – Isn't that largely what they are trained to be?"
"Should robots be soldiers? Should they be more humane than humans?"

———————————

* & ** While there is an alienation of the killing process, because the person who decides to kill is thousands of miles away, the actions still have an impact on the soldier. A research report from the United States Air Force among 1,084 drone operators states that "a total of 4.3% endorsed a pattern of symptoms of moderate to extreme levels of severity meeting criteria outlined in the Diagnostic and Statistical Manual of Mental Disorders, 4th edition. The incidence of Posttraumatic Stress Disorder (PTSD) among USAF drone operators in this study is lower than rates of PTSD (10–18%) among military personnel returning from deployment". Watch this Predator Drone Missile Strike to get a glimpse of the experience. Documentary tip: 'Unmanned: America's Drone Wars' (full version on YouTube).
Scary stuff, Thijs, just before Christmas. But thanks anyway for the interesting food for thought.
Although I am generally optimistic about the benefits of new technology, I don't really see much good in autonomous weapons. In an "old-fashioned" fight, people tend to be careful because they want to survive. But if the wish to survive is not there (as with terrorists and robots), violence can be unlimited.
The open letter from over 20,000 people calling for a "ban on offensive autonomous weapons" is sympathetic, but a bit naive, I'm afraid. If such a ban would work, why not have a ban on any offensive weapon? That would really make a change.
So, looking for an optimistic scenario: maybe we are heading for a world where human beings simply delegate all conflicts to machines, which do all the fighting for us, and we as human beings can all continue to live our lives. Does *that* sound realistic?
I am really curious to know what other people think about this subject.