When it comes to fear and Artificial Intelligence (AI), one question often comes up: ‘Will we be killed by a Terminator doppelganger?’
I don’t know if this will happen eventually, but I do know that we already have robots fighting our wars. This century is therefore the first time in human history that we engage in Unmanned Warfare. What is the current status of this ‘Unmanned Warfare’? What do people think about drone strikes, and will Terminators be the next step?
DARPA (Defense Advanced Research Projects Agency) has been financing AI research since the sixties. All this research has resulted in drones, Tactical Ground Robots, Unmanned Combat Ground Vehicles (UCGVs), and much more. These robots use AI to navigate, to collaborate, and to help identify targets (to kill, follow, etc.).
An Example: The Uran-9
I hear you say: that’s all very interesting, but what exactly are we talking about?
Well, the Uran-9, as shown above, is an example of an Unmanned Combat Ground Vehicle (UCGV). We are talking about a remotely controlled machine with a 7.62 mm machine gun and four 9M120 Ataka anti-tank missiles. Just to be clear: there is nobody sitting in this machine; it is operated from elsewhere.
Imagine yourself fighting against this machine. Standing in front of it. It has the power to kill you in an instant. You can do your best to destroy it, but the stakes are unequal. You can lose your life; your family can lose a partner, a father or mother, an uncle or aunt, a friend. The enemy can only lose an expensive piece of metal. So I ask you: how is fighting a machine that is trying to kill you different from fighting against humans?
This war machine, the Uran-9, was revealed in 2015 and is developed and produced by the Russian state company Rostec. It is intended for the international market, and there are more models with other specialties. Besides these Unmanned Combat Ground Vehicles, there are the better-known drones. See below for a picture of one.
So it’s clear that this is no futuristic fantasy talk. This is happening right now, and according to reports from the U.S. Department of Defense, AI systems will play an even more important role on future battlefields.
Drone Use in Wars
How have these armed robots been used in wars so far? Unmanned Aerial Vehicles (UAVs, or drones) have played the biggest role to date.
President Obama started his presidency with the goal of becoming the anti-war president. Sadly, his presidency saw wars in Afghanistan, Iraq, Syria, Pakistan, Libya, Yemen, and Somalia. In the latter four countries, almost all military force took the form of drone strikes that killed suspected terrorists.
The U.S. is not alone; around nineteen countries possess armed drones or are in the process of acquiring them. Around eight countries have already used armed drones in conflicts (depending on the exact definition of an armed drone): Israel, the U.K., Iran, the U.S., Turkey, Iraq, Nigeria, and Pakistan.
Market research agency WinterGreen Research sees the worldwide market for military drones (e.g. surveillance and bombing) growing from $4.4 billion in 2015 to $6.8 billion in 2022. If you are interested: China is selling an armed drone, the Wing Loong, for $1 million.
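As a quick back-of-the-envelope check on those figures (assuming simple compound growth over the seven-year span, which the report itself does not state):

```python
# Implied compound annual growth rate (CAGR) of the military-drone market,
# based on the WinterGreen Research figures quoted above.
start, end = 4.4, 6.8    # market size in billions of USD (2015 and 2022)
years = 2022 - 2015      # 7-year span

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # prints "Implied CAGR: 6.4%"
```

In other words, a roughly 55% total increase works out to modest single-digit growth per year, not an explosion.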
Public Opinion on Drone Strikes
What do civilians think about sending robots to wars?
In 2012 a survey showed that 90% of Pakistanis were against drone strikes and that 74% saw the U.S. as the enemy, despite Pakistan receiving the second-most aid from the U.S. A 2014 poll from the Pew Research Center showed immense disapproval of drone strikes around the world: in 39 of the 44 surveyed countries there was strong disapproval of drone strikes. Only in three countries, Kenya, Israel, and the U.S., did a majority of respondents approve of drone strikes.
The thirteen-year-old Pakistani Zubair vividly expressed this fear of drones when he testified that he no longer loves blue skies and prefers grey ones, because the drones do not fly when the sky is grey.
One of the few remaining innocent phenomena on this earth, the blue sky, symbol of a good day, now causes fear among civilians. Luckily, or sadly, even this is no longer entirely accurate; the drones now fly through all kinds of weather.
If the drones create a perception of American arrogance and stimulate deep hatred, drone strikes may even be counterproductive. Audrey Cronin, Professor of International Relations, writes about the limits of drone warfare in the collection Drones and the Future of Armed Conflict: drones cannot help elect a new leader, cannot stop propaganda, and cannot prevent local attacks.
Current Armed Robots vs Terminators
So we know that drones have been in use for quite a few years and that armed robots on the ground are on the rise. What is the difference between these and the Terminators?
The biggest difference between today’s combat drones and the Terminator scenario is that a person, not the drone, still decides to attack**. There are no registered cases of Lethal Autonomous Robots (LARs), or Autonomous Weapons, where the intelligent system selects and engages targets without human intervention.
Stopping an Autonomous Weapons Arms Race
On the 28th of July 2015 the Future of Life Institute published an open letter to AI researchers, asking them to take a stand against an autonomous weapons arms race. In contrast to nuclear weapons, autonomous weapons require no costly or hard-to-obtain raw materials; once such a weapon is invented, it will be relatively cheap to produce and use. To date, the open letter has been signed by 20,806 people, including prestigious researchers from IBM, Microsoft, Berkeley, and Oxford University. Other endorsers include Elon Musk, Stephen Hawking, Steve Wozniak, Noam Chomsky, and more. The letter concludes:
‘In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.’
Pros and Cons of LARs
Professor Ronald Craig Arkin is a roboticist and roboethicist at the Georgia Institute of Technology. In his paper ‘Ethical Robots in Warfare’ Professor Arkin names the pros and cons of Lethal Autonomous Robots. This is a selection:
Pros:
1) LARs can sacrifice themselves, for example when the identification of a target is uncertain.
2) LARs can be designed without emotions: no fear or anger on the battlefield. This could lead to less criminal behaviour.
3) LARs can process more information than humans, and faster. In split-second decisions this can make the difference between shooting and not shooting.
4) When working in combined teams of LARs and humans, the LARs can give objective reports to superiors. This can lead to more ethical behaviour.
Cons:
1) Who bears responsibility when something goes wrong?
2) The threshold of entry into warfare may be lowered as we will now be risking machines and fewer human soldiers.
3) LARs could be hacked or fall into the wrong hands.
4) A robot could refuse an order, which raises the question of whether ultimate authority should rest with humans.
What do you think? Do the pros outweigh the cons?
As with a lot of technologies these days, I think it’s safe to say: ‘You ain’t seen nothing yet.’ We will see many developments in warfare, and these developments will raise questions and shift balances. For myself, I can say that the open letter against autonomous weapons convinced me that we should never start building them. But on the other hand, two questions that Professor Arkin asks still resonate in my head:
“Should soldiers be robots? – Isn’t that largely what they are trained to be?”
“Should robots be soldiers? Should they be more humane than humans?”
* & ** While the killing process is alienated, because the person who decides to kill is thousands of miles away, the actions still have an impact on the soldier. A research report from the United States Air Force among 1,084 drone operators states that “a total of 4.3% endorsed a pattern of symptoms of moderate to the extreme level of severity meeting criteria outlined in the Diagnostic and Statistical Manual of Mental Disorders-4th edition. The incidence of Posttraumatic Stress Disorder (PTSD) among USAF drone operators in this study is lower than rates of PTSD (10–18%) among military personnel returning from deployment”. Watch this Predator drone missile strike to get a glimpse of the experience.
Pic 1 : Unmanned Warfare: Uran-9; Unmanned Combat Ground Vehicle (UCGV). Photo Source: https://www.youtube.com/watch?v=VBC9BM4-3Ek
Pic 2: Unmanned Warfare: Wing Loong Drone. Photo Source: www.drone-report.com/News/2014/Apr/26/images/img0125.jpg
About Thijs Pepping
Thijs Pepping is a humanistic trend analyst in the field of new technologies. He is part of the SogetiLabs think tank, and in his work he continuously questions and analyses the impact of new technologies on our lives, organizations, and society. He specialized in Humanistic Counselling and Education at the University of Humanistic Studies in Utrecht and worked for five years with autistic children. His background in psychology and philosophy drives him to find meaningful answers to business-related questions and to provoke whenever necessary. He is co-author of multiple publications on the impact of new technologies, such as ‘The Frankenstein Factor’, ‘AI First – Learning from the Machine’, and ‘The Pursuit of Digital Happiness’ series. See labs.sogeti.com/research for his previous and current work. VINT provides practical insight into the likely impact and innovative applications of new technologies for organizations worldwide. This intelligence helps public- and private-sector enterprises to anticipate and plan for the complex dynamics of the future.
More on Thijs Pepping.