Taking Aim At Killer Robots

As tensions rise between the world's superpowers, more and more politicians and ordinary citizens alike are paying attention to military developments and emerging technologies. One such development is the rising trend of fully autonomous weapons.

More commonly referred to as "killer robots," fully autonomous weapons systems (FAWS) are weapons able to select and fire upon targets using artificial intelligence, without human intervention. It is important to differentiate them from systems such as remote-controlled weapons, which are operated by humans from a separate location rather than acting with full autonomy.

Given the rapid evolution of these systems, groups such as the United Nations have begun discussing their international use and possible repercussions, and many countries already disagree on their usage. Political organizations such as the Campaign to Stop Killer Robots have been founded to oppose the implementation of FAWS in militaries around the world.

While FAWS may initially seem like a promising direction for military development in the Information Age, it is important that all countries examine the effects they could have on people's lives and work together to set restrictions on their usage before any harm is done.

Robots have been controversial since they first entered the public eye through science-fiction novels and films. The earliest portrayals were mainly positive, displaying robots as helpful, kind and sometimes submissive pseudo-creatures, like Rosie, the humorous maid of a futuristic family in the TV show "The Jetsons," or "Astroboy," which depicted robots living peacefully with humans. However, as robots became more popular in the media, films began to portray them as potentially evil beings, such as the droids in "Star Wars Episode V: The Empire Strikes Back" and Locutus of Borg from "Star Trek: The Next Generation."

Now, robots are common in film and far less stigmatized than when they were first introduced, but with the actual development of technologies that were once pure fantasy, they have become controversial yet again.

The implementation of robots in daily life has come to feel like a virtual-reality video game that never ends. With the pressure to take a stance rising so quickly, many countries have made official statements and passed legislation regarding the use of FAWS. In a statement to the United Nations Human Rights Council on May 30, 2013, France took a negative stance:

“France would like to state that it does not possess and does not intend to acquire robotized weapons systems with the capacity to fire independently. Our concept is based on the full responsibility of military and political leaders in the decision to use armed force. France believes that the role of human beings in the decision to open fire must be preserved.”

Some countries, such as the United States, however, have implemented FAWS and created regulations on their usage. U.S. Department of Defense Directive 3000.09 states, "Autonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force," with specific guidelines that require FAWS to "function as anticipated in realistic operational environments against adaptive adversaries" and "(be) sufficiently robust to minimize failures that could lead to unintended engagements or to loss of control of the system to unauthorized parties."

Defense Department spokesman Roger M. Cabiness II stated that only commanders may deploy "precision-guided weapon systems with homing functions" in order to "reduce the risk of civilian casualties."

There are two main points of controversy surrounding the implementation of FAWS in military conflict: legality and morality.

In a statement at the United Nations' Convention on Certain Conventional Weapons, U.N. Secretary-General Ban Ki-moon said, "Is it morally acceptable to delegate decisions about the use of lethal force to such systems? If their use results in a war crime or serious human-rights violation, who would be legally responsible? If responsibility cannot be determined as required by international law, is it legal or ethical to deploy such systems?"

As FAWS are being produced and tested internationally, it is important that we as an international community establish laws and regulations that apply equally to every country before problems arise.

Because these weapons are still experimental, their early use must be overseen in controlled settings so that detailed accounts of how they behave can be documented and shared.

For most citizens, the real problem is the morality of using these weapons, because the lives of people, whether enemy or friend, are placed in the hands of robots' artificial intelligence rather than the adaptable brains and consciences of humans. It is easy to get behind using FAWS against enemy forces because we typically form no emotional attachment to the soldiers our troops fight, but what if it were the enemy force using FAWS on your family or friends?

This is an extreme application of the golden rule: "Do unto others as you would have them do unto you." If we as a people are not OK with having these weapons used on ourselves, then how can we morally support using them on another people? Still, thinking selfishly here isn't necessarily wrong.

The use of FAWS does save lives on the deploying side, as each machine acts as a lifeless sacrifice, allowing troops to be deployed elsewhere or, in the future, possibly not at all. But while these weapons would spare those who would normally be sent into battle to do the jobs taken over by FAWS, they would likely take just as many lives from the opposing military because of their prolonged endurance.

Fully autonomous weapons are today's "easy way out" for military weaponry, and as they rapidly develop in the coming years, countries everywhere must be made aware of the risks they take in using them.