Today, I am going to break from my story line, and write about another topic that has been on my mind and heart: the ethical and legal responsibilities for engineers who design Lethal Autonomous Weapons Systems (LAWS).
A few months ago, I was with a group of Quakers discussing LAWS policies, which is a Quakerly thing to do. You see, Quakers do not have a creed, but if they did, it would likely be that there is God in every person. We regularly discern what this means for us and how to live our lives accordingly. Quakers are well known for being the first religious organization to condemn slavery. We have fought for women’s rights and for the right to conscientious objector status in wartime. We see our advocacy and peace-building work as directly intertwined with our spirituality and ministry in the world. There is even a Quaker lobbying group in Washington, DC, the Friends Committee on National Legislation (FCNL), on whose governing body I serve (along with more than 200 others). If you are interested in learning more about the policies Quakers support, I recommend that you read FCNL’s policy statement: The World We Seek.
So the other day, a group of Friends (a.k.a. Quakers) were talking about autonomous weapons systems and associated policies. LAWS are robots that can identify a target and destroy it without any human intervention. Humans have no control over these robots once they are active. Such robots have not yet been created; today’s drones either have humans controlling them, or humans retain the ability to override them. The key distinction is that LAWS do not need humans at any point in the process, and the United States is on a path to develop them.
Human Rights Watch and Harvard Law School’s International Human Rights Clinic (IHRC) argue for banning LAWS technology because such “weapons would not only be unable to meet legal standards but would also undermine essential non-legal safeguards for civilians.” Their research suggests that the technology could not meet the ‘Laws of War’ established at the Geneva Conventions. These laws protect civilians from death and terror. They claim that these weapons would be unable to detect whether a person is a civilian and unable to act on emotion and compassion in a situation, and that this violates the Laws of War.
The Heritage Foundation, which argues in favor of developing LAWS, states that the United States needs to pursue the technology to maintain its military advantage. The Foundation claims the technology will either be used in combat zones with no civilians or advance to a point where civilians’ human rights would not be violated; essentially, the weapons would not kill or terrorize civilians.
Essentially, the main arguments surrounding the use of LAWS come down to whether the technology can be designed to meet the legal standards required by international human rights law — in particular, whether the weapons will be able to determine if a target is a civilian. This is a question that needs to be discerned by engineers, who must also contemplate whether they are responsible for the civilian deaths that do occur. This leads me to a question: if technology results in human rights violations, should the engineers, designers, and managers be held responsible for war crimes?
This may seem like a strange tangent, but I see this question as connected to the recent VW emissions scandal. Last month it came to light that VW developed and implemented software that cheated emissions tests, resulting in higher emissions than the published technical specifications. The software in VW cars produced lower emissions during testing than in normal use. The United States Department of Justice is pursuing legal recourse, and the manufacturer will undoubtedly be required to pay fines and fix the vehicles.
In the VW scandal, one thing is clear: engineers did something wrong. This is already being declared an engineering ethics case, pinning the problem both on the role of the individual engineer and on the culture of engineering. Brian Benchoff, a writer for Hackaday, points to the individual engineer’s responsibility, stating, “[s]omeone with the authority to say ‘no’ didn’t, and this code was installed in the electronic control unit of millions of cars. This is the teachable moment of this entire ordeal; at some point, someone who should have known better. At least one engineer will lose their job over this, and certainly more than one executive will be hung out to dry” [sic]. Benchoff sees the need for the individual to step up and do what is “right.” Shannon Vallor told IEEE Spectrum that it was the culture of engineering ethics that produced the VW case. Engineers tend to see ethics as a set of “externally enforced rules” that need to be “checked off.” Ultimately, this norm “implies that as long as you don’t get caught violating rules, there’s no harm.” Vallor suggests restructuring engineering ethics education at the university level. Paul Kedrosky, writing in the New Yorker, points to the nature of engineering organizations, suggesting that the engineers made small tweaks over time that accumulated into significant changes in emissions outputs.
I believe the software engineers who were creating the code most likely knew what was happening. The managers signed off on the software design, and the design came to production. I’ve thought about what may have been going through the minds of these engineers. The desire to solve a problem can be so enticing that it can be deeply fulfilling to create a solution in the presence of design constraints. In Germany, engineers often optimize for quality, even at the expense of efficiency and cost. They may have been thinking about how to create a quality car that passed emissions testing—and that is what they did. But in doing so, they broke the law, and they eventually got caught. They may not have even known or understood the law they were breaking during the design process.
VW will no doubt become a pivotal engineering ethics case from here on out. There is a certain simplicity that engineers tend to love: a clear legal violation, and blame that seems obvious from afar. Undergraduate students will read a one- or two-page synopsis outlining the context and write papers about the ethical dilemmas and the importance of following environmental regulations. This can be packaged as a supplemental assignment for a mechanical engineering course, bringing in a bit of social responsibility. That is a good start, but in engineering education, this is too often where it ends.
My Answer to the question – If technology results in human rights violations, should the engineers, designers, and managers be held responsible for war crimes?
In the VW case, it is clear that the engineers involved in designing the emissions system are to blame. Yet, at least for me, the answer is more uncertain when it comes to autonomous weapons, and I do not know exactly why.
The software in the VW case was ultimately designed to cheat the law. This may have been the result of a few “rogue” engineers, a culture of breaking rules as long as you don’t get caught, or a series of small errors. It was the engineers’ responsibility to make sure that the system could meet the legal design requirements, and if it could not, the cars should not have been produced or gone to market. The fact that they did demonstrates that the engineers were in direct violation and should be held legally responsible. By the same logic, if engineers are unable to design autonomous weapons that avoid harming or terrorizing civilians, the technology should be deemed illegal. If the technology does “go to market,” the engineers should then be legally responsible for the outcomes.
I must mention that this argument upsets me. It abstracts away the people being killed, the lives being terrorized, and the sleepless nights endured. The people on the other side of the technology are real people. They have loved ones, children, and mothers. When you are sitting in a cubicle somewhere, working out very impressive code to produce one of these weapons, it is easy to forget the weapon’s impacts. We as humans separate ourselves from war. It used to be that soldiers were on the front lines pulling the trigger, but as technology moves forward, it is the engineer who is.
In raising the question of whether engineers are responsible for crimes enabled by their work, I cannot imagine a world where the answer is “yes, definitely.” There can be extreme situations in which engineers would be blamed for something completely out of their control. Consider how the legal system might hold engineers criminally responsible for a design. Could the designer of a chair used in a crime be accused of negligent design because the material was rigid enough to cause blunt force trauma? Yet I believe there is a case for examining the materials selected, to ensure that a chair is not off-gassing contaminants or producing negative health impacts. If a chair, doing exactly what it is intended to do, causes harm to a person or the environment, then there is a problem, and the engineer should hold some responsibility.
I also worry that the answer will be a definite “no”: that engineers should not be responsible for the end use of the technology they create. This is similar to the “guns don’t kill people, people kill people” argument, and I do not buy it. I think all designers and engineers need to contemplate and struggle with their impact on society and take responsibility for its outcomes. If a gun killed on its own, as it was designed to, without a human pulling the trigger, then the designer of the gun would hold some responsibility.
The image is the work of an employee of the Defense Advanced Research Projects Agency (DARPA), an agency of the United States Department of Defense, taken or made as part of that person’s official duties. As a work of the U.S. federal government, the image is in the public domain.