by Gary E. Marchant, Braden Allenby, Ronald Arkin, Edward T. Barrett, Jason Borenstein, Lyn M. Gaudet, Orde Kittrie, Patrick Lin, George R. Lucas, Richard O’Meara, and Jared Silberman
12 Colum. Sci. & Tech. L. Rev. 272 (2011) (Published June 2, 2011)
New technologies have always been a critical component of military strategy and preparedness. One new technology on the not-too-distant technological horizon is lethal autonomous robotics, which would consist of robotic weapons capable of exerting lethal force without human control or intervention. There are a number of operational and tactical factors that create incentives for the development of such lethal systems as the next step in the current development, deployment and use of autonomous systems in military forces. Yet, such robotic systems would raise a number of potential operational, policy, ethical and legal issues. This article summarizes the current status and incentives for the development of lethal autonomous robots, discusses some of the issues that would be raised by such systems, and calls for a national and international dialogue on appropriate governance of such systems before they are deployed. The article reviews potential modes of governance, ranging from ethical principles implemented through modifications or refinements of national policies, to changes in the law of war and rules of engagement, to international treaties or agreements, or to a variety of other “soft law” governance mechanisms.
About the Authors
The authors are members of the Autonomous Robotics thrust group of the Consortium on Emerging Technologies, Military Operations, and National Security (CETMONS), a multi-institutional organization dedicated to providing the basis for the ethical, rational, and responsible understanding and management of the complex set of issues raised by emerging technologies and their use in military operations, as well as their broader implications for national security. Gary Marchant is the Lincoln Professor of Emerging Technologies, Law & Ethics and Executive Director of the Center for Law, Science & Innovation at the Sandra Day O’Connor College of Law, Arizona State University. Braden Allenby is the Lincoln Professor of Engineering and Ethics at Arizona State University and the Founding Chair of CETMONS. Ronald Arkin is the Regents’ Professor, College of Computing, the Director of the College of Computing Mobile Robot Laboratory, and the Associate Dean for Research at Georgia Institute of Technology. Edward T. Barrett is the Director of Research at the U.S. Naval Academy’s Stockdale Center for Ethical Leadership and an ethics professor in the Department of Leadership, Ethics, and Law. Jason Borenstein is the Director of Graduate Research Ethics Programs and co-Director of the Center for Ethics and Technology at Georgia Institute of Technology. Lyn M. Gaudet is the Research Director of the Center for Law, Science & Innovation at the Sandra Day O’Connor College of Law, Arizona State University. Orde Kittrie is the Director of the Washington, D.C. Program for the Sandra Day O’Connor College of Law, Arizona State University. Patrick Lin is an Assistant Professor, Department of Philosophy, California Polytechnic State University. George R. Lucas is the Class of 1984 Distinguished Chair in Ethics in the Stockdale Center for Ethical Leadership at the United States Naval Academy and a Professor of Ethics and Public Policy at the Graduate School of Public Policy at the Naval Postgraduate School. Richard M. O’Meara is a retired Brigadier General, USA, a former Fellow of the Stockdale Center for Ethical Leadership, USNA, and Adjunct Faculty in Global Affairs, Rutgers University. Jared Silberman is Associate Counsel for Arms Control and International Law at the United States Navy Office of Strategic Systems Programs (SSP) in Washington, DC.
For proper legal citation of this document, please cite to the following URL: http://www.stlr.org/cite.cgi?volume=12&article=7.