Affiliation:
1. Delft University of Technology
Abstract
"... our AI systems must do what we want them to do."
This quote comes from the open letter Research Priorities for Robust and Beneficial Artificial Intelligence (AI) (Future of Life Institute, 2016), signed by over 8,600 people including Elon Musk and Stephen Hawking. The open letter received considerable media attention, with headlines such as "Musk, Wozniak and Hawking urge ban on warfare AI and autonomous weapons" (Gibbs, 2015), and it fueled the debate on this topic. Although this type of 'War of the Worlds' news coverage might seem exaggerated at first glance, the underlying question of how we can ensure that our Autonomous Weapons remain under our control is, in my opinion, one of the most pressing issues for AI technology at this moment in time.
To remain in control of our Autonomous Weapons and of AI in general, meaning that their actions are intentional and according to our plans (Cushman, 2015), we should design them in a responsible manner, and to do so I believe we must find a way to incorporate our moral and ethical values into their design. The ART principle, an acronym for Accountability, Responsibility and Transparency, can support a responsible design of AI, and the Value-Sensitive Design (VSD) approach can be used to cover the ART principle. In this essay, I show how Autonomous Weapons can be designed responsibly by applying the VSD approach, an iterative process that considers human values throughout the design of a technology (Davis & Nathan, 2015; Friedman & Kahn Jr, 2003).
Publisher: Association for Computing Machinery (ACM)
Cited by: 3 articles.