Affiliation:
1. Univ. of California, Irvine, CA
Abstract
In the United States, artificial intelligence (AI) research is mainly a story about military support for the development of promising technologies. Since the late 1950s and early 1960s, AI research has received most of its support from the military research establishment [37, 55].
Not until the 1980s, however, did the military connect this research to specific objectives and products. In 1983, the $600-million Strategic Computing Program (SCP) created three applications for "'pulling' the technology-generation process by creating carefully selected technology interactions with challenging military applications" [16]. These applications, an autonomous land vehicle, a pilot's associate, and a battle management system, explicitly connect the three armed services to further AI developments [29, 51, 53]. The Defense Science Board Task Force on the "Military Applications of New-Generation Computer Technologies" recommended warfare simulation, electronic warfare, ballistic missile defense, and logistics management as also promising a high military payoff [18].
In his 1983 "Star Wars" speech, President Reagan enjoined "the scientific community, . . . those who gave us nuclear weapons, . . . to give us the means of rendering these nuclear weapons impotent and obsolete" [43]. As in the Manhattan and hydrogen bomb projects, AI researchers, and computer scientists more generally, are expected to play major parts in this quest for a defensive shield against ballistic missiles. Computing specialists such as John von Neumann played a supportive role by setting up the computations necessary for these engineering feats—with human "computers" for the atom bomb [10] and with ENIAC and other early computers for the hydrogen bomb [9]. The "Star Wars" project challenges computer scientists to design an intelligent system that finds and destroys targets, essentially in real time and without human intervention.
The interdependence of the military and computer science rarely surfaces during our education as computer practitioners, researchers, and teachers. Where might information concerning these important military applications enter into computer science and AI education? Where do students receive information concerning the important role they may play in weapon systems development? One of our students recently remarked that "as a computer science major, I did not realize the magnitude of the ramifications of advancing technology for the military . . . . In a field so dominated by the DoD, I will have to think seriously about what I am willing and not willing to do—and what lies in between those two poles."
As researchers and educators, the authors wish to encourage colleagues and students to reflect upon present and historical interactions between computer science as an academic discipline and profession, and military projects and funding. As computer professionals, we lay claim to specialized knowledge and employ that knowledge in society as developers of computing technologies. Thus, we exercise power. Recognizing that as professionals we wield power, we must also recognize that we have responsibilities to society. To act responsibly does not mean that computer professionals should advocate a complete separation between computer science and military missions. However, we should openly examine the interrelationships between the military and the discipline and practice of computing. To act responsibly does not mean that computer scientists and practitioners should eschew support or employment from the military, although some are justified in taking such a stance.
To act responsibly requires attention to the social and political context in which one is embedded; it requires reflection upon individual and professional practice; it requires open debate. The lack of attention to issues of responsibility in the typical computer science curriculum strikes us as a grave professional omission. With this article, we hope to add material to the dialogue on appropriate computing applications and their limits. We also hope to provoke reflection on computing fundamentals and practice at the individual, professional, and disciplinary levels, as well as to prod government institutions, professional societies, and industry to support in-depth research on the issues we raise here.
Reflection requires information and discussion. Academic computer science departments rarely support serious consideration of even general issues under the rubric of the social and ethical implications of computing. Unlike any other U.S. computer science department, Information and Computer Science (ICS) at UC Irvine has an active research program in the social implications of computing (Computers, Organizations, Policy and Society—CORPS). Even within CORPS, research that addresses the interactions between the military and computer science is difficult to pursue—not because individuals aren't interested, but because they cannot find professional or academic support. The authors' interests in these issues arose from personal concerns over the dependence of military systems on complex technology and the possible grave outcomes of this fragile relationship. CORPS provided a supportive intellectual environment that allowed us to pursue these interests. In 1987, we developed and taught an undergraduate course designed to inform students about military applications and their limits and to allow dialogue on professional responsibilities. In general, little monetary support is available for research on these issues; only through support from the Institute on Global Conflict and Cooperation and campus instructional funds were we able to develop and teach the course.
Few researchers or educators can devote time or energy to the social and ethical implications of their work and profession in addition to their "mainstream" research. Because the discipline of computer science does not consider such reflection serious "mainstream" research, those who choose to pursue these vital questions have difficulty finding employment and advancing through the academic ranks. Growing concern over these issues and interest among computer scientists, as evidenced by the group Computer Professionals for Social Responsibility [38], by individuals such as David Parnas [39], and by this article, may lead to future research support and academic recognition.
For now, as concerned professionals, we offer the following reviews. They pose many more questions than answers. This article exemplifies the interdisciplinary investigations that are required as precursors to serious analysis of computing use in these applications. We hope that our reviews generate discussion and debate. In the first section, we present the course rationale and content, as well as student responses. In the sections following the course description, we consider three applications—smart weapons, battle management, and war game simulations—that are generating research and development funds and that have controversial implications for military uses of computing. We start with smart weapons, that is, weapons that can destroy targets with minimal human intervention. Next we look at battle management systems designed to coordinate and assess the use of resources and people in warfare. Finally, we turn to war gaming as a means of evaluating weapon performance and strategies for war fighting. In each case, we describe the state of the technology, its current and potential uses, and its implications for the conduct of war.
Publisher
Association for Computing Machinery (ACM)
References (69)
1. Advanced Military Computing 1, 5 (1985), 1, 2, 5-7.
2. Advanced Military Computing 1, 6 (1985), 4.
3. Advanced Military Computing 2, 9 (1986), 5.
4. Advanced Military Computing 2, 10 (1986), 2.
5. Advanced Military Computing 2, 17 (1986), 3-4.