Abstract
Recent approaches to providing advisory knowledge-based systems with explanation capabilities are reviewed. The importance of explaining a system's behaviour and conclusions was recognized early in the development of expert systems. Initial approaches were based on the presentation of an edited proof trace to the user, but while helpful for debugging knowledge bases, these explanations are of limited value to most users. Current work aims to expand the kinds of explanation which can be offered and to embed explanations into a dialogue so that the topic of the explanation can be negotiated between the user and the system. This raises issues of mutual knowledge and dialogue control which are discussed in the review.
Publisher
Cambridge University Press (CUP)
Subject
Artificial Intelligence, Software
Cited by
18 articles
1. Safety and Military Artificial Intelligence;Cognitive Systems Monographs;2021
2. Explanation;Cognitive Systems Monographs;2021
3. Explainability in human–agent systems;Autonomous Agents and Multi-Agent Systems;2019-05-13
4. Evaluation of neural network variable influence measures for process control;Engineering Applications of Artificial Intelligence;2011-08
5. Do You Get It? User-Evaluated Explainable BDI Agents;Multiagent System Technologies;2010