Affiliation:
1. Center for Mind/Brain Sciences, University of Trento
Abstract
Corpus-based distributional semantic models capture degrees of semantic relatedness among the words of very large vocabularies, but have problems with logical phenomena such as entailment, which are instead elegantly handled by model-theoretic approaches, which, in turn, do not scale up. We combine the advantages of the two views by inducing a mapping from distributional vectors of words (or sentences) into a Boolean structure of the kind in which natural language terms are assumed to denote. We evaluate this Boolean Distributional Semantic Model (BDSM) on recognizing entailment between words and sentences. The method achieves results comparable to a state-of-the-art SVM, degrades more gracefully when less training data are available, and displays interesting qualitative properties.
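The abstract describes the core idea: induce a map from continuous distributional vectors into Boolean vectors, then read entailment off the resulting Boolean structure. The sketch below illustrates one way such a pipeline could look, assuming a sigmoid-plus-threshold mapping and bitwise feature inclusion as the entailment test; this is a minimal illustration, not the paper's actual architecture or training objective, and the names (`to_boolean`, `entails`, `W`) are hypothetical.

```python
import numpy as np

def to_boolean(v, W, threshold=0.5):
    """Map a distributional vector v to a Boolean vector via a linear map W
    followed by a sigmoid and thresholding (hypothetical parameterisation;
    the paper's training procedure is not reproduced here)."""
    return 1.0 / (1.0 + np.exp(-(W @ v))) > threshold

def entails(p_vec, q_vec, W):
    """Decide p |= q by Boolean feature inclusion: every active dimension of
    the premise's Boolean vector must also be active in the hypothesis's
    vector (one possible convention for reading entailment off the mapping)."""
    bp, bq = to_boolean(p_vec, W), to_boolean(q_vec, W)
    return bool(np.all(~bp | bq))  # bp <= bq componentwise

# Toy illustration with random vectors and an untrained map;
# in practice W would be induced from entailment-annotated training pairs.
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 50))              # 50-d input -> 10 Boolean dimensions
dog, animal = rng.normal(size=50), rng.normal(size=50)
print(entails(dog, animal, W))
```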
Cited by
9 articles.
1. Exploring Logical Reasoning for Referring Expression Comprehension;Proceedings of the 29th ACM International Conference on Multimedia;2021-10-17
2. Cultural cartography with word embeddings;Poetics;2021-10
3. Ideal Words;KI - Künstliche Intelligenz;2021-05-25
4. The Criteria, Challenges, and the Back-Propagation Method;Studies in Computational Intelligence;2020-08-25
5. Variations on Abstract Semantic Spaces;The Philosophy and Science of Language;2020