Abstract
According to the objective Bayesian approach to inductive logic, premisses inductively entail a conclusion just when every probability function with maximal entropy, from all those that satisfy the premisses, satisfies the conclusion. When premisses and conclusion are constraints on probabilities of sentences of a first-order predicate language, however, it is by no means obvious how to determine these maximal entropy functions. This paper makes progress on the problem in the following ways. Firstly, we introduce the concept of a limit in entropy and show that, if the set of probability functions satisfying the premisses contains a limit in entropy, then this limit point is unique and is the maximal entropy probability function. Next, we turn to the special case in which the premisses are categorical sentences of the logical language. We show that if the uniform probability function gives the premisses positive probability, then the maximal entropy function can be found by simply conditionalising this uniform prior on the premisses. We generalise our results to demonstrate agreement between the maximal entropy approach and Jeffrey conditionalisation in the case in which there is a single premiss that specifies the probability of a sentence of the language. We show that, after learning such a premiss, certain inferences are preserved, namely inferences to inductive tautologies. Finally, we consider potential pathologies of the approach: we explore the extent to which the maximal entropy approach is invariant under permutations of the constants of the language, and we discuss some cases in which there is no maximal entropy probability function.
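The agreement between maximal entropy inference and Jeffrey conditionalisation described in the abstract can be illustrated in a toy finite case. The sketch below (an illustrative assumption, not taken from the paper: one unary predicate U and two constants a, b, giving four state descriptions, with the single premiss P(Ua) = 0.8) Jeffrey-conditionalises the uniform prior on Ua and then confirms by grid search that no other distribution satisfying the premiss has higher entropy.

```python
from itertools import product
from math import log

# Toy language: one unary predicate U, two constants a, b.
# State descriptions are truth assignments to (Ua, Ub): four states.
states = list(product([True, False], repeat=2))
uniform = {s: 0.25 for s in states}

# Single premiss: P(Ua) = 0.8 (Ua holds in states whose first entry is True).
q = 0.8
prior_Ua = sum(p for s, p in uniform.items() if s[0])  # = 0.5 under uniform

# Jeffrey conditionalisation of the uniform prior on the partition {Ua, not-Ua}:
# scale mass inside each cell so the cell probabilities become q and 1 - q.
jeffrey = {s: p * (q / prior_Ua if s[0] else (1 - q) / (1 - prior_Ua))
           for s, p in uniform.items()}

def entropy(dist):
    return -sum(p * log(p) for p in dist.values() if p > 0)

# Grid search over all distributions satisfying P(Ua) = 0.8: the free
# parameters are how mass q splits between the two Ua-states and how
# mass 1 - q splits between the two not-Ua states.
best, best_H = None, -1.0
n = 200
for i in range(1, n):
    for j in range(1, n):
        x, y = q * i / n, (1 - q) * j / n
        dist = dict(zip(states, [x, q - x, y, (1 - q) - y]))
        H = entropy(dist)
        if H > best_H:
            best, best_H = dist, H

# The entropy maximiser splits each cell's mass evenly, i.e. it coincides
# with the Jeffrey update of the uniform prior.
```

Here the maximiser assigns 0.4 to each Ua-state and 0.1 to each not-Ua state, exactly the Jeffrey update, illustrating the agreement result for a single premiss that fixes the probability of one sentence.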
Funder
Deutsche Forschungsgemeinschaft
Leverhulme Trust
Università degli Studi di Milano
Publisher
Springer Science and Business Media LLC
Cited by
5 articles.