Author:
Daley, Daryl J.; Vere-Jones, David
Abstract
The entropy score of an observed outcome that has been given a probability forecast p is defined to be –log p. If p is derived from a probability model and there is a background model for which the same outcome has probability π, then the log ratio log(p/π) is the probability gain, and its expected value the information gain, for that outcome. Such concepts are closely related to the likelihood of the model and its entropy rate. The relationships between these concepts are explored in the case that the outcomes in question are the occurrence or nonoccurrence of events in a stochastic point process. It is shown that, in such a context, the mean information gain per unit time, based on forecasts made at arbitrary discrete time intervals, is bounded above by the entropy rate of the point process. Two examples illustrate how the information gain may be related to realizations with a range of values of 'predictability'.
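For a single binary (occurrence/nonoccurrence) forecast, the quantities defined in the abstract reduce to a few lines of arithmetic. The sketch below is only an illustration of those formulas with hypothetical values p = 0.3 and π = 0.1; it is not taken from the paper, and the expectation shown assumes the forecast model is the true one.

```python
import math

def entropy_score(p: float) -> float:
    """Entropy score of an observed outcome that was forecast with probability p: -log p."""
    return -math.log(p)

def probability_gain(p: float, pi: float) -> float:
    """Probability gain of the forecast p over the background-model probability pi: log(p / pi)."""
    return math.log(p / pi)

# Hypothetical example: an event occurs in a given time interval.
# The forecast model assigned it probability p = 0.3; the background model assigned pi = 0.1.
p, pi = 0.3, 0.1
print(f"entropy score    = {entropy_score(p):.4f}")        # -log 0.3  ≈ 1.2040
print(f"probability gain = {probability_gain(p, pi):.4f}")  # log(0.3/0.1) ≈ 1.0986

# Information gain: the expected probability gain over both outcomes (event / no event),
# taking the expectation under the forecast model, i.e. the Kullback-Leibler divergence
# between Bernoulli(p) and Bernoulli(pi).
info_gain = p * math.log(p / pi) + (1 - p) * math.log((1 - p) / (1 - pi))
print(f"information gain = {info_gain:.4f}")
```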
Publisher
Cambridge University Press (CUP)
Subject
Statistics, Probability and Uncertainty; General Mathematics; Statistics and Probability
Cited by
4 articles.