Authors:
Majid Asadi, Nader Ebrahimi, G. G. Hamedani, Ehsan S. Soofi
Abstract
In this paper, we introduce the minimum dynamic discrimination information (MDDI) approach to probability modeling. The MDDI model relative to a given distribution G is that which has least Kullback-Leibler information discrepancy relative to G, among all distributions satisfying some information constraints given in terms of residual moment inequalities, residual moment growth inequalities, or hazard rate growth inequalities. Our results lead to MDDI characterizations of many well-known lifetime models and to the development of some new models. Dynamic information constraints that characterize these models are tabulated. A result for characterizing distributions based on dynamic Rényi information divergence is also given.
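The abstract's key quantity, the Kullback-Leibler discrepancy relative to a reference distribution G, has a closed form for simple lifetime models. As a minimal illustration (not taken from the paper), for two exponential densities f ~ Exp(λ_f) and g ~ Exp(λ_g) one has KL(f‖g) = ln(λ_f/λ_g) + λ_g/λ_f − 1, which is zero exactly when the two rates coincide; this is why, absent any moment constraints, the MDDI model relative to G is G itself:

```python
import math

def kl_exp(lam_f, lam_g):
    """Kullback-Leibler divergence KL(f || g) between exponential
    densities f ~ Exp(lam_f) and g ~ Exp(lam_g), in closed form:
    KL = ln(lam_f / lam_g) + lam_g / lam_f - 1.
    """
    return math.log(lam_f / lam_g) + lam_g / lam_f - 1.0

# KL is zero iff the two rates coincide and positive otherwise,
# so with no constraints the minimizer relative to g is g itself.
print(kl_exp(2.0, 2.0))      # -> 0.0
print(kl_exp(2.0, 1.0) > 0)  # -> True
```

Under the residual-moment or hazard-rate constraints the paper studies, the minimizer is generally no longer G, and the constrained optimum characterizes the lifetime families tabulated in the paper.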
Publisher
Cambridge University Press (CUP)
Subjects
Statistics, Probability and Uncertainty; General Mathematics; Statistics and Probability
Cited by: 20 articles.