Abstract
The Cramér–Rao inequality for the variance of an unbiased estimator is first recalled, and the value of the 'ideal' estimation equation that exists when the minimum bound is attained is illustrated by examples. When this estimation equation is not available, the more general inequality due to Kiefer is more relevant. The analogous estimation equation corresponding to attainment of the more general minimum bound is again illustrated by examples; and the general theory, which extends to more than one parameter, is indicated for any location and/or scaling problem.
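Not part of the original abstract, but a minimal numerical sketch of the classical result it recalls: for i.i.d. samples from N(mu, sigma^2) with sigma known, the Fisher information per sample is 1/sigma^2, so the Cramér–Rao bound for n samples is sigma^2/n, and the sample mean (an unbiased estimator) attains it. The parameter values below are illustrative assumptions, not drawn from the paper.

```python
import numpy as np

# Illustrative setup (assumed values, not from the paper):
# estimate the mean mu of N(mu, sigma^2) with sigma known.
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 3.0, 50, 20000

# Cramér–Rao lower bound on the variance of any unbiased estimator of mu:
# I(mu) = n / sigma^2  =>  bound = sigma^2 / n
cr_bound = sigma**2 / n

# Monte Carlo check: the sample mean is unbiased and attains the bound.
estimates = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
empirical_var = estimates.var()

print(cr_bound)        # exact bound: 0.18
print(empirical_var)   # should be close to the bound
```

The near-equality of the empirical variance and the bound is what the abstract's 'ideal' estimation equation formalizes: the bound is attained exactly when the score is proportional to the estimator's error, which holds for the normal mean.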
Publisher
Cambridge University Press (CUP)
Subject
Statistics, Probability and Uncertainty; General Mathematics; Statistics and Probability
Cited by
3 articles.
1. Bibliography. In: Hilbert Space Methods in Probability and Statistical Inference (2011-09-13).
2. An information theoretic argument for the validity of the exponential model. Metrika (1994-12).
3. Examples of minimum variance estimation. Australian Journal of Statistics (1983-02).