Abstract
Finite-sample bounds on the accuracy of Bhattacharya’s plug-in estimator for Fisher information are derived. These bounds are further improved by introducing a clipping step that allows for better control over the score function. This leads to superior upper bounds on the rates of convergence, albeit under slightly different regularity conditions. The performance bounds on both estimators are evaluated for the practically relevant case of a random variable contaminated by Gaussian noise. Moreover, using Brown’s identity, two corresponding estimators of the minimum mean-square error are proposed.
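The estimator described above can be sketched in code. The following is a minimal illustration, not the paper's actual construction: it uses a Gaussian kernel density estimate and its derivative to form the plug-in score, optionally clips the score as in the abstract's clipping step, and converts the Fisher-information estimate into an MMSE estimate via Brown's identity (for Y = X + Z with Z standard Gaussian, mmse = 1 − J(Y)). The function name, the Silverman bandwidth rule, and the clipping threshold are all illustrative assumptions.

```python
import numpy as np

def fisher_information_plugin(samples, bandwidth=None, clip=None):
    """Kernel plug-in estimate of Fisher information J(Y) = E[(f'(Y)/f(Y))^2].

    `clip` truncates the estimated score to [-clip, clip], mimicking the
    clipping step described in the abstract.  The default bandwidth is
    Silverman's rule of thumb -- an illustrative choice, not the paper's.
    """
    y = np.asarray(samples, dtype=float)
    n = y.size
    h = bandwidth if bandwidth is not None else 1.06 * y.std() * n ** (-0.2)
    diff = (y[:, None] - y[None, :]) / h               # pairwise (y_i - y_j)/h
    k = np.exp(-0.5 * diff ** 2) / np.sqrt(2 * np.pi)  # Gaussian kernel values
    f_hat = k.mean(axis=1) / h                         # density estimate at y_i
    fp_hat = (-diff * k).mean(axis=1) / h ** 2         # derivative estimate at y_i
    score = fp_hat / f_hat                             # plug-in score f'/f
    if clip is not None:
        score = np.clip(score, -clip, clip)
    return float(np.mean(score ** 2))

# Gaussian-contamination example: Y = X + Z with X, Z independent N(0,1),
# so Y ~ N(0,2) and J(Y) = 1/2; Brown's identity gives mmse = 1 - J(Y) = 1/2.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
z = rng.normal(size=2000)
y = x + z
j_hat = fisher_information_plugin(y, clip=10.0)
mmse_hat = 1.0 - j_hat   # MMSE estimate via Brown's identity
```

On the N(0, 2) example the estimate should land near the true values J(Y) = 0.5 and mmse = 0.5, up to kernel-smoothing bias of order h².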
Funder
National Science Foundation
Deutsche Stiftung Friedensforschung
Subject
General Physics and Astronomy
References (24 articles)
1. Bhattacharya. Estimation of a probability density function and its derivatives. Sankhyā: Indian J. Stat. Ser. A, 1967.
2. On the Estimation of Functionals of the Probability Density and Its Derivatives
3. Estimation of a Probability Density Function and Its Derivatives
4. Rüschendorf. Consistency of estimators for multivariate density functions and for the mode. Sankhyā: Indian J. Stat. Ser. A, 1977.
5. Weak and Strong Uniform Consistency of the Kernel Estimate of a Density and its Derivatives
Cited by (1 article)
1. Measuring Information from Moments. IEEE Transactions on Information Theory, 2022.