Abstract
We derive non-asymptotic concentration inequalities for the uniform deviation between a multivariate density function and its non-parametric kernel density estimator in a stationary and uniformly mixing time series framework. We derive analogous inequalities for their (first) Wasserstein distance, as well as for the deviations between integrals of bounded functions taken with respect to them. These inequalities can be used for the construction of confidence regions, the estimation of finite-sample probabilities of decision errors, and related tasks. We apply the concentration results to derive statistical guarantees and oracle inequalities in regularized prediction problems with Lipschitz and strongly convex costs.
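To make the central object of the abstract concrete, the following is a minimal, hypothetical sketch of a multivariate kernel density estimator and its empirical uniform (sup-norm) deviation from a known density on a grid. It uses an i.i.d. Gaussian sample as a stand-in for the stationary mixing series studied in the paper, and a product Gaussian kernel; all names and parameter choices here are illustrative, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(u):
    # Standard multivariate Gaussian kernel evaluated at the points in u,
    # where the last axis indexes the d coordinates.
    d = u.shape[-1]
    return np.exp(-0.5 * np.sum(u**2, axis=-1)) / (2 * np.pi) ** (d / 2)

def kde(x, sample, h):
    # Kernel density estimate f_n(x) = (1 / (n h^d)) * sum_i K((x - X_i) / h)
    n, d = sample.shape
    u = (x[:, None, :] - sample[None, :, :]) / h  # shape (grid, n, d)
    return gaussian_kernel(u).sum(axis=1) / (n * h**d)

rng = np.random.default_rng(0)
sample = rng.standard_normal((500, 2))  # i.i.d. stand-in for the stationary series

# Evaluation grid on [-3, 3]^2
axis = np.linspace(-3, 3, 41)
grid = np.stack(np.meshgrid(axis, axis), axis=-1).reshape(-1, 2)

f_hat = kde(grid, sample, h=0.5)
f_true = gaussian_kernel(grid)  # true standard bivariate normal density

# Empirical analogue of the uniform deviation ||f_n - f||_inf on the grid
sup_dev = np.max(np.abs(f_hat - f_true))
```

The paper's concentration inequalities bound the probability that a deviation of this kind exceeds a given threshold, non-asymptotically and under uniform mixing rather than independence.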
Funder
Athens University of Economics & Business
Publisher
Springer Science and Business Media LLC
Subject
Statistics and Probability
Cited by
1 article.