Affiliation:
1. Faculty of Mathematics and Statistics, Hubei Key Laboratory of Applied Mathematics, Hubei University, Wuhan 430062, P. R. China
Abstract
In recent years, many works in learning theory have stepped beyond the classical assumption that samples are independent and identically distributed (i.i.d.) and have investigated learning performance based on non-independent samples, such as mixing sequences (e.g., [Formula: see text]-mixing, [Formula: see text]-mixing, [Formula: see text]-mixing, etc.), deriving results similar to those obtained under the classical sampling assumption. Negatively associated (NA) sequences are a significant class of dependent random variables and play an important role among non-independent sequences; they are widely applied in subjects such as probability theory, statistics, and stochastic processes. It is therefore essential to study the performance of the learning process for dependent samples drawn from an NA process. Since the samples in this learning process are clearly not i.i.d., the results of classical learning theory do not apply directly. In this paper, we study the consistency of least-square regularized regression with NA samples. We establish an error bound for least-square regularized regression with NA samples and prove that its learning rate is [Formula: see text], which tends to [Formula: see text] as [Formula: see text] becomes arbitrarily close to 0, where [Formula: see text] denotes the number of samples. A simulation experiment on the convergence rate for NA samples shows that the least-square regularized regression algorithm is consistent for NA samples. This result generalizes the classical result for i.i.d. samples.
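The abstract's setting can be illustrated with a minimal numerical sketch. It is not the paper's algorithm or experiment; the kernel, hyperparameters, and target function below are all assumptions made for illustration. The one grounded ingredient is the NA sampling scheme: simple random sampling without replacement from a finite population is a classical example of a negatively associated sequence (Joag-Dev and Proschan, 1983). The regressor is least-square regularized regression (kernel ridge regression) solved in closed form, and the test error is observed to shrink as the sample size m grows, consistent with the abstract's consistency claim.

```python
import numpy as np

rng = np.random.default_rng(0)

# NA inputs: drawing without replacement from a finite population yields
# a negatively associated sample (a standard example of NA variables).
population = np.linspace(0.0, 1.0, 2000)

def target(x):
    # Hypothetical regression function, chosen only for this sketch.
    return np.sin(2 * np.pi * x)

def krr_fit_predict(x_train, y_train, x_test, lam=1e-2, gamma=50.0):
    """Least-square regularized regression with a Gaussian kernel,
    solved in closed form: (K + lam * I) alpha = y."""
    K = np.exp(-gamma * (x_train[:, None] - x_train[None, :]) ** 2)
    alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    K_test = np.exp(-gamma * (x_test[:, None] - x_train[None, :]) ** 2)
    return K_test @ alpha

x_test = np.linspace(0.0, 1.0, 200)
errors = []
for m in (50, 200, 800):
    # NA sample of size m, plus i.i.d. observation noise on the labels.
    x_tr = rng.choice(population, size=m, replace=False)
    y_tr = target(x_tr) + 0.1 * rng.standard_normal(m)
    y_hat = krr_fit_predict(x_tr, y_tr, x_test)
    errors.append(np.mean((y_hat - target(x_test)) ** 2))

print(errors)  # test MSE, expected to decrease as m grows
```

Monotone decrease is not guaranteed run-to-run because of the label noise, but over a 16-fold increase in m the test error drops clearly, which is the qualitative behavior the paper's rate bound describes.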
Funder
National Natural Science Foundation of China
Publisher
World Scientific Pub Co Pte Lt
Subject
Applied Mathematics, Information Systems, Signal Processing
Cited by
4 articles.
1. Regression Analysis of Stochastic Fatigue Crack Growth Model in a Martingale Difference Framework;Journal of Statistical Theory and Practice;2020-06-16
2. Randomized approximation numbers on Besov classes with mixed smoothness;International Journal of Wavelets, Multiresolution and Information Processing;2020-03-12
3. Randomized multi-scale kernels learning with sparsity constraint regularization for regression;International Journal of Wavelets, Multiresolution and Information Processing;2019-11
4. Convergence rate of SVM for kernel-based robust regression;International Journal of Wavelets, Multiresolution and Information Processing;2019-01