The consistency of least-square regularized regression with negative association sequence

Author:

Chen Fen 1, Zou Bin 1, Chen Na 1

Affiliation:

1. Faculty of Mathematics and Statistics, Hubei Key Laboratory of Applied Mathematics, Hubei University, Wuhan 430062, P. R. China

Abstract

In recent years, many works in learning theory have stepped beyond the classical assumption that samples are independent and identically distributed (i.i.d.) and have investigated learning performance with non-independent samples, such as mixing sequences (e.g., [Formula: see text]-mixing, [Formula: see text]-mixing and [Formula: see text]-mixing), deriving results similar to those obtained under the classical sample assumption. A negative association (NA) sequence is a significant class of dependent random variables and plays an important role among non-independent sequences; it is widely applied in subjects such as probability theory, statistics and stochastic processes. It is therefore essential to study the learning performance of the learning process for dependent samples drawn from an NA process. Since the samples in this learning process are not independent and identically distributed, the results of classical learning theory cannot be applied directly. In this paper, we study the consistency of least-square regularized regression with NA samples. We establish the error bound of least-square regularized regression for NA samples, and prove that the learning rate of least-square regularized regression for NA samples is [Formula: see text], which tends to [Formula: see text] as [Formula: see text] becomes arbitrarily close to 0, where [Formula: see text] denotes the number of samples. A simulation experiment on the convergence rate with NA samples shows that the least-square regularized regression algorithm for NA samples is consistent. This result generalizes the classical result for independent and identically distributed samples.

Funder

National Natural Science Foundation of China

Publisher

World Scientific Pub Co Pte Ltd

Subject

Applied Mathematics, Information Systems, Signal Processing

Cited by 4 articles.

1. Regression Analysis of Stochastic Fatigue Crack Growth Model in a Martingale Difference Framework;Journal of Statistical Theory and Practice;2020-06-16

2. Randomized approximation numbers on Besov classes with mixed smoothness;International Journal of Wavelets, Multiresolution and Information Processing;2020-03-12

3. Randomized multi-scale kernels learning with sparsity constraint regularization for regression;International Journal of Wavelets, Multiresolution and Information Processing;2019-11

4. Convergence rate of SVM for kernel-based robust regression;International Journal of Wavelets, Multiresolution and Information Processing;2019-01
