Abstract
Least absolute deviation (LAD) is a robust estimator for regression problems in which the errors follow an asymmetric heavy-tailed distribution or contain outliers. To remain insensitive to such errors while selecting the truly important variables from a large number of predictors in linear regression, this paper introduces a two-stage variable selection method, the relaxed LAD lasso, which combines least absolute deviation with the relaxed lasso to obtain robust sparse solutions in the presence of outliers or heavy-tailed errors. Compared with the lasso, this method is not only immune to rapid growth in the number of noise variables but also maintains a better convergence rate of $O_p(n^{-1/2})$. In addition, we prove that the relaxed LAD lasso estimator is consistent in large samples; that is, it selects the true set of important variables with probability converging to one. Simulation and empirical results further verify the strong performance of the relaxed LAD lasso in terms of prediction accuracy and correct selection of informative variables under heavy-tailed distributions.
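To make the two-stage idea concrete, the sketch below illustrates one plausible reading of the procedure, not the authors' implementation: an L1-penalized median regression (LAD lasso) is used for variable selection, followed by a "relaxed" refit on the selected support with a weaker penalty to reduce shrinkage bias. It relies on scikit-learn's QuantileRegressor (quantile 0.5 gives the least absolute deviation loss); the penalty values lam_select and lam_relax are illustrative placeholders that would be tuned in practice.

```python
# Minimal sketch of a two-stage "relaxed LAD lasso"-style procedure.
# Assumptions: stage 1 = L1-penalized median regression for selection,
# stage 2 = lightly penalized (or unpenalized) LAD refit on the selected set.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:4] = [3.0, -2.0, 1.5, 2.5]             # a few truly informative variables
y = X @ beta + rng.standard_t(df=2, size=n)   # heavy-tailed (Student-t) errors

# Stage 1: LAD lasso -- L1-penalized median regression, used only for selection.
lam_select = 0.05                             # hypothetical selection penalty
stage1 = QuantileRegressor(quantile=0.5, alpha=lam_select, solver="highs").fit(X, y)
support = np.flatnonzero(np.abs(stage1.coef_) > 1e-8)

# Stage 2: "relaxed" refit on the selected variables with a weaker penalty,
# reducing the shrinkage bias of the first-stage coefficient estimates.
lam_relax = 0.0                               # 0 gives a plain LAD refit
stage2 = QuantileRegressor(quantile=0.5, alpha=lam_relax, solver="highs").fit(X[:, support], y)

coef = np.zeros(p)
coef[support] = stage2.coef_
print("selected variables:", support)
print("refitted coefficients:", np.round(coef[support], 2))
```

In this reading, the relaxation step plays the same role as in the relaxed lasso: the first stage controls which variables enter, while the second stage estimates their coefficients with little or no penalty, so strong signals are not over-shrunk.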
Subject
Physics and Astronomy (miscellaneous), General Mathematics, Chemistry (miscellaneous), Computer Science (miscellaneous)