Abstract
This study proposes sparse estimation methods for generalized linear models that run either least angle regression (LARS) or the least absolute shrinkage and selection operator (LASSO) in the tangent space of the manifold of the statistical model. The statistical model is first approximated, and exact calculations are then carried out. LARS was proposed as an efficient algorithm for parameter estimation and variable selection in the normal linear model, and it is described in terms of Euclidean geometry, with the correlation serving as the metric of the parameter space. Because the LARS algorithm only works in Euclidean space, we map the manifold of the statistical model onto its tangent space at the origin. In generalized linear regression, this transformation allows the original LARS algorithm to be run for generalized linear models. The proposed methods are efficient and perform well. Real-data analysis indicates that they produce results similar to those of $l_1$-regularized maximum likelihood estimation for the aforementioned models. Numerical experiments reveal that our methods work well and may outperform $l_1$-regularization in generalization, parameter estimation, and model selection.
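As a rough illustration of the $l_1$ shrinkage that the abstract compares against: under an orthonormal design, the LASSO solution reduces to soft-thresholding the per-variable correlations, which is also the quantity LARS tracks as its metric. The snippet below is a minimal sketch of that special case with made-up correlation values; it is not the paper's tangent-space method.

```python
def soft_threshold(z, lam):
    # Soft-thresholding operator: the closed-form LASSO solution for a
    # single coefficient when the design matrix is orthonormal.
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

# Hypothetical correlations X^T y for an orthonormal design.
corr = [2.5, -0.3, 1.0, -1.8]
lam = 0.5  # regularization strength

beta = [soft_threshold(c, lam) for c in corr]
print(beta)  # correlations below lam in magnitude shrink exactly to zero
```

Coefficients whose correlation magnitude falls below the threshold are set exactly to zero, which is the mechanism behind the variable selection discussed in the abstract.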
Funder
Japan Society for the Promotion of Science
Meiji University
Publisher
Springer Science and Business Media LLC