Author:
Kereta Zeljko, Naumova Valeriya
Abstract
<abstract><p>Despite recent advances in regularization theory, parameter selection remains a challenge for most applications. In a recent work, the framework of statistical learning was used to approximate the optimal Tikhonov regularization parameter from noisy data. In this work, we improve those results and extend the analysis to elastic net regularization. Furthermore, we design a data-driven, automated algorithm for the computation of an approximate regularization parameter. Our analysis combines statistical learning theory with insights from regularization theory. We compare our approach with state-of-the-art parameter selection criteria and show that it achieves superior accuracy.</p></abstract>
Publisher
American Institute of Mathematical Sciences (AIMS)
Subject
Applied Mathematics, Mathematical Physics, Analysis
References (37 articles)
1. S. W. Anzengruber, R. Ramlau, Morozov's discrepancy principle for Tikhonov-type functionals with nonlinear operators, Inverse Probl., 26 (2010), 025001.
2. A. Astolfi, Optimization: An introduction, 2006.
3. F. Bauer, M. A. Lukas, Comparing parameter choice methods for regularization of ill-posed problems, Math. Comput. Simulat., 81 (2011), 1795–1841.
4. M. Belkin, P. Niyogi, V. Sindhwani, Manifold regularization: A geometric framework for learning from labeled and unlabeled examples, J. Mach. Learn. Res., 7 (2006), 2399–2434.
5. R. Bhatia, Matrix analysis, New York: Springer-Verlag, 1997.