Abstract
We consider statistical linear inverse problems in separable Hilbert spaces and filter-based reconstruction methods of the form $\hat{f}_\alpha = q_\alpha(T^*T)\,T^*Y$, where $Y$ is the available data, $T$ the forward operator, $(q_\alpha)_{\alpha \in A}$ an ordered filter, and $\alpha > 0$ a regularization parameter. Whenever such a method is used in practice, $\alpha$ has to be chosen appropriately. Typically, the aim is to find, or at least approximate, the best possible $\alpha$ in the sense that the mean squared error (MSE)
$\mathbb{E}\big[\|\hat{f}_\alpha - f^\dagger\|^2\big]$ w.r.t. the true solution $f^\dagger$
is minimized. In this paper, we introduce the Sharp Optimal Lepskiĭ-Inspired Tuning (SOLIT) method, which yields an a posteriori parameter choice rule ensuring adaptive minimax rates of convergence. It depends only on $Y$, the noise level $\sigma$, the operator $T$, and the filter $(q_\alpha)_{\alpha \in A}$, and does not require any problem-dependent tuning of further parameters. We prove an oracle inequality for the corresponding MSE in a general setting and derive rates of convergence in different scenarios. By a careful analysis we show that no other a posteriori parameter choice rule can yield a better performance in terms of the order of the convergence rate of the MSE. In particular, our results reveal that the common understanding that Lepskiĭ-type methods in inverse problems necessarily incur the loss of a logarithmic factor is wrong. In addition, the empirical performance of SOLIT is examined in simulations.
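As a concrete illustration of the estimator class $\hat{f}_\alpha = q_\alpha(T^*T)\,T^*Y$, the following minimal Python sketch (not from the paper) uses the Tikhonov filter $q_\alpha(\lambda) = 1/(\lambda + \alpha)$ on the SVD of a discretized forward operator, and picks the oracle $\alpha$ on a grid by minimizing the true squared error. The operator, true solution, noise level, and grid are illustrative assumptions; in practice the oracle is unavailable, which is precisely what a posteriori rules such as SOLIT address.

```python
import numpy as np

# Sketch of a filter-based reconstruction f_alpha = q_alpha(T*T) T* Y,
# here with the Tikhonov filter q_alpha(lambda) = 1 / (lambda + alpha).
# Forward operator, true solution, and noise level are illustrative.
rng = np.random.default_rng(0)

n = 200
# Illustrative mildly ill-posed operator: decaying singular values.
U, s, Vt = np.linalg.svd(rng.normal(size=(n, n)) / np.sqrt(n))
s = s * np.exp(-0.05 * np.arange(n))           # enforce decaying spectrum
T = U @ np.diag(s) @ Vt

f_true = Vt.T @ (1.0 / (1.0 + np.arange(n)))   # smooth-ish true solution
sigma = 1e-2                                    # noise level
Y = T @ f_true + sigma * rng.normal(size=n)

def reconstruct(alpha):
    """Tikhonov-filtered estimator f_alpha = q_alpha(T*T) T* Y via the SVD:
    f_alpha = V diag(s / (s^2 + alpha)) U^T Y."""
    q = s / (s**2 + alpha)
    return Vt.T @ (q * (U.T @ Y))

# Oracle choice: alpha on a candidate grid minimizing the true squared
# error ||f_alpha - f_true||^2 (observable only in simulations).
alphas = np.logspace(-8, 0, 50)
errors = [np.linalg.norm(reconstruct(a) - f_true) ** 2 for a in alphas]
alpha_oracle = alphas[int(np.argmin(errors))]
print(f"oracle alpha = {alpha_oracle:.3e}, squared error = {min(errors):.3e}")
```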
Funder
Deutsche Forschungsgemeinschaft
Subject
Applied Mathematics, Computer Science Applications, Mathematical Physics, Signal Processing, Theoretical Computer Science
Cited by
1 article.