Affiliation:
1. Department of Mathematics, Technische Universität Berlin, Straße des 17. Juni, Berlin, Germany
2. Department of Mathematics, Rheinisch-Westfälische Technische Hochschule Aachen, Pontdriesch, Aachen, Germany
Abstract
This work theoretically studies the problem of estimating a structured high-dimensional signal $\boldsymbol{x}_0 \in{\mathbb{R}}^n$ from noisy $1$-bit Gaussian measurements. Our recovery approach is based on a simple convex program which uses the hinge loss function as data-fidelity term. While such a risk minimization strategy is very natural for learning binary output models, such as in classification, its capacity to estimate a specific signal vector is largely unexplored. A major difficulty is that the hinge loss is just piecewise linear, so that its ‘curvature energy’ is concentrated in a single point. This is substantially different from other popular loss functions considered in signal estimation, e.g., the square or logistic loss, which are at least locally strongly convex. It is therefore somewhat unexpected that we can still prove very similar types of recovery guarantees for the hinge loss estimator, even in the presence of strong noise. More specifically, our non-asymptotic error bounds show that stable and robust reconstruction of $\boldsymbol{x}_0$ can be achieved with the optimal oversampling rate $O(m^{-1/2})$ in terms of the number of measurements $m$. Moreover, we permit a wide class of structural assumptions on the ground truth signal, in the sense that $\boldsymbol{x}_0$ can belong to an arbitrary bounded convex set $K \subset{\mathbb{R}}^n$. The proofs of our main results rely on some recent advances in statistical learning theory due to Mendelson. In particular, we invoke an adapted version of Mendelson’s small ball method that allows us to establish a quadratic lower bound on the error of the first-order Taylor approximation of the empirical hinge loss function.
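The abstract's recovery approach can be illustrated numerically. The following is a minimal sketch (not the authors' code): it simulates noisy 1-bit Gaussian measurements of a sparse unit-norm signal and minimizes the empirical hinge loss over a scaled $\ell_1$-ball (one concrete choice for the bounded convex set $K$) via projected subgradient descent. All dimensions, the noise level, the constraint radius and the step-size rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 100, 500, 5                # ambient dim., measurements, sparsity

x0 = np.zeros(n)
x0[:s] = 1.0 / np.sqrt(s)            # unit-norm s-sparse ground truth
A = rng.standard_normal((m, n))      # Gaussian measurement matrix
y = np.sign(A @ x0 + 0.1 * rng.standard_normal(m))  # noisy 1-bit data

def project_l1(v, radius):
    """Euclidean projection onto the l1-ball of the given radius."""
    if np.abs(v).sum() <= radius:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, len(v) + 1) > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

# Projected subgradient descent on the empirical hinge loss
#   (1/m) * sum_i max(0, 1 - y_i <a_i, x>)   subject to  x in K,
# where K is the l1-ball of radius sqrt(s) (which contains x0).
x = np.zeros(n)
radius = np.sqrt(s)
for t in range(1, 2001):
    margins = y * (A @ x)
    # subgradient: only measurements with margin < 1 contribute
    g = -(A * y[:, None])[margins < 1].sum(axis=0) / m
    x = project_l1(x - g / np.sqrt(t), radius)

# 1-bit measurements lose the scale of x0, so compare directions only
x_dir = x / np.linalg.norm(x)
err = np.linalg.norm(x_dir - x0)
```

Since the sign measurements carry no amplitude information, the final comparison normalizes the estimate; the theory in the abstract likewise concerns recovery up to the scaling fixed by the constraint set.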
Funder
European Commission
Deutsche Forschungsgemeinschaft
Publisher
Oxford University Press (OUP)
Subject
Applied Mathematics,Computational Theory and Mathematics,Numerical Analysis,Statistics and Probability,Analysis
Cited by: 3 articles.