Affiliation:
1. Department of Statistics, University of Chicago, Chicago IL, USA
Abstract
Many problems in high-dimensional statistics and optimization involve minimization over non-convex constraints—for instance, a rank constraint for a matrix estimation problem—but little is known about the theoretical properties of such optimization problems for a general non-convex constraint set. In this paper we study the interplay between the geometric properties of the constraint set and the convergence behavior of gradient descent for minimization over this set. We develop the notion of local concavity coefficients of the constraint set, measuring the extent to which convexity is violated, which governs the behavior of projected gradient descent over this set. We demonstrate the versatility of these concavity coefficients by computing them for a range of problems in low-rank estimation, sparse estimation and other examples. Through our understanding of the role of these geometric properties in optimization, we then provide a convergence analysis when projections are calculated only approximately, leading to a more efficient method for projected gradient descent in low-rank estimation problems.
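As a rough illustration of the kind of procedure the abstract describes, the sketch below runs projected gradient descent over a rank constraint, using a truncated SVD as the projection step. The squared-error loss, step size, rank, and variable names are illustrative assumptions chosen for this example and are not taken from the paper.

import numpy as np

def project_rank(X, r):
    # Illustrative projection onto the set of matrices of rank at most r,
    # computed via a truncated singular value decomposition.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def projected_gradient_descent(Y, r, step=0.3, iters=200):
    # Minimize the squared loss ||X - Y||_F^2 over the (non-convex) rank-r constraint set.
    X = project_rank(Y, r)
    for _ in range(iters):
        grad = 2.0 * (X - Y)                   # gradient of the squared loss at X
        X = project_rank(X - step * grad, r)   # gradient step followed by projection
    return X

# Example usage: denoise a noisy observation of a rank-3 signal.
rng = np.random.default_rng(0)
signal = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
Y = signal + 0.1 * rng.standard_normal((50, 40))
X_hat = projected_gradient_descent(Y, r=3)
print(np.linalg.norm(X_hat - signal) / np.linalg.norm(signal))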
Funder
Alfred P. Sloan Foundation
Division of Mathematical Sciences
Publisher
Oxford University Press (OUP)
Subject
Applied Mathematics, Computational Theory and Mathematics, Numerical Analysis, Statistics and Probability, Analysis
Cited by
10 articles.