Flat minima generalize for low-rank matrix recovery

Author:

Lijun Ding 1, Dmitriy Drusvyatskiy 2, Maryam Fazel 3, Zaid Harchaoui 4

Affiliation:

1. Wisconsin Institute for Discovery, University of Wisconsin-Madison, Madison, WI 53795, USA

2. Department of Mathematics, University of Washington, Seattle, WA 98195, USA

3. Electrical and Computer Engineering Department, University of Washington, Seattle, WA 98195, USA

4. Department of Statistics, University of Washington, Seattle, WA 98195, USA

Abstract

Empirical evidence suggests that for a variety of overparameterized nonlinear models, most notably in neural network training, the growth of the loss around a minimizer strongly impacts its performance. Flat minima—those around which the loss grows slowly—appear to generalize well. This work takes a step towards understanding this phenomenon by focusing on the simplest class of overparameterized nonlinear models: those arising in low-rank matrix recovery. We analyse overparameterized matrix and bilinear sensing, robust principal component analysis, covariance matrix estimation and single-hidden-layer neural networks with quadratic activation functions. In all cases, we show that flat minima, measured by the trace of the Hessian, exactly recover the ground truth under standard statistical assumptions. For matrix completion, we establish weak recovery, although empirical evidence suggests exact recovery holds here as well. We complete the paper with synthetic experiments that illustrate our findings.
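The abstract's flatness measure, the trace of the Hessian, can be made concrete on a toy symmetric matrix-sensing instance. The sketch below is illustrative only (problem sizes and variable names are not from the paper): at any zero-residual point of f(X) = ½ Σᵢ (⟨Aᵢ, XXᵀ⟩ − bᵢ)², the Hessian trace has the closed form 4 Σᵢ ‖Aᵢ X‖_F², which the code checks against a finite-difference estimate.

```python
import numpy as np

# Toy symmetric matrix sensing: recover M* = X* X*^T from m linear
# measurements b_i = <A_i, M*>.  Sizes here are illustrative only.
rng = np.random.default_rng(0)
n, k, m = 5, 2, 8

Xstar = rng.standard_normal((n, k))        # ground-truth factor
As = []
for _ in range(m):                         # symmetric Gaussian sensing matrices
    G = rng.standard_normal((n, n))
    As.append((G + G.T) / 2)
b = np.array([np.trace(A @ Xstar @ Xstar.T) for A in As])

def loss(xvec):
    """f(X) = 1/2 * sum_i (<A_i, X X^T> - b_i)^2, with X passed as a flat vector."""
    X = xvec.reshape(n, k)
    r = np.array([np.trace(A @ X @ X.T) for A in As]) - b
    return 0.5 * r @ r

def hessian_trace_fd(xvec, eps=1e-4):
    """Finite-difference estimate of tr(grad^2 f) at xvec (sum of diagonal
    second differences, one per coordinate)."""
    f0 = loss(xvec)
    tr = 0.0
    for i in range(xvec.size):
        e = np.zeros_like(xvec)
        e[i] = eps
        tr += (loss(xvec + e) - 2.0 * f0 + loss(xvec - e)) / eps**2
    return tr

# At a zero-residual point, tr(grad^2 f)(X) = 4 * sum_i ||A_i X||_F^2.
closed_form = 4.0 * sum(np.linalg.norm(A @ Xstar, "fro") ** 2 for A in As)
fd_estimate = hessian_trace_fd(Xstar.ravel())
print(closed_form, fd_estimate)  # agree up to finite-difference error
```

In expectation over Gaussian sensing matrices, Σᵢ ‖Aᵢ X‖_F² is proportional to ‖X‖_F², so, roughly speaking, minimizing the Hessian trace among zero-loss interpolators acts as a minimal-norm (nuclear-norm-type) bias — one way to read the paper's claim that flat minima recover the ground truth.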

Publisher

Oxford University Press (OUP)

References: 58 articles.

