Abstract
Multilayer neural networks are among the most powerful models in machine learning, yet the fundamental reasons for this success defy mathematical understanding. Learning a neural network requires optimizing a nonconvex high-dimensional objective (risk function), a problem that is usually attacked using stochastic gradient descent (SGD). Does SGD converge to a global optimum of the risk or only to a local optimum? In the former case, does this happen because local minima are absent or because SGD somehow avoids them? In the latter case, why do local minima reached by SGD have good generalization properties? In this paper, we consider a simple case, namely two-layer neural networks, and prove that, in a suitable scaling limit, SGD dynamics is captured by a certain nonlinear partial differential equation (PDE) that we call distributional dynamics (DD). We then consider several specific examples and show how DD can be used to prove convergence of SGD to networks with nearly ideal generalization error. This description allows for “averaging out” some of the complexities of the landscape of neural networks and can be used to prove a general convergence result for noisy SGD.
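For orientation, here is a minimal sketch of the general form such distributional dynamics take in the mean-field literature. The symbols below (ρ_t for the limiting distribution of neuron parameters, V for a data-fit potential, U for a pairwise interaction kernel, and Ψ for the resulting effective potential) are assumed notation for illustration, not quoted from this abstract:

\[
\partial_t \rho_t = \nabla_\theta \cdot \bigl( \rho_t \, \nabla_\theta \Psi(\theta; \rho_t) \bigr),
\qquad
\Psi(\theta; \rho) = V(\theta) + \int U(\theta, \bar{\theta}) \, \rho(\mathrm{d}\bar{\theta}).
\]

Read this way, ρ_t stands in for the empirical distribution of the hidden units' parameters as the network width grows, so a single PDE over parameter space replaces the many coupled SGD trajectories. For noisy SGD one expects an additional diffusion term of the form λ Δ_θ ρ_t; an extra smoothing term of this kind is consistent with the abstract's general convergence result for noisy SGD.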
Funder
NSF | CISE | Division of Computing and Communication Foundations
NSF | Directorate for Mathematical and Physical Sciences
National Science Foundation
Publisher
Proceedings of the National Academy of Sciences
Cited by
220 articles.