Abstract
This paper considers several aspects of random matrix universality in deep neural networks (DNNs). Motivated by recent experimental work, we use universal properties of random matrices related to local statistics to derive practical implications for DNNs, based on a realistic model of their Hessians. In particular, we derive universal aspects of outliers in the spectra of deep neural networks and demonstrate the important role of random matrix local laws in popular pre-conditioned gradient descent algorithms. We also present insights into DNN loss surfaces from quite general arguments based on tools from statistical physics and random matrix theory.
Funder
H2020 European Research Council
Subject
General Physics and Astronomy, Mathematical Physics, Modeling and Simulation, Statistics and Probability, Statistical and Nonlinear Physics