Abstract
We consider a simple classification problem to show that the dynamics of finite-width deep neural networks in the underparametrized regime gives rise to effects similar to those found in glassy systems, namely a slow evolution of the loss function and aging. Remarkably, the aging is sublinear in the waiting time (subaging), and the power-law exponent characterizing it is robust across different architectures under the constraint of a constant total number of parameters. Our results carry over to the more complex scenario of the MNIST dataset, for which we find a unique exponent ruling the subaging behavior in the whole phase.
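Subaging of the kind described in the abstract is conventionally diagnosed by collapsing two-time observables C(t_w + t, t_w) measured after different waiting times t_w onto a single master curve when plotted against t / t_w^mu, with mu < 1 signaling subaging. The sketch below illustrates this collapse procedure on synthetic data; the exponent value 0.8, the exponential scaling function, and all variable names are illustrative assumptions, not results of the paper:

```python
import numpy as np

def scaling_variable(t, t_w, mu):
    # Subaging scaling variable t / t_w^mu; mu = 1 recovers simple aging.
    return t / t_w**mu

# Synthetic two-time curves obeying C = f(t / t_w^mu) with an assumed mu = 0.8
mu_true = 0.8
t = np.logspace(0, 4, 200)                     # time after the waiting time
waits = [10, 100, 1000]                        # waiting times t_w
curves = {tw: np.exp(-scaling_variable(t, tw, mu_true)) for tw in waits}

def collapse_spread(mu):
    # Interpolate every curve onto a common grid of the scaling variable
    # and measure how far apart the collapsed curves remain.
    x = np.logspace(np.log10(0.4), 1, 50)      # grid covered by all t_w
    interp = [np.interp(x, scaling_variable(t, tw, mu), curves[tw])
              for tw in waits]
    return np.mean(np.var(interp, axis=0))

# Grid search for the exponent giving the best collapse
mus = np.linspace(0.5, 1.0, 101)
mu_fit = mus[np.argmin([collapse_spread(m) for m in mus])]
print(mu_fit)
```

With data generated at mu = 0.8, the best-collapse search recovers an exponent close to 0.8; on real training curves one would replace the synthetic `curves` with measured two-time correlations or loss differences.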
Subject
Artificial Intelligence, Human-Computer Interaction, Software