Abstract
We study the random binary symmetric perceptron problem, focusing on the behavior of rare high-margin solutions. While most solutions are isolated, we demonstrate that these rare solutions belong to clusters of extensive entropy, heuristically corresponding to non-trivial fixed points of an approximate message-passing algorithm. We enumerate these clusters via a local entropy, defined as a Franz–Parisi potential, which we rigorously evaluate using the first and second moment methods in the limit of small constraint density α (corresponding to a vanishing margin κ), under a certain assumption on the concentration of the entropy. This analysis unveils several intriguing phenomena: (i) we demonstrate that these clusters have an entropic barrier, in the sense that the entropy as a function of the distance from the reference high-margin solution is non-monotone when κ ⩽ 1.429 √(−α/log α) and monotone otherwise, and that they have an energetic barrier, in the sense that there are no solutions at intermediate distances from the reference solution when κ ⩽ 1.239 √(−α/log α). The critical scaling of the margin, κ ∝ √(−α/log α), corresponds to the one obtained in the earlier work of Gamarnik et al (2022, arXiv:2203.15667) on the overlap-gap property, a phenomenon known to present a barrier to certain efficient algorithms. (ii) We establish, using the replica method, that the curves of complexity (the logarithm of the number of clusters of such solutions) versus entropy (the logarithm of the number of solutions within a cluster) are partly non-concave and correspond to very large values of the Parisi parameter, with equilibrium reached when the Parisi parameter diverges.
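For orientation, here is a minimal sketch of the objects the abstract refers to, written in standard notation; the precise normalizations (margin measured per √N, distance as a fraction of flipped coordinates, the same margin κ used for the reference and the counted solutions) are assumptions about the usual conventions for this model rather than statements taken from the text above.

\[
S_\kappa(G) \;=\; \Big\{\, x \in \{-1,+1\}^N \;:\; \big|\langle g_\mu , x \rangle\big| \le \kappa \sqrt{N} \ \text{ for all } \mu = 1,\dots,M \,\Big\}, \qquad \alpha = M/N,
\]

where the rows g_μ of G have i.i.d. standard Gaussian entries. For a reference solution x* ∈ S_κ(G), the local entropy (Franz–Parisi potential) at normalized Hamming distance d is, schematically,

\[
\Phi(d) \;=\; \lim_{N\to\infty} \frac{1}{N} \, \log \#\big\{\, x \in S_\kappa(G) \;:\; \tfrac{1}{N}\, d_H(x, x^\star) = d \,\big\},
\]

so that the entropic barrier in (i) refers to the non-monotonicity of d ↦ Φ(d), and the energetic barrier to a range of d on which the set above is empty.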
Funder
Swiss National Science Foundation grant OperaGOST
Swiss National Science Foundation grant SMArtNet