Abstract
Identifying appropriate structures for generative or world models is essential for both biological organisms and machines. This work shows that synaptic pruning facilitates efficient statistical structure learning. We extend previously established canonical neural networks to derive a synaptic pruning scheme that is formally equivalent to online Bayesian model selection. The proposed scheme, termed Bayesian synaptic model pruning (BSyMP), utilizes connectivity parameters to switch between the presence (ON) and absence (OFF) of synaptic connections. Mathematical analyses reveal that these parameters converge to zero for uninformative connections, thus providing reliable and efficient model reduction. This enables the identification of a plausible structure for the environmental model, particularly when the environment is characterized by sparse likelihood and transition matrices. Through causal inference and rule learning simulations, we demonstrate that BSyMP achieves model reduction more efficiently than the conventional Bayesian model reduction scheme. These findings indicate that synaptic pruning could be a neuronal substrate underlying structure learning and generalizability in the brain.
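The core idea summarized above (connectivity parameters that converge toward zero for uninformative connections) can be illustrated with a toy sketch. This is not the authors' BSyMP derivation: the weights, noise variance, and per-sample complexity penalty below are hypothetical choices made only to show how evidence accumulation drives an ON/OFF posterior for each connection.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the output depends on input 1 only; input 2 is uninformative.
n = 500
X = rng.normal(size=(n, 2))
y = 1.0 * X[:, 0] + 0.1 * rng.normal(size=n)

# Log-odds that each input connection is ON. A small complexity
# penalty per sample acts as a pruning prior favoring OFF.
log_odds = np.zeros(2)
penalty = 0.05
w = np.array([1.0, 1.0])   # fixed candidate weights (hypothetical)
sigma2 = 0.01              # assumed observation noise variance

for xi, yi in zip(X, y):
    for j in range(2):
        other = 1 - j
        # Gaussian log-likelihood of y with connection j ON vs OFF,
        # holding the other connection ON.
        pred_on = w[j] * xi[j] + w[other] * xi[other]
        pred_off = w[other] * xi[other]
        ll_on = -(yi - pred_on) ** 2 / (2 * sigma2)
        ll_off = -(yi - pred_off) ** 2 / (2 * sigma2)
        log_odds[j] += (ll_on - ll_off) - penalty

p_on = 1 / (1 + np.exp(-np.clip(log_odds, -30.0, 30.0)))
print(p_on)  # connection 1 is retained; connection 2 is pruned
```

The informative connection accumulates positive evidence and stays ON, while the uninformative one is driven to an OFF posterior, mirroring the convergence-to-zero behavior the abstract attributes to BSyMP's connectivity parameters.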
Publisher
Cold Spring Harbor Laboratory