Abstract
The cerebellar granule cell layer has inspired numerous theoretical models of neural representations that support learned behaviors, beginning with the work of Marr and Albus. In these models, granule cells form a sparse, combinatorial encoding of diverse sensorimotor inputs. Such sparse representations are optimal for learning to discriminate random stimuli. However, recent observations of dense, low-dimensional activity across granule cells have called into question the role of sparse coding in these neurons. Here, we generalize theories of cerebellar learning to determine the optimal granule cell representation for tasks beyond random stimulus discrimination, including continuous input-output transformations as required for smooth motor control. We show that for such tasks, the optimal granule cell representation is substantially denser than predicted by classic theories. Our results provide a general theory of learning in cerebellum-like systems and suggest that optimal cerebellar representations are task-dependent.
Publisher
Cold Spring Harbor Laboratory
Cited by 2 articles.