Abstract
In this paper we present an adaptive synaptic array that can be used to improve the energy efficiency of training machine learning (ML) systems. The synaptic array comprises an ensemble of analog memory elements, each of which is a micro-scale dynamical system in its own right, storing information in its temporal state trajectory. The state trajectories are modulated by a system-level learning algorithm such that the ensemble trajectory is guided towards the optimal solution. We show that the extrinsic energy required for state trajectory modulation can be matched to the dynamics of neural network learning, which leads to a significant reduction in the energy dissipated for memory updates during ML training. Thus, the proposed synapse array could have significant implications for addressing the energy-efficiency imbalance between the training and the inference phases observed in artificial intelligence (AI) systems.
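As an illustration of the idea summarized above, the toy Python sketch below models an ensemble of leaky analog memory elements whose states are nudged by a system-level learning rule; the extrinsic modulation energy per step is taken proportional to the squared update amplitude, so it shrinks as learning converges. All names, the leak rate, the readout, and the energy model are assumptions for illustration only and are not taken from the paper.

```python
import numpy as np

# Toy ensemble of analog memory elements (not the paper's device model).
# Each element is a leaky first-order dynamical system; the stored "weight"
# is read out from its state trajectory.
rng = np.random.default_rng(0)

n_synapses = 4
leak = 0.05  # assumed intrinsic decay of each analog state per step
state = np.zeros(n_synapses)

def read_weights(state):
    # Simplified readout: the instantaneous state stands in for the
    # effective synaptic weight derived from the temporal trajectory.
    return state

def modulate(state, grad, lr=0.1):
    # System-level learning rule nudges each element's trajectory toward the
    # optimum; extrinsic energy is modeled as the squared modulation amplitude.
    delta = -lr * grad
    energy = np.sum(delta ** 2)
    return state + delta, energy

# Quadratic toy objective: drive the ensemble readout toward a target vector.
target = rng.normal(size=n_synapses)

total_energy = 0.0
for step in range(50):
    w = read_weights(state)
    grad = w - target                  # gradient of 0.5 * ||w - target||^2
    state = (1.0 - leak) * state       # intrinsic dynamics (decay)
    state, energy = modulate(state, grad)
    total_energy += energy

print("final weights:", np.round(read_weights(state), 3))
print("total modulation energy (arb. units):", round(total_energy, 4))
```

Because the per-step modulation energy tracks the shrinking gradient, most of the extrinsic energy is spent early in training, a crude analogue of matching update energy to the dynamics of learning.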
Funder
U.S. Department of Health & Human Services | NIH | National Eye Institute
United States Department of Defense | United States Navy | Office of Naval Research
NSF | Directorate for Engineering
Publisher
Springer Science and Business Media LLC
Subject
General Physics and Astronomy, General Biochemistry, Genetics and Molecular Biology, General Chemistry, Multidisciplinary
Cited by
7 articles.