Affiliation:
1. Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117576, Singapore
Abstract
Nonlinear activation functions play a crucial role in artificial neural networks. However, digital implementations of sigmoidal functions, the most commonly used activation functions, face challenges related to energy consumption and area requirements. To address these issues, we develop a proof-of-concept computing system that uses magnetic tunnel junctions as the key element for implementing sigmoidal activation functions. Using this system, we train a neural network for speech separation. Compared with state-of-the-art digital implementations, our scalable circuit has the potential to consume up to 383 times less energy and occupy a 7354 times smaller area. These results pave the way for more efficient computing systems in the future.
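The sigmoidal activation the abstract refers to is, in its standard form, the logistic function σ(x) = 1/(1 + e⁻ˣ). As a point of reference for what the magnetic-tunnel-junction circuit emulates, a minimal software sketch of a sigmoid-activated neuron is shown below; the function and variable names are illustrative, not taken from the paper.

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: 1 / (1 + exp(-x)), the usual sigmoidal activation."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    """Toy neuron: weighted sum of inputs followed by the sigmoid nonlinearity."""
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(z)

print(sigmoid(0.0))                      # midpoint of the sigmoid: 0.5
print(neuron([1.0, -2.0], [0.5, 0.25], 0.0))
```

In a digital implementation this exponential must be approximated with lookup tables or piecewise circuits, which is where the energy and area costs cited in the abstract arise; the proposed system instead exploits device physics to produce the sigmoidal response directly.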
Funder
Advanced Research and Technology Innovation Centre, College of Design and Engineering, National University of Singapore
National Research Foundation Singapore
Ministry of Education Singapore