Affiliation:
1. Arizona State University, Tempe, Arizona
Abstract
The next significant step in the evolution and proliferation of artificial intelligence technology will be the integration of neural network (NN) models within embedded and mobile systems. This calls for the design of compact, energy-efficient NN models in silicon. In this article, we present a scalable application-specific integrated circuit (ASIC) design of an energy-efficient Long Short-Term Memory (LSTM) accelerator, named ELSA, which is suitable for energy-constrained devices. It includes several architectural innovations to achieve small area and high energy efficiency. To reduce the area and power consumption of the overall design, the compute-intensive units of ELSA employ approximate multiplications while still achieving high performance and accuracy. The performance is further improved through efficient synchronization of the elastic pipeline stages to maximize their utilization. The article also includes a performance model of ELSA, expressed as a function of the number of hidden nodes and timesteps, permitting its use for the evaluation of any LSTM application. ELSA was implemented at the register-transfer level (RTL) and was synthesized, placed, and routed in 65 nm technology. Its functionality is demonstrated for language modeling, a common application of LSTM. ELSA is compared against a baseline implementation of an LSTM accelerator with standard functional units and without any of the architectural innovations of ELSA. The article demonstrates that ELSA achieves significant improvements in power, area, and energy efficiency over both the baseline design and several ASIC implementations reported in the literature, making it suitable for embedded systems and real-time applications.
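For context, the recurrence that an LSTM accelerator must evaluate at every timestep is the standard LSTM cell shown below; the matrix-vector products in the gate equations are the compute-intensive operations to which approximate multiplication is applied. This is the common LSTM formulation, not necessarily the exact gate variant implemented in ELSA, which is specified in the full article.

\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}

For $H$ hidden nodes and input dimension $D$, each timestep requires roughly $4H(D + H)$ multiply-accumulate operations, which is why the multipliers dominate area and power and why a performance model parameterized by hidden nodes and timesteps suffices to characterize an LSTM workload.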
Funder
NSF I/UCRC Center for Embedded Systems and NSF
Publisher
Association for Computing Machinery (ACM)
Subject
Hardware and Architecture, Software
Cited by
16 articles.