Abstract
Large language models are widely used across applications owing to their superior performance, but their high computational cost makes deployment on edge devices challenging. Spiking neural networks (SNNs), with their power-efficient, event-driven binary operations, offer a promising alternative, and combining SNNs with transformers is a natural fit for edge computing. This study proposes an energy-efficient spike transformer accelerator for edge computing, the transformer being the core building block of large language models, which combines the efficiency of SNNs with the performance of transformer models. The design achieves performance comparable to conventional transformers while retaining the low power consumption characteristic of SNNs. To improve hardware efficiency, a specialized computation engine and a novel datapath for the spike transformer are introduced. The proposed design is implemented on a Xilinx Zynq UltraScale+ ZCU102 device and demonstrates significant energy savings over previous transformer accelerators, even surpassing some recent binary transformer accelerators in efficiency. Implementation results confirm that the proposed spike transformer accelerator is a feasible solution for running transformer models on edge devices.
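For intuition only, the following minimal Python/NumPy sketch (not the paper's RTL design) illustrates the spiking self-attention idea behind such accelerators: because activations are binary spikes, every multiplication in the attention products reduces to an AND-plus-accumulate, which is what enables a multiplier-free datapath. The function names, thresholds, and the softmax-free re-binarization step are illustrative assumptions, not details taken from the paper.

import numpy as np

def lif_spike(x, threshold=1.0):
    # Single-step integrate-and-fire (threshold value is an assumption):
    # emit a binary spike (1.0) wherever the potential reaches the
    # threshold, otherwise stay silent (0.0).
    return (x >= threshold).astype(np.float32)

def spike_self_attention(x, w_q, w_k, w_v):
    # Q, K, V are binary spike maps, so q @ k.T and attn @ v involve only
    # 0/1 operands: on hardware each "multiply" collapses to an AND gate
    # feeding an accumulator, which is the source of the power savings.
    q = lif_spike(x @ w_q)                       # {0,1} query spikes
    k = lif_spike(x @ w_k)                       # {0,1} key spikes
    v = lif_spike(x @ w_v)                       # {0,1} value spikes
    scores = q @ k.T                             # integer coincidence counts
    attn = lif_spike(scores, threshold=0.25 * k.shape[1])  # re-binarize; no softmax
    return attn @ v                              # pure accumulation again

# Toy usage: 4 tokens with embedding width 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8)).astype(np.float32)
w_q, w_k, w_v = (rng.normal(scale=0.5, size=(8, 8)).astype(np.float32) for _ in range(3))
out = spike_self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8)

On an FPGA such as the ZCU102, this structure suggests why the attention products can be built from simple AND/popcount logic rather than DSP multipliers, which is broadly consistent with the energy savings the abstract reports.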
Funder
Natural Science Foundation of Shandong Province
Natural Science Foundation of Qingdao Municipality
Department of Science and Technology of Shandong Province
Publisher
Springer Science and Business Media LLC