Affiliation:
1. Department of Precision Instrument, Center for Brain Inspired Computing Research, Tsinghua University, Beijing 100084, P. R. China
2. Forschungszentrum Jülich, Institute of Electrochemistry and Energy System, Wilhelm-Johnen-Straße, 52426 Jülich, Germany
3. "Acad. Evgeni Budevski" IEE-BAS, Bulgarian Academy of Sciences (BAS), Acad. G. Bonchev Str., Block 10, Sofia 1113, Bulgaria
4. Chinese Institute for Brain Research, Beijing 102206, P. R. China
Abstract
The self-attention mechanism is central to state-of-the-art transformer models. Because standard full self-attention has quadratic complexity with respect to the input length L, its memory footprint becomes prohibitive for very long sequences; sparse self-attention enabled by random projection (RP)-based locality-sensitive hashing (LSH) has therefore been proposed to reduce the complexity to O(L log L). However, on digital computing hardware with a von Neumann architecture, RP, which is essentially a matrix multiplication, incurs unavoidable time- and energy-consuming data shuttling between off-chip memory and processing units. In addition, digital computers cannot generate provably random numbers. With the emerging analog memristive technology, it is shown that the intrinsic device-to-device variability of a memristor crossbar array can be harnessed to implement the RP matrix and perform RP-LSH computation in memory. On this basis, sequence prediction tasks are performed with a sparse self-attention-based Transformer in a hybrid software-hardware approach, achieving a testing accuracy of over 70% with much lower computational complexity. By further harnessing the cycle-to-cycle variability for multi-round hashing, a 12% increase in testing accuracy is demonstrated. This work extends the range of applications of memristor crossbar arrays to state-of-the-art large language models (LLMs).
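The RP-LSH scheme summarized above can be sketched in software as follows. This is a minimal NumPy illustration, not the authors' implementation: the Gaussian matrix `R` stands in for the random device-to-device conductances of the memristor crossbar, the dimensions `d` and `n_bits` are arbitrary choices, and multi-round hashing would correspond to redrawing `R` (mimicking cycle-to-cycle variability).

```python
import numpy as np

rng = np.random.default_rng(seed=42)
d, n_bits = 64, 8  # illustrative embedding width and hash length

# In the proposed hardware, this Gaussian projection matrix is obtained
# "for free" from device-to-device conductance variability in the crossbar;
# here it is simply sampled in software.
R = rng.standard_normal((d, n_bits))

def lsh_bucket(x: np.ndarray, R: np.ndarray) -> int:
    """Sign-based RP-LSH: vectors separated by a small angle tend to fall
    into the same bucket, so attention can be restricted to within-bucket
    pairs, reducing O(L^2) full attention toward O(L log L)."""
    bits = (x @ R) > 0                            # one hyperplane test per column of R
    return int(bits @ (1 << np.arange(n_bits)))  # pack the sign bits into a bucket id

q = rng.standard_normal(d)
# Positive scaling never flips the sign pattern, hence never the bucket.
assert lsh_bucket(q, R) == lsh_bucket(3.0 * q, R)
```

In a sparse-attention layer, each query/key vector would be hashed this way and the quadratic attention score matrix computed only within (or across a few neighboring) buckets.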
Funder
National Natural Science Foundation of China
Key Technologies Research and Development Program
CAST Innovation Foundation