1. Reducing Energy in GPGPUs through Approximate Trivial Bypassing
2. Optimizing Transformers with Approximate Computing for Faster, Smaller and More Accurate NLP Models; Nagarajan, 2020
3. NViT: Vision Transformer Compression and Parameter Redistribution; Yang, 2021
4. AxR-NN: Approximate Computation Reuse for Energy-Efficient Convolutional Neural Networks
5. IEEE Standard 754 for Binary Floating-Point Arithmetic; Kahan; Lecture Notes on the Status of IEEE 754, 1996