Affiliation:
1. University of California San Diego, USA
2. Massachusetts Institute of Technology, USA
3. AMD, USA
Abstract
Deep neural networks use skip connections to improve training convergence. However, these skip connections are costly in hardware, requiring extra buffers and increasing on- and off-chip memory utilization and bandwidth requirements. In this article, we show that skip connections can be optimized for hardware when tackled with a hardware-software codesign approach. We argue that while a network’s skip connections are needed for the network to learn, they can later be removed or shortened to provide a more hardware-efficient implementation with minimal to no accuracy loss. We introduce Tailor, a codesign tool whose hardware-aware training algorithm gradually removes or shortens a fully trained network’s skip connections to lower the hardware cost. Tailor improves resource utilization by up to 34% for block random access memories (BRAMs), 13% for flip-flops (FFs), and 16% for look-up tables (LUTs) for on-chip, dataflow-style architectures. Tailor increases performance by 30% and reduces memory bandwidth by 45% for a two-dimensional processing element array architecture.
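To make the idea of gradually removing skip connections concrete, the following is a minimal sketch, assuming PyTorch; the names ScaledSkipBlock, alpha, and decay_skips are hypothetical and are not taken from the Tailor repository or the paper. It anneals each residual block's identity path toward zero during fine-tuning, so that the skip connection can eventually be dropped from the hardware implementation.

import torch.nn as nn

class ScaledSkipBlock(nn.Module):
    """Residual block whose identity (skip) path is scaled by a decaying factor.
    Illustrative sketch only, not the actual Tailor implementation."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)
        self.alpha = 1.0  # 1.0 = full skip connection, 0.0 = skip removed

    def forward(self, x):
        out = self.bn2(self.conv2(self.relu(self.bn1(self.conv1(x)))))
        # Scale the identity path; when alpha == 0 this is a plain feed-forward block.
        return self.relu(out + self.alpha * x)

def decay_skips(model: nn.Module, epoch: int, total_epochs: int) -> None:
    """Linearly anneal every block's skip-connection weight toward zero
    over the course of fine-tuning."""
    alpha = max(0.0, 1.0 - epoch / total_epochs)
    for m in model.modules():
        if isinstance(m, ScaledSkipBlock):
            m.alpha = alpha

Once alpha reaches zero, the identity path no longer needs to be buffered, which is the source of the BRAM, FF, and LUT savings reported above; shortening (rather than fully removing) a skip connection can be sketched the same way by rerouting the identity path to a nearer layer instead of scaling it to zero.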
Funder
National Science Foundation Graduate Research Fellowship Program
National Science Foundation
U.S. Department of Energy
Office of Science, Office of Advanced Scientific Computing Research
DOE Office of Science, Office of High Energy Physics Early Career Research Program
Publisher
Association for Computing Machinery (ACM)
Cited by 1 article.