Affiliation:
1. University of Florida, FL, USA
Abstract
Loop caches provide an effective method for decreasing memory hierarchy energy consumption by storing frequently executed code (critical regions) in a structure that is more energy efficient than the level-one cache. However, due to code-structure restrictions or costly design-time pre-analysis, previous loop cache designs are not suitable for all applications and system scenarios. We present an adaptive loop cache that is amenable to a wider range of system scenarios and provides an additional 20% average instruction cache energy savings (with individual benchmark savings as high as 69%) compared to the next-best loop cache, the preloaded loop cache.
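The energy argument in the abstract lends itself to a simple first-order estimate: savings scale with the fraction of instruction fetches captured by the loop cache and with the per-access energy gap between the loop cache and the level-one cache. The sketch below only illustrates that relationship; the hit rate and per-access energies are assumed values for illustration, not figures reported for the adaptive loop cache.

    # Hedged sketch: first-order instruction-fetch energy model for a loop cache.
    # All numeric values below are illustrative assumptions, not data from the paper.

    def fetch_energy(total_fetches, loop_cache_hit_rate,
                     e_l1_access_nj, e_loop_cache_access_nj):
        """Energy (nJ) spent on instruction fetches when a fraction of fetches
        is served by the loop cache instead of the level-one cache."""
        lc_fetches = total_fetches * loop_cache_hit_rate
        l1_fetches = total_fetches - lc_fetches
        return lc_fetches * e_loop_cache_access_nj + l1_fetches * e_l1_access_nj

    # Assumed per-access energies: a small, tagless loop cache costs far less
    # per access than a level-one instruction cache (hypothetical numbers).
    baseline = fetch_energy(1_000_000, 0.0, e_l1_access_nj=0.5, e_loop_cache_access_nj=0.1)
    with_lc  = fetch_energy(1_000_000, 0.6, e_l1_access_nj=0.5, e_loop_cache_access_nj=0.1)
    print(f"estimated instruction-fetch energy savings: {100 * (1 - with_lc / baseline):.1f}%")

Under these assumed parameters (60% of fetches captured, 5x per-access energy gap), the model predicts roughly 48% fetch-energy savings; actual savings depend on how much of each benchmark's critical regions the loop cache can hold.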
Funder
Division of Computer and Network Systems
Publisher
Association for Computing Machinery (ACM)
Subject
Hardware and Architecture, Software
Cited by
1 article.