Affiliation:
1. Université Côte d’Azur, Inria, France
2. Università di Verona, Verona, Italy
3. Akamai Technologies, MA, USA
4. Eurecom, France
Abstract
Most caching algorithms are oblivious to the timescale of requests, yet caching systems are capacity constrained and, in practice, the hit rate may be limited by the cache's inability to serve requests fast enough. In particular, hard-disk access time can be the key factor capping cache performance. In this article, we present a new cache replacement policy that takes advantage of a hierarchical caching architecture, and in particular of the access-time difference between memory and disk. Our policy is optimal when requests follow the independent reference model and significantly reduces the hard-disk load, as our realistic, trace-driven evaluation also shows. Moreover, our policy applies in a more general context, since it can easily be adapted to minimize any retrieval cost, as long as costs are additive over cache misses.
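To illustrate the cost-minimization idea the abstract alludes to (not the paper's actual hierarchical policy), the sketch below assumes the independent reference model and a known per-miss retrieval cost for each item: a static cache of given capacity minimizes the expected retrieval cost by keeping the items with the largest popularity-times-cost product. All item names, popularities, and costs are hypothetical.

# Illustrative sketch only: rank items by popularity * miss cost and keep
# the top `capacity` of them, which minimizes the expected per-request
# retrieval cost under the independent reference model.
def best_static_cache(popularity, miss_cost, capacity):
    """Return the set of items to cache so that the expected miss cost,
    i.e. the sum of popularity[i] * miss_cost[i] over uncached items,
    is minimized."""
    ranked = sorted(popularity,
                    key=lambda i: popularity[i] * miss_cost[i],
                    reverse=True)
    return set(ranked[:capacity])

# Hypothetical example: misses served from disk cost 10, others cost 1.
popularity = {"a": 0.5, "b": 0.3, "c": 0.15, "d": 0.05}
miss_cost = {"a": 1.0, "b": 10.0, "c": 10.0, "d": 1.0}
print(best_static_cache(popularity, miss_cost, capacity=2))  # {'b', 'c'}

Note that a popularity-only rule would keep "a" and "b"; weighting by cost instead keeps "b" and "c", the items whose misses are expensive to serve.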
Funder
Italian National Group for Scientific Computation
Bodossaki Foundation, Greece
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Networks and Communications; Hardware and Architecture; Safety, Risk, Reliability and Quality; Media Technology; Information Systems; Software; Computer Science (miscellaneous)
Cited by
22 articles.
1. Optimistic online caching for batched requests;Computer Networks;2024-05
2. No-Regret Caching with Noisy Request Estimates;2023 IEEE Virtual Conference on Communications (VCC);2023-11-28
3. No-regret Caching via Online Mirror Descent;ACM Transactions on Modeling and Performance Evaluation of Computing Systems;2023-08-11
4. An overview of analysis methods and evaluation results for caching strategies;Computer Networks;2023-06
5. Optimistic Online Caching for Batched Requests;ICC 2023 - IEEE International Conference on Communications;2023-05-28