Abstract
Caching is a promising technique for reducing heavy traffic load and improving user-perceived latency in the Internet of Things (IoT). In this paper, by exploiting edge cache resources and communication opportunities in device-to-device (D2D) networks and broadcast networks, two novel coded caching schemes are proposed that greatly reduce transmission latency in the centralized and decentralized caching settings, respectively. In addition to the multicast gain, both schemes obtain an additional cooperation gain offered by user cooperation and an additional parallel gain offered by parallel transmission among the server and users. With a newly established lower bound on the transmission delay, we prove that the centralized coded caching scheme is order-optimal, i.e., it achieves the minimum transmission delay within a constant multiplicative gap. The decentralized coded caching scheme is also order-optimal if each user's cache size is larger than a threshold that approaches zero as the total number of users tends to infinity. Moreover, theoretical analysis shows that to reduce the transmission delay, the number of users sending signals simultaneously should be chosen appropriately according to the users' cache size; simply letting more users transmit in parallel can increase the transmission delay.
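To make the multicast gain that the abstract builds on concrete, the sketch below computes the delivery load of the classic centralized coded caching scheme of Maddah-Ali and Niesen, which serves as the baseline that the proposed D2D-assisted schemes improve upon. This is a minimal illustration only: the function name and parameters are illustrative assumptions, and the paper's additional cooperation and parallel gains are not reproduced here.

```python
# Baseline delivery rate of the classic centralized coded caching scheme
# (Maddah-Ali & Niesen, "Fundamental limits of caching"). Illustrative only;
# the D2D cooperation gain and parallel gain of the proposed schemes are not
# modeled here.

def mn_delivery_rate(K: int, N: int, M: float) -> float:
    """Normalized delivery load R(M) = K(1 - M/N) / (1 + K*M/N)
    for K users, N files, and per-user cache size M (in files),
    assuming K*M/N is an integer (otherwise memory sharing applies)."""
    if not 0 <= M <= N:
        raise ValueError("cache size M must lie in [0, N]")
    uncached_fraction = 1.0 - M / N    # local caching gain
    multicast_gain = 1.0 + K * M / N   # each coded multicast serves 1 + KM/N users
    return K * uncached_fraction / multicast_gain

if __name__ == "__main__":
    # Example: 20 users, 20 files; larger caches sharply cut the delivery load.
    for M in (0, 1, 5, 10):
        print(f"M = {M:>2}: R = {mn_delivery_rate(20, 20, M):.2f}")
```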
Funder
National Natural Science Foundation of China
Subject
General Physics and Astronomy
Cited by: 1 article.