Affiliation:
1. Computer Science and Engineering Department, VNR Vignana Jyothi Institute of Engineering and Technology, Hyderabad, India
2. Computer Science and Engineering–Data Science, Cyber Security and Artificial Intelligence and Data Science Department, VNR Vignana Jyothi Institute of Engineering and Technology, Hyderabad, India
3. Department of Information Technology, VNR Vignana Jyothi Institute of Engineering and Technology, Hyderabad, India
Abstract
This study presents a novel approach to optimizing resource allocation, aiming to boost the efficiency of content distribution in Internet of Things (IoT) edge cloud computing environments. The proposed method, termed the Caching-based Deep Q-Network (CbDQN) framework, dynamically allocates computational and storage resources across edge devices and cloud servers. The CbDQN strategy addresses key challenges of this setting, including the need for increased storage capacity, the high cost of edge computing, and the inherent limitations of the wireless networks connecting edge devices. By accounting for constraints such as limited bandwidth and potential latency, it ensures efficient data transfer without compromising performance. The method focuses on mitigating inefficient resource usage, which is particularly important in cloud-based edge computing environments where resource costs are usage-based. The CbDQN method distributes limited resources efficiently, optimizing utilization, minimizing costs, and enhancing overall performance; it improves content delivery, reduces latency, and minimizes network congestion. Simulation results substantiate the efficacy of the proposed method in optimizing resource utilization and enhancing system performance, demonstrating its potential to address the challenges of content distribution in IoT edge cloud computing environments. On the evaluated metrics, the proposed approach achieves an accuracy of 99.85%, a precision of 99.85%, a specificity of 99.82%, a sensitivity of 99.82%, an F-score of 99.82%, and an AUC of 99.82%.
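The abstract does not specify the CbDQN implementation. As a minimal illustrative sketch, the core idea of learning cache-eviction decisions from hit/miss rewards can be shown with tabular Q-learning, a simplified stand-in for the deep Q-network; in the full CbDQN framework the table would be replaced by a neural value function. All class names, the toy request distribution, and the hyperparameters below are assumptions for illustration, not the authors' implementation.

```python
import random
from collections import defaultdict

class CacheEnv:
    """Toy edge cache: requests arrive for content items; the cache holds
    `capacity` items. Reward is +1 for a cache hit and -1 for a miss
    (the content must then be fetched from the cloud)."""
    def __init__(self, n_items=4, capacity=2):
        self.n_items = n_items
        self.capacity = capacity
        self.cache = tuple(range(capacity))  # state: sorted tuple of cached item ids

    def step(self, request, evict_slot):
        hit = request in self.cache
        if not hit:
            new = list(self.cache)
            new[evict_slot] = request  # on a miss, replace the chosen slot
            self.cache = tuple(sorted(new))
        return (1.0 if hit else -1.0), self.cache

def train(steps=2000, alpha=0.2, gamma=0.9, eps=0.1, seed=0):
    """Learn which slot to evict so that popular content stays cached."""
    rng = random.Random(seed)
    env = CacheEnv()
    Q = defaultdict(float)  # Q[((cache_state, request), evict_slot)]
    # Skewed (Zipf-like) popularity: item 0 is requested most often.
    popularity = [0, 0, 0, 1, 1, 2, 3]
    requests = [rng.choice(popularity) for _ in range(steps + 1)]
    hits = 0
    for t in range(steps):
        req = requests[t]
        s = (env.cache, req)
        if rng.random() < eps:                       # epsilon-greedy exploration
            a = rng.randrange(env.capacity)
        else:
            a = max(range(env.capacity), key=lambda x: Q[(s, x)])
        r, next_cache = env.step(req, a)
        hits += r > 0
        # One-step Q-learning update toward the greedy value of the next state.
        ns = (next_cache, requests[t + 1])
        best_next = max(Q[(ns, x)] for x in range(env.capacity))
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
    return hits / steps

hit_rate = train()
print(f"hit rate: {hit_rate:.2f}")
```

With the skewed request distribution above, an agent that learns to keep the two most popular items cached approaches the best achievable hit rate; a deep Q-network generalizes this by approximating Q-values for state spaces too large to enumerate.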