Abstract
Container technology enables rapid deployment of computing services, while edge computing reduces task-computation latency and improves performance. However, different edge servers can support only a limited set of container types, a limited number of containers, and limited container performance, so a sensible task deployment strategy that can be computed quickly is essential. This paper therefore jointly optimizes task deployment, offloading decisions, edge caching, and resource allocation to minimize the overall energy consumption of a mobile edge computing (MEC) system composed of multiple mobile devices (MDs) and multiple edge servers integrated with different containers. The problem is formalized as a combinatorial optimization problem over multiple discrete variables, subject to constraints on container type, transmission power, latency, and the task offloading and deployment strategies. To solve this NP-hard problem and respond quickly with a sub-optimal policy, this paper proposes DQCD, an energy-efficient edge caching and task deployment policy based on deep Q-learning. First, the exponential action space consisting of offloading decisions, task deployment, and caching policies is pruned and optimized to accelerate model training. Then, the model is trained iteratively with a deep neural network. Finally, sub-optimal task deployment, offloading, and caching policies are obtained from the trained model. Simulation results demonstrate that the proposed algorithm converges within very few iterations and substantially reduces both system energy consumption and policy response delay compared with baseline algorithms.
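To make the approach concrete, the following is a minimal sketch of a deep Q-learning loop with action-space pruning of the kind the abstract describes. It is not the paper's implementation: all names, dimensions, and the placeholder feasibility check (STATE_DIM, NUM_ACTIONS, prune_actions, etc.) are illustrative assumptions, and PyTorch is chosen only for brevity.

```python
import random
import torch
import torch.nn as nn

# Hypothetical sizes; the paper's state and joint action encodings are not
# specified here, so these values are assumptions for illustration only.
STATE_DIM = 16      # e.g. task sizes, channel gains, cache occupancy
NUM_ACTIONS = 64    # joint (offloading, deployment, caching) decisions
GAMMA = 0.9
EPSILON = 0.1

# Q-network: maps a system state to one Q-value per joint action.
q_net = nn.Sequential(
    nn.Linear(STATE_DIM, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, NUM_ACTIONS),
)
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)

def prune_actions(state: torch.Tensor) -> torch.Tensor:
    """Return a boolean mask of feasible joint actions.

    Stands in for the paper's action-space pruning: actions violating
    container-type, latency, or power constraints would be masked out so
    the agent never explores them. The check below is a placeholder.
    """
    mask = torch.ones(NUM_ACTIONS, dtype=torch.bool)
    # e.g. mask[a] = False for actions that deploy a container type the
    # selected edge server does not support.
    return mask

def select_action(state: torch.Tensor) -> int:
    """Epsilon-greedy selection restricted to the pruned action set."""
    mask = prune_actions(state)
    feasible = mask.nonzero(as_tuple=True)[0]
    if random.random() < EPSILON:
        return feasible[random.randrange(len(feasible))].item()
    with torch.no_grad():
        q = q_net(state)
        q[~mask] = -float("inf")  # infeasible actions are never chosen
        return int(q.argmax())

def td_update(state, action, reward, next_state):
    """One Q-learning step; reward is the negative energy consumption."""
    with torch.no_grad():
        next_q = q_net(next_state)
        next_q[~prune_actions(next_state)] = -float("inf")
        target = reward + GAMMA * next_q.max()
    pred = q_net(state)[action]
    loss = (pred - target) ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Masking infeasible actions before both exploration and the TD target shrinks the effective search space, which is the mechanism by which pruning accelerates training in this kind of formulation.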
Subject
Electrical and Electronic Engineering, Computer Networks and Communications, Hardware and Architecture, Signal Processing, Control and Systems Engineering
Cited by
1 article.