Affiliation:
1. College of Electrical Engineering, Sichuan University, Chengdu, China
2. Xiaogan Electric Power Supply Company, State Grid Hubei Electric Power Company Limited, Xiaogan, Hubei Province, China
3. School of Mechanical and Electrical Engineering, University of Electronic Science and Technology of China, Chengdu, China
Abstract
The variability of renewable energy within microgrids (MGs) necessitates the smoothing of power fluctuations through the effective scheduling of internal power equipment. Otherwise, significant power variations could occur on the tie‐line connecting the MG to the main power grid. This study introduces an innovative, data‐driven scheduling strategy that employs a deep reinforcement learning algorithm to achieve this smoothing effect. The strategy prioritizes the scheduling of the MG's internal power devices while taking into account the stochastic charging patterns of electric vehicles. The scheduling optimization model is first formulated as a Markov decision process whose goal is to minimize both the power fluctuations on the tie‐line and the operational costs of the MG. Then, after preprocessing the MG's historical operational data, an enhanced scheduling strategy is learned by a neural network. Finally, results from four scheduling scenarios demonstrate the effectiveness of the proposed strategy, and comparisons of reward curves before and after data preprocessing underscore its importance. Compared with the optimization results of the standard deep deterministic policy gradient (DDPG), soft actor‐critic, and particle swarm optimization algorithms, the DDPG algorithm augmented with a prioritized experience replay mechanism proves superior.
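To make the prioritized experience replay mechanism mentioned above concrete, the following is a minimal sketch of a proportional PER buffer in Python. This is an illustrative assumption, not the paper's implementation: the class name, the `alpha` exponent, and the list-based storage (rather than a sum-tree) are all hypothetical choices; importance-sampling weight correction is omitted for brevity.

```python
import random

class PrioritizedReplayBuffer:
    """Proportional prioritized experience replay (sketch).

    Transitions with larger TD errors are sampled more often,
    which speeds up learning on rare but informative experiences.
    """

    def __init__(self, capacity, alpha=0.6):
        self.capacity = capacity
        self.alpha = alpha        # how strongly TD error shapes sampling
        self.buffer = []          # stored (s, a, r, s', done) transitions
        self.priorities = []      # one sampling priority per transition
        self.pos = 0              # ring-buffer write index

    def add(self, transition, td_error=1.0):
        # Small epsilon keeps zero-error transitions sampleable.
        priority = (abs(td_error) + 1e-6) ** self.alpha
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
            self.priorities.append(priority)
        else:
            self.buffer[self.pos] = transition
            self.priorities[self.pos] = priority
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size):
        # Sample indices proportionally to priority.
        total = sum(self.priorities)
        probs = [p / total for p in self.priorities]
        idxs = random.choices(range(len(self.buffer)),
                              weights=probs, k=batch_size)
        return idxs, [self.buffer[i] for i in idxs]

    def update_priorities(self, idxs, td_errors):
        # Refresh priorities after each training step.
        for i, err in zip(idxs, td_errors):
            self.priorities[i] = (abs(err) + 1e-6) ** self.alpha
```

In a DDPG training loop, the critic's TD errors on each sampled batch would be fed back through `update_priorities`, so that poorly predicted transitions are revisited more often.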
Funder
National Natural Science Foundation of China
Publisher
Institution of Engineering and Technology (IET)