Affiliation:
1. Friedrich-Alexander-Universität Erlangen-Nürnberg
2. Siemens AG, Digital Factory Division
Abstract
This paper addresses the problem of efficiently operating a flexible manufacturing machine in an electricity micro-grid with highly volatile electricity prices. Finding the optimal control policy is formulated as a sequential decision-making problem under uncertainty, where at every time step the uncertainty stems from the lack of knowledge about future electricity consumption and future weather-dependent energy prices. We propose to address this problem using deep reinforcement learning. To this end, we design a deep learning architecture that forecasts the load profile of the future manufacturing schedule from past production time series. Combined with a forecast of future energy prices, the reinforcement learning algorithm is trained to perform online optimization of the production machine in order to reduce long-term energy costs. The concept is empirically validated on a flexible production machine whose speed can be optimized during production.
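To make the sequential decision-making formulation concrete, the sketch below shows one possible way to frame the task as a reinforcement-learning environment: the state combines the forecasted load and price at the current step, the action selects a machine speed factor, and the reward is the negative electricity cost. This is a minimal illustration under assumed toy dynamics; the class name, the discrete speed set, and the cost model are hypothetical and do not reproduce the paper's actual architecture.

import numpy as np

class MachineSpeedEnv:
    """Illustrative environment sketch (assumed, not the paper's model):
    at each time step the agent picks a machine speed factor, and the
    reward is the negative electricity cost computed from forecasted
    prices and a speed-dependent load."""

    def __init__(self, price_forecast, load_forecast, speeds=(0.5, 0.75, 1.0)):
        self.price_forecast = np.asarray(price_forecast)  # price per step (e.g. EUR/kWh)
        self.load_forecast = np.asarray(load_forecast)    # energy per step at nominal speed (kWh)
        self.speeds = speeds                               # assumed discrete action set
        self.t = 0

    def reset(self):
        self.t = 0
        return self._state()

    def _state(self):
        # Assumed state features: current price forecast, load forecast,
        # and a normalized time index.
        return np.array([self.price_forecast[self.t],
                         self.load_forecast[self.t],
                         self.t / len(self.price_forecast)])

    def step(self, action_idx):
        speed = self.speeds[action_idx]
        # Toy dynamics: energy drawn scales linearly with the chosen speed.
        energy = self.load_forecast[self.t] * speed
        cost = energy * self.price_forecast[self.t]
        reward = -cost                                     # minimize long-term energy cost
        self.t += 1
        done = self.t >= len(self.price_forecast)
        next_state = None if done else self._state()
        return next_state, reward, done

# Example rollout with a random policy over synthetic forecasts.
env = MachineSpeedEnv(price_forecast=np.random.uniform(0.1, 0.4, 24),
                      load_forecast=np.random.uniform(5.0, 15.0, 24))
state, done, total = env.reset(), False, 0.0
while not done:
    state, reward, done = env.step(np.random.randint(3))
    total += reward
print("total reward (negative cost):", total)

A deep RL agent (e.g. a DQN-style policy, as one common choice) would replace the random action selection and learn to shift high-speed operation toward low-price periods.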
Publisher
Trans Tech Publications, Ltd.