
Energy optimization management of microgrid using improved soft actor-critic algorithm

1Electric Power Dispatching & Control Center of Guangdong Power Grid, Guangzhou 510600, China

2Guangdong Provincial Key Laboratory of Smart Grid New Technology Enterprises, China Southern Power Grid Technology Co., Ltd., Guangzhou 510180, China

Received: 1 Dec 2023; Revised: 26 Jan 2024; Accepted: 20 Feb 2024; Available online: 28 Feb 2024; Published: 1 Mar 2024.
Editor(s): H Hadiyanto
Open Access Copyright (c) 2024 The Author(s). Published by Centre of Biomass and Renewable Energy (CBIORE)
Creative Commons License This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

To tackle the variability and uncertainty of distributed power generation, as well as the difficulty of solving high-dimensional mathematical models in microgrid energy optimization, a microgrid energy optimization management method based on an improved soft actor-critic algorithm is proposed. The improved soft actor-critic algorithm employs an entropy-based objective function that encourages exploration without assigning significantly higher probabilities to any part of the action space. This simplifies the analysis of distributed generation variability and uncertainty while effectively mitigating the convergence fragility encountered in solving the high-dimensional mathematical model of microgrid energy management. The effectiveness of the proposed method is validated through a case study of microgrid energy optimization management. The results reveal increases of 51.20%, 52.38%, 13.43%, 16.50%, 58.26%, and 36.33% in the total profit of a microgrid compared with the deep Q-network algorithm, the state-action-reward-state-action algorithm, the proximal policy optimization algorithm, an ant-colony-based algorithm, a microgrid energy optimization management strategy based on a genetic algorithm and a fuzzy inference system, and the theoretical retailer strategy, respectively. Moreover, compared with these methods and strategies, the proposed method learns more optimal microgrid energy management behaviors and better anticipates fluctuations in electricity prices and demand.
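The entropy-based objective underlying the soft actor-critic family can be illustrated with a minimal sketch. The function below computes the entropy-regularized ("soft") state value V(s) = E[Q(s,a)] + αH(π(·|s)) for a discrete toy policy; the function name, action set, and numbers are illustrative assumptions, not the authors' implementation, which operates on the microgrid model described in the paper.

```python
import numpy as np

def soft_value(q_values, probs, alpha):
    """Entropy-regularized (soft) state value used in soft actor-critic:
    V(s) = E_{a~pi}[Q(s,a)] + alpha * H(pi(.|s)).
    A larger temperature alpha rewards more uniform, exploratory policies."""
    q_values = np.asarray(q_values, dtype=float)
    probs = np.asarray(probs, dtype=float)
    entropy = -np.sum(probs * np.log(probs))          # H(pi(.|s))
    return float(np.sum(probs * q_values) + alpha * entropy)

# Toy illustration: three candidate dispatch actions with identical returns.
q = [1.0, 1.0, 1.0]
uniform = [1 / 3, 1 / 3, 1 / 3]     # spreads probability evenly
greedy = [0.98, 0.01, 0.01]         # concentrates on one action
# With equal Q-values, the entropy bonus favors the uniform policy,
# so no part of the action space receives a significantly higher probability.
assert soft_value(q, uniform, alpha=0.2) > soft_value(q, greedy, alpha=0.2)
```

In the full algorithm this entropy term enters both the critic targets and the actor loss, which is what drives the broad exploration the abstract describes.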

Keywords: Energy optimization management; Electricity rate; Microgrid; Reinforcement learning; Soft actor-critic algorithm


