Deep Deterministic Policy Gradient Reinforcement Learning Based Adaptive PID Load Frequency Control of an AC Micro-Grid

K Sabahi, M Jamil, et al. IEEE Canadian Journal of Electrical and Computer Engineering, 2024 (ieeexplore.ieee.org)
Proportional-integral-derivative (PID) controllers are commonly used for load frequency control (LFC) in micro-grid (MG) systems with renewable energy resources. However, fine-tuning these controllers is crucial for achieving a satisfactory closed-loop response. In this study, we employed the deep deterministic policy gradient (DDPG) reinforcement learning (RL) algorithm to adaptively adjust the PID controller parameters, taking into account the uncertain characteristics of the MG system. The DDPG agent was trained until it converged to an optimal policy that maximizes the cumulative reward. Subsequently, the trained agent was deployed online to adaptively adjust the PID controller gains for managing the fuel-cell (FC) unit, wind turbine generator (WTG), and plug-in electric vehicle (PEV) battery to meet the load demand. We conducted various simulation scenarios to compare the performance of the proposed adaptive RL-tuned PID controller with that of a fuzzy gain scheduling PID (FGSPID) controller. While both methods employ intelligent mechanisms to adjust the PID controller gains, the proposed RL-based adaptive PID controller outperformed the FGSPID controller.
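The online gain-scheduling loop described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the `policy` function here is a hand-written stand-in for the trained DDPG actor network, and the first-order frequency-deviation model, its constants, and the step load disturbance are all assumptions chosen only to make the example self-contained and runnable.

```python
class AdaptivePID:
    """PID controller whose gains are reset online by an external policy."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.dt = dt
        self.integral = 0.0
        self.prev_error = 0.0

    def set_gains(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def policy(state):
    # Stand-in for the trained DDPG actor: maps the observed state
    # (frequency deviation and accumulated error here) to PID gains.
    # In the paper this mapping is a neural network learned offline.
    df, integ = state
    kp = 2.0 + 0.5 * abs(df)  # illustrative gain schedule only
    return kp, 1.0, 0.1

# Assumed toy first-order model of area frequency deviation:
# d(Δf)/dt = (-Δf + K * (ΔP_control - ΔP_load)) / T
dt, T, K = 0.01, 2.0, 1.0
d_load = 0.2  # step load disturbance (p.u.), assumed for illustration
df = 0.0
pid = AdaptivePID(2.0, 1.0, 0.1, dt)
for _ in range(5000):  # 50 s of simulated time
    error = 0.0 - df  # regulate Δf back to zero
    pid.set_gains(*policy((df, pid.integral)))
    u = pid.step(error)  # aggregate control effort (FC/WTG/PEV units)
    df += dt * (-df + K * (u - d_load)) / T

print(abs(df) < 1e-2)  # integral action rejects the constant load step
```

With the assumed constants the closed loop is stable and the integral term absorbs the constant disturbance, so Δf settles near zero; a trained actor would instead shape the gains to the reward used during offline training.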