Reinforcement Learning based Proactive Control for Enabling Power Grid Resilience to Wildfire


Operating an industrial electric power grid during an extreme event requires decision-making by human operators under stressful conditions. Data-driven decision making during adverse dynamic events, especially when the event is forecasted, should be supplemented by intelligent proactive control. Operating a power transmission system during a wildfire requires resiliency-driven proactive control of load shedding, line switching, and resource allocation that accounts for the dynamics of the wildfire and of failure propagation, so as to minimize the impact on the system. However, the number of possible line- and load-switching combinations in an extensive industrial system makes traditional prediction-driven and stochastic approaches computationally intractable, so operators often resort to pre-planned or greedy algorithms. In this work, we model and solve the proactive control problem as a Markov decision process and introduce an integrated testbed for spatio-temporal wildfire propagation and proactive power-system operation. Our approach allows the controller to provide setpoints for all generation fleets in the power grid. We evaluate our approach using an IEEE test system mapped onto a hypothetical terrain. Our results show that the proposed approach can help the operator reduce load outages during an extreme event: it reduces power flow through lines that are about to be de-energized and meets load demand by increasing power flow through other lines.
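To make the Markov-decision-process framing concrete, here is a minimal, purely illustrative sketch of proactive line switching ahead of an advancing wildfire front, solved by value iteration. The state space, line count, load values, fire dynamics, and penalty factor are all invented assumptions for illustration, not the model or algorithm used in the paper (which handles a full power system and learned policies).

```python
# Hypothetical toy MDP: choose which transmission line (if any) to proactively
# de-energize before a wildfire front reaches it. All sizes, loads, and costs
# below are illustrative assumptions, not values from the paper.
import itertools

LINES = 3                       # toy transmission lines in the fire's path
FIRE_STEPS = 4                  # discretized positions of the wildfire front
LOAD = [5.0, 3.0, 2.0]          # MW served via each line (assumed)

def step(state, action):
    """State: (fire_position, energized flags). Action: line index to
    de-energize, or LINES meaning 'do nothing'. Returns (next_state, reward)."""
    fire, flags = state
    flags = list(flags)
    if action < LINES:
        flags[action] = 0                       # planned, proactive switch-off
    # The fire front advances; an energized line it reaches fails
    # uncontrolled, which costs more than a planned de-energization.
    penalty = 0.0
    if fire < LINES and flags[fire] == 1:
        flags[fire] = 0
        penalty = 2.0 * LOAD[fire]              # uncontrolled-failure cost
    served = sum(LOAD[i] for i in range(LINES) if flags[i])
    return (min(fire + 1, FIRE_STEPS), tuple(flags)), served - penalty

def value_iteration(gamma=0.95, iters=200):
    """Tabular value iteration over the tiny state space."""
    states = [(f, flags) for f in range(FIRE_STEPS + 1)
              for flags in itertools.product((0, 1), repeat=LINES)]
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        for s in states:
            if s[0] == FIRE_STEPS:              # terminal: fire has passed
                continue
            V[s] = max(r + gamma * V[s2]
                       for s2, r in (step(s, a) for a in range(LINES + 1)))
    return V

V = value_iteration()
start = (0, (1, 1, 1))                          # all lines energized, fire at 0
best = max(range(LINES + 1),
           key=lambda a: step(start, a)[1] + 0.95 * V[step(start, a)[0]])
```

In this toy instance the optimal first action is to de-energize the line the fire reaches next, mirroring the abstract's point that reducing flow through soon-to-fail lines beats reacting after an uncontrolled failure; the paper's contribution is making this tractable at grid scale, where tabular value iteration is infeasible.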

IEEE Transactions on Industrial Informatics, Accepted for Publication
Impact Factor: 11.65 (2023)
Salah Uddin Kadir
Ph.D. student
Aron Laszka
Assistant Professor