🤖 AI Summary
This study addresses the challenge of jointly optimizing electric vehicle (EV) charging and residential energy management for prosumer households with high photovoltaic (PV) penetration but limited battery storage capacity, with the aim of alleviating grid stress and supporting system decarbonization. Using hourly real-world data from 90 households in German-speaking countries over a full year, augmented with synthetic simulations, we comparatively evaluate deep reinforcement learning (DRL), model predictive control (MPC), and rule-based strategies. Our key contribution is the first empirical demonstration that, under low-battery-capacity conditions, DRL significantly improves PV self-consumption (+12.3%) and cost savings (+9.8%) over MPC and rule-based baselines. Critical control levers identified include frequent short-duration EV charging, early grid connection, and prioritized use of PV surplus. Conversely, algorithmic gains vanish as battery capacity increases, thereby empirically delineating the operational boundary of DRL's effectiveness in residential prosumer energy management.
📝 Abstract
Efficient energy management in prosumer households is key to alleviating grid stress in an energy transition marked by electric vehicles (EVs), renewable energy, and battery storage. However, it remains unclear how prosumer households can best optimize EV charging. Here we study real-world data from 90 households on fixed-rate electricity tariffs in German-speaking countries to investigate the potential of Deep Reinforcement Learning (DRL) and other control approaches (rule-based control, Model Predictive Control) to manage the dynamic and uncertain environment of Home Energy Management (HEM) and optimize household charging patterns. The DRL agent efficiently aligns the charging of EV and battery storage with photovoltaic (PV) surplus. We find that frequent EV charging transactions, early EV connections, and PV surplus increase optimization potential. A detailed analysis of nine households (1-hour resolution, 1 year) demonstrates that high battery capacity facilitates self-optimization; in this case, further algorithmic control shows little value. In cases with relatively low battery capacity, algorithmic control with DRL improves energy management and cost savings by a relevant margin. This result is further corroborated by our simulation of a synthetic household. We conclude that prosumer households with optimization potential would profit from DRL, thereby also benefiting the full electricity system and its decarbonization.