🤖 AI Summary
Frequent retraining of machine learning models incurs substantial energy consumption and environmental impact, necessitating sustainable maintenance strategies that jointly optimize energy efficiency and predictive accuracy.
Method: This paper systematically quantifies the energy–accuracy trade-offs across retraining paradigms: full vs. incremental, and time-triggered vs. event-triggered. It proposes a lightweight retraining framework guided by both data recency and change awareness, and introduces an on-demand retraining paradigm grounded in robust data-drift detection.
Contribution/Results: Empirical benchmarking across multiple models and datasets, combined with fine-grained energy monitoring and incremental learning, demonstrates that retraining solely on recent data reduces energy consumption by up to 25%; incorporating robust drift detection achieves up to a 40% energy reduction without compromising accuracy. The work establishes a reproducible methodology and empirical foundation for designing green AI systems.
📝 Abstract
The reliability of machine learning (ML) software systems is heavily influenced by changes in data over time. For that reason, ML systems require regular maintenance, typically based on model retraining. However, retraining imposes significant computational demands, making it energy-intensive and raising concerns about its environmental impact. To understand which retraining techniques should be considered when designing sustainable ML applications, we study the energy consumption of common retraining techniques. Since the accuracy of ML systems is also essential, we compare retraining techniques in terms of both energy efficiency and accuracy. We show that retraining on only the most recent data, rather than all available data, reduces energy consumption by up to 25%, offering a sustainable alternative to the status quo. Furthermore, our findings show that retraining a model only when there is evidence that updates are necessary, rather than on a fixed schedule, can reduce energy consumption by up to 40%, provided a reliable data change detector is in place. Our findings pave the way for better recommendations for ML practitioners, guiding them toward more energy-efficient retraining techniques when designing sustainable ML software systems.
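To make the event-triggered idea concrete, the sketch below shows a minimal on-demand retraining loop driven by a simple mean-shift drift detector over a sliding window of prediction errors. This is an illustrative toy, not the paper's detector: the `MeanShiftDetector` class, its `window` and `threshold` parameters, and the simulated error stream are all assumptions introduced here for demonstration.

```python
from collections import deque


class MeanShiftDetector:
    """Toy drift detector: flags drift when the mean of the most recent
    window of values departs from a previously established baseline mean.
    (Illustrative only; not the detector evaluated in the paper.)"""

    def __init__(self, window=50, threshold=0.25):
        self.window = deque(maxlen=window)  # sliding window of recent values
        self.reference_mean = None          # baseline mean; None until established
        self.threshold = threshold          # allowed deviation from the baseline

    def update(self, value):
        """Feed one value (e.g. a per-sample prediction error).
        Returns True exactly when drift is signalled."""
        self.window.append(value)
        if len(self.window) < self.window.maxlen:
            return False  # not enough data yet
        mean = sum(self.window) / len(self.window)
        if self.reference_mean is None:
            self.reference_mean = mean  # establish (or re-establish) the baseline
            return False
        if abs(mean - self.reference_mean) > self.threshold:
            # Drift detected: clear the window and re-baseline on post-drift data.
            self.reference_mean = None
            self.window.clear()
            return True
        return False


# Simulated prediction-error stream: stable for 200 samples, then a shift
# (a stand-in for concept drift degrading the deployed model).
stream = [0.05] * 200 + [0.60] * 200

detector = MeanShiftDetector(window=50, threshold=0.25)
retrain_points = [i for i, err in enumerate(stream) if detector.update(err)]
print(retrain_points)  # retraining is triggered once, shortly after the shift
```

Under a fixed schedule, this 400-sample stream might trigger several retrainings regardless of need; here the model is retrained only once, when the error distribution actually shifts, which is the source of the energy savings the abstract attributes to event-triggered retraining (given a reliable detector).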