🤖 AI Summary
This study addresses sensor-induced data poisoning in training datasets for inter-turn short-circuit fault localization in power transformers, where the influence of contaminated data is difficult to remove from an already-trained model without costly full retraining. To tackle this issue, the work introduces the SISA (Sharded, Isolated, Sliced, and Aggregated) framework, originally developed for machine unlearning, into the domain of power equipment fault diagnosis for the first time. By partitioning the training data into shards and training submodels in isolation, the approach confines each data point's influence to specific submodels, so that when contaminated data are detected, only the affected submodels need to be retrained rather than the entire model. Experimental results on simulated data demonstrate that the proposed method matches the diagnostic accuracy of full retraining while substantially reducing retraining time, thereby enhancing both model robustness and update efficiency.
📝 Abstract
In practical data-driven applications in electrical equipment fault diagnosis, training data can be poisoned by sensor failures, which can severely degrade the performance of machine learning (ML) models. However, once an ML model has been trained, removing the influence of such harmful data is challenging, as full retraining is both computationally intensive and time-consuming. To address this challenge, this paper proposes a SISA (Sharded, Isolated, Sliced, and Aggregated)-based machine unlearning (MU) framework for power transformer inter-turn short-circuit fault (ITSCF) localization. The SISA method partitions the training data into shards and slices, ensuring through independent training that the influence of each data point is isolated within specific constituent models. When poisoned data are detected, only the affected shards are retrained, avoiding retraining the entire model from scratch. Experiments on simulated ITSCF conditions demonstrate that the proposed framework achieves almost identical diagnostic accuracy to full retraining while significantly reducing retraining time.
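The shard/isolate/aggregate/selective-retrain cycle described above can be illustrated with a minimal sketch. This is not the paper's implementation: the nearest-centroid constituent model, the class names, and the toy data are assumptions chosen for brevity, and the slicing step of SISA (checkpointing within each shard to shorten retraining further) is omitted.

```python
from collections import Counter

class CentroidModel:
    """Toy constituent model: nearest-centroid classifier (an assumption;
    any base learner could be trained per shard)."""
    def fit(self, data):
        # data: list of (feature_tuple, label) pairs
        sums, counts = {}, {}
        for x, y in data:
            s = sums.setdefault(y, [0.0] * len(x))
            for i, v in enumerate(x):
                s[i] += v
            counts[y] = counts.get(y, 0) + 1
        self.centroids = {y: [v / counts[y] for v in s] for y, s in sums.items()}
        return self

    def predict(self, x):
        dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda y: dist(self.centroids[y]))

class SISA:
    """Shard the data, train isolated submodels, aggregate by majority vote."""
    def __init__(self, n_shards):
        self.n_shards = n_shards

    def fit(self, data):
        # Partition training data into disjoint shards; each submodel
        # sees only its own shard, isolating every point's influence.
        self.shards = [data[i::self.n_shards] for i in range(self.n_shards)]
        self.models = [CentroidModel().fit(s) for s in self.shards]
        return self

    def predict(self, x):
        # Aggregate constituent predictions by majority vote.
        return Counter(m.predict(x) for m in self.models).most_common(1)[0][0]

    def unlearn(self, bad_points):
        # Retrain only the shards that contain poisoned points.
        bad = set(bad_points)
        retrained = 0
        for i, shard in enumerate(self.shards):
            if any(p in bad for p in shard):
                self.shards[i] = [p for p in shard if p not in bad]
                self.models[i] = CentroidModel().fit(self.shards[i])
                retrained += 1
        return retrained

# Hypothetical two-class demo standing in for healthy/faulty sensor readings.
data = [((0.0, 0.0), "healthy"), ((0.1, 0.2), "healthy"), ((0.2, 0.1), "healthy"),
        ((1.0, 1.0), "fault"),   ((0.9, 1.1), "fault"),   ((1.1, 0.9), "fault")]
sisa = SISA(n_shards=3).fit(data)
# Removing one poisoned point triggers retraining of a single shard only.
n_retrained = sisa.unlearn([((0.9, 1.1), "fault")])  # n_retrained == 1
```

The key efficiency argument of SISA is visible here: unlearning one point costs one shard's retraining (1/3 of the data in this sketch) rather than a full retrain, while the majority-vote aggregation keeps overall predictions stable.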