Interpretable Data-Driven Anomaly Detection in Industrial Processes with ExIFFI

πŸ“… 2024-05-02
πŸ›οΈ International Forum on Research and Technologies for Society and Industry Leveraging a better tomorrow
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
To address the lack of interpretability in industrial process anomaly detection, this paper presents the first adaptation of the ExIFFI interpretability framework to industrial settings, unifying anomaly identification with root-cause attribution. The approach builds on the Extended Isolation Forest (EIF) by integrating quantitative feature importance estimation and instance-level counterfactual analysis, yielding fast, efficient, and locally interpretable attributions for anomaly decisions. Evaluated on two public industrial datasets, it significantly outperforms existing interpretable anomaly detection models in both explanation fidelity and computational efficiency, while maintaining high detection accuracy and real-time capability. Its core contributions are: (i) the first industrial application of ExIFFI to anomaly detection; (ii) overcoming the limitations of black-box decision-making; and (iii) meeting Industry 5.0's demand for trustworthy, human-understandable AI through transparent, actionable explanations.
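The summary above describes attributing an anomaly score to individual features on top of an isolation-forest model. ExIFFI's exact depth-based importance computation is not reproduced here; the following is a minimal sketch of the general idea using scikit-learn's standard `IsolationForest` and a simple median-ablation proxy for per-feature attribution (the ablation scheme is an illustrative assumption, not the paper's method):

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic data: 4 features, with sample 0 made anomalous via feature 0 only
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
X[0] = [8.0, 0.0, 0.0, 0.0]

model = IsolationForest(random_state=0).fit(X)
base = model.score_samples(X[:1])[0]  # higher score = more normal

# Attribution proxy: replace each feature of the anomalous point with the
# dataset median and measure how much the anomaly score recovers.
# (Illustrative ablation only -- NOT the actual ExIFFI algorithm.)
med = np.median(X, axis=0)
importances = []
for j in range(X.shape[1]):
    x_abl = X[0].copy()
    x_abl[j] = med[j]
    importances.append(model.score_samples(x_abl[None, :])[0] - base)

print(importances)  # feature 0 should dominate the attribution
```

A larger score recovery when ablating a feature suggests that feature drove the anomaly decision, which is the same kind of instance-level, root-cause-oriented output the paper's explanations provide, only computed far more crudely.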

πŸ“ Abstract
Anomaly detection (AD) is a crucial process often required in industrial settings. Anomalies can signal underlying issues within a system, prompting further investigation. Industrial processes aim to streamline operations as much as possible, encompassing the production of the final product, making AD an essential means to reach this goal. Conventional anomaly detection methodologies typically classify observations as either normal or anomalous without providing insight into the reasons behind these classifications. Consequently, in light of the emergence of Industry 5.0, a more desirable approach involves providing interpretable outcomes, enabling users to understand the rationale behind the results. This paper presents the first industrial application of ExIFFI, a recently developed approach focused on the production of fast and efficient explanations for the Extended Isolation Forest (EIF) anomaly detection method. ExIFFI is tested on two publicly available industrial datasets, demonstrating superior effectiveness in explanations and computational efficiency with respect to other state-of-the-art explainable AD models.
Problem

Research questions and friction points this paper is trying to address.

Interpretable anomaly detection in industrial processes
Addressing lack of insights in conventional AD methods
Providing explainable outcomes for Industry 5.0 requirements
Innovation

Methods, ideas, or system contributions that make the work stand out.

ExIFFI for Extended Isolation Forest explanations
Tested on two publicly available industrial datasets
Superior explanation effectiveness and efficiency
πŸ”Ž Similar Papers
Davide Frizzo
UniversitΓ  degli Studi di Padova
Machine Learning · Anomaly Detection · Predictive Maintenance
Francesco Borsatti
PhD Student, University of Padua
Machine Learning · Mechatronics
Alessio Arcudi
Department of Information Engineering, University of Padova
Antonio De Moliner
Zoppas Industries
Roberto Oboe
University of Padova
Motion Control · MEMS Inertial Sensors · Rehabilitation Robotics
Gian Antonio Susto
Department of Information Engineering, University of Padova