L2GTX: From Local to Global Time Series Explanations

📅 2026-03-13
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing explainable AI methods struggle to generate global explanations for time series classification that simultaneously capture temporal dependencies and recurring cross-instance patterns; most are either limited to specific models or unsuited to temporal data. This work proposes L2GTX, the first model-agnostic framework for global time series explanation. L2GTX first produces local explanations via LOMATCE, extracting and parameterising temporal event primitives such as trends and extrema, and then constructs compact, human-readable, high-fidelity class-level global explanations through cross-instance clustering, redundancy reduction, and selection of representative instances. Experiments on six benchmark datasets show that L2GTX substantially improves explanation conciseness and interpretability while maintaining stable global fidelity.
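The event primitives mentioned above (increasing/decreasing trends and local extrema) can be illustrated with a minimal sketch. This is not LOMATCE itself; the segmentation rule, the `min_len` threshold, and the `(start, end, slope)` parameterisation are simplifying assumptions for illustration.

```python
import numpy as np

def extract_event_primitives(series, min_len=3):
    """Segment a 1-D series into monotone trend events and local extrema.

    Hypothetical simplification: a trend event is parameterised by
    (start, end, slope); an extremum is marked wherever the sign of the
    first difference changes.
    """
    x = np.asarray(series, dtype=float)
    diffs = np.sign(np.diff(x))
    events, start = [], 0
    for i in range(1, len(diffs)):
        if diffs[i] != diffs[i - 1]:          # direction change ends a trend
            if i - start + 1 >= min_len:
                slope = (x[i] - x[start]) / (i - start)
                kind = "increasing" if slope > 0 else "decreasing"
                events.append({"type": kind, "start": start, "end": i,
                               "slope": slope})
            events.append({"type": "extremum", "index": i})
            start = i
    if len(x) - start >= min_len:             # close the final trend segment
        slope = (x[-1] - x[start]) / (len(x) - 1 - start)
        kind = "increasing" if slope > 0 else "decreasing"
        events.append({"type": kind, "start": start, "end": len(x) - 1,
                       "slope": slope})
    return events
```

On a triangle-shaped series such as `[0, 1, 2, 3, 2, 1, 0, 1, 2, 3]`, this yields an increasing trend, an extremum, a decreasing trend, an extremum, and a final increasing trend.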

📝 Abstract
Deep learning models achieve high accuracy in time series classification, yet understanding their class-level decision behaviour remains challenging. Explanations for time series must respect temporal dependencies and identify patterns that recur across instances. Existing approaches face three limitations: model-agnostic XAI methods developed for images and tabular data do not readily extend to time series, global explanation synthesis for time series remains underexplored, and most existing global approaches are model-specific. We propose L2GTX, a model-agnostic framework that generates class-wise global explanations by aggregating local explanations from a representative set of instances. L2GTX extracts clusters of parameterised temporal event primitives, such as increasing or decreasing trends and local extrema, together with their importance scores from instance-level explanations produced by LOMATCE. These clusters are merged across instances to reduce redundancy, and an instance-cluster importance matrix is used to estimate global relevance. Under a user-defined instance selection budget, L2GTX selects representative instances that maximise coverage of influential clusters. Events from the selected instances are then aggregated into concise class-wise global explanations. Experiments on six benchmark time series datasets show that L2GTX produces compact and interpretable global explanations while maintaining stable global faithfulness measured as mean local surrogate fidelity.
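The selection step described in the abstract, choosing representative instances under a budget so as to maximise coverage of influential clusters, can be sketched as a greedy weighted-coverage procedure over the instance-cluster importance matrix. The matrix layout, the greedy rule, and the "max importance covered so far" notion of coverage are assumptions, not the paper's exact algorithm.

```python
import numpy as np

def select_representatives(importance, budget):
    """Greedily pick up to `budget` instances whose rows of the
    instance-cluster importance matrix add the most not-yet-covered
    cluster importance (coverage of a cluster = max importance
    contributed by any chosen instance so far)."""
    M = np.asarray(importance, dtype=float)   # shape (n_instances, n_clusters)
    covered = np.zeros(M.shape[1])
    chosen = []
    for _ in range(min(budget, M.shape[0])):
        # marginal gain of each instance over current coverage
        gains = np.maximum(M - covered, 0).sum(axis=1)
        gains[chosen] = -1.0                  # never re-pick an instance
        best = int(np.argmax(gains))
        if gains[best] <= 0:                  # stop early: nothing left to cover
            break
        chosen.append(best)
        covered = np.maximum(covered, M[best])
    return chosen
```

With `importance = [[1, 0, 0], [0, 1, 1], [1, 1, 1]]` and a budget of 2, the third instance alone already covers every cluster, so the procedure stops after one pick, which is the compactness behaviour the abstract describes.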
Problem

Research questions and friction points this paper is trying to address.

time series explanation
global explanation
model-agnostic XAI
temporal dependencies
class-level interpretation
Innovation

Methods, ideas, or system contributions that make the work stand out.

time series explanation
global explanation
model-agnostic XAI
temporal event primitives
instance selection