Learning Soft Sparse Shapes for Efficient Time-Series Classification

📅 2025-05-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing shapelet-based time series classification methods rely on hard-thresholded feature selection, which risks discarding informative subsequences and ignores the heterogeneous contributions of different shapelets to classification. This paper proposes a soft-sparse shapelet modeling framework that abandons binary selection, instead retaining and differentially weighting all candidate subsequences. Our approach features: (1) dual modules for soft shapelet sparsification and soft shapelet learning; (2) a learnable, class-specific router that governs an expert network for fine-grained, interpretable pattern modeling; and (3) end-to-end differentiable shapelet optimization integrating soft attention weighting, class-conditional routing, shared experts, and shapelet-to-sequence transformation. Evaluated on multiple multiclass benchmark datasets, our method achieves significant improvements over state-of-the-art approaches, simultaneously attaining higher classification accuracy and clear, shapelet-level interpretability.

📝 Abstract
Shapelets are discriminative subsequences (or shapes) with high interpretability in time series classification. Due to the time-intensive nature of shapelet discovery, existing shapelet-based methods mainly focus on selecting discriminative shapes while discarding others to achieve candidate subsequence sparsification. However, this approach may exclude beneficial shapes and overlook the varying contributions of shapelets to classification performance. To this end, we propose a Soft sparse Shapes (SoftShape) model for efficient time series classification. Our approach mainly introduces soft shape sparsification and soft shape learning blocks. The former transforms shapes into soft representations based on classification contribution scores, merging lower-scored ones into a single shape to retain and differentiate all subsequence information. The latter facilitates intra- and inter-shape temporal pattern learning, improving model efficiency by using sparsified soft shapes as inputs. Specifically, we employ a learnable router to activate a subset of class-specific expert networks for intra-shape pattern learning. Meanwhile, a shared expert network learns inter-shape patterns by converting sparsified shapes into sequences. Extensive experiments show that SoftShape outperforms state-of-the-art methods and produces interpretable results.
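The soft shape sparsification step described in the abstract (score each candidate shape, keep high scorers as soft-weighted representations, and merge the low scorers into a single shape) can be sketched roughly as follows. This is a minimal NumPy illustration under assumptions of my own, not the authors' implementation; the function name, the softmax scoring, and the weighted-average merge rule are all illustrative.

```python
import numpy as np

def soft_sparsify(shapes, scores, top_k):
    """Toy sketch of soft shape sparsification.

    shapes: (n, d) array of shape (subsequence) embeddings
    scores: (n,) classification-contribution scores
    top_k:  number of high-scoring shapes kept individually

    High-scoring shapes are kept and softly weighted by their
    softmax score; the remaining low-scoring shapes are merged into
    one fused shape, so no subsequence is discarded outright.
    """
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                   # softmax over scores
    order = np.argsort(scores)[::-1]
    keep, rest = order[:top_k], order[top_k:]
    kept = shapes[keep] * weights[keep, None]  # soft-weighted shapes
    # merge all low-scoring shapes into a single weighted shape
    merged = (shapes[rest] * weights[rest, None]).sum(axis=0, keepdims=True)
    return np.vstack([kept, merged])           # (top_k + 1, d)
```

The output always has `top_k + 1` rows, which is what makes the downstream learning blocks cheaper than attending over every candidate subsequence.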
Problem

Research questions and friction points this paper is trying to address.

Hard selection of shapelets risks discarding beneficial subsequences
Existing sparsification ignores the varying contributions of shapelets to classification
Shapelet discovery is time-intensive, limiting efficiency and interpretability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Soft shape sparsification retains all subsequence information
Learnable router activates class-specific expert networks
Shared expert network learns inter-shape patterns
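The learnable-router idea in the bullets above follows the familiar mixture-of-experts pattern: a softmax gate scores each expert for a given shape, only the top-k experts are activated, and their outputs are combined by the renormalized gate weights. The sketch below is a hypothetical NumPy illustration of that pattern; the names `route_to_experts`, `gate_w`, and the expert callables are assumptions, not the paper's API.

```python
import numpy as np

def route_to_experts(shape_emb, gate_w, experts, top_k=2):
    """Toy sketch of a learnable router over expert networks.

    shape_emb: (d,) embedding of one shape
    gate_w:    (n_experts, d) router weight matrix
    experts:   list of callables, each mapping (d,) -> (d,)
    top_k:     number of experts activated per shape
    """
    logits = gate_w @ shape_emb
    gate = np.exp(logits - logits.max())
    gate /= gate.sum()                      # softmax over experts
    top = np.argsort(gate)[::-1][:top_k]    # activate only top-k experts
    w = gate[top] / gate[top].sum()         # renormalize kept gate weights
    # weighted combination of the activated experts' outputs
    return sum(wi * experts[i](shape_emb) for wi, i in zip(w, top))
```

Because only `top_k` experts run per shape, routing keeps the per-input cost low while still letting different experts specialize in class-specific intra-shape patterns.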
Zhen Liu
Institute for Infocomm Research, Agency for Science, Technology and Research, Singapore
Yicheng Luo
School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
Boyuan Li
School of Computer Science and Engineering, South China University of Technology, Guangzhou, China
Emadeldeen Eldele
Assistant Professor, Khalifa University
time series · self-supervised learning · deep learning · domain adaptation · EEG
Min Wu
Professor, IEEE Fellow, China University of Geosciences
Process control · Robust control · Intelligent systems
Qianli Ma
School of Computer Science and Engineering, South China University of Technology, Guangzhou, China