🤖 AI Summary
This paper presents the first systematic study of classification for interval-valued time series (IVTS). Because existing work focuses predominantly on regression and neglects classification, we propose an end-to-end learnable convex-combination modeling framework: interval bounds are adaptively weighted and fused into point-valued sequences, which are then transformed into images and fed into a fine-grained CNN for classification; the framework handles both univariate and multivariate IVTS uniformly. Key contributions include: (1) the first dedicated IVTS classification framework; (2) learnable, theoretically interpretable convex-combination weights; (3) a novel multi-class margin-based generalization bound tailored to CNNs processing IVTS; and (4) ADMM-based optimization to enhance training stability. Extensive experiments on synthetic and real-world datasets demonstrate significant performance gains over state-of-the-art point-valued time series classifiers, validating the model's effectiveness, robustness, and generalization capability.
📝 Abstract
In recent years, the modeling and analysis of interval-valued time series have garnered significant attention in econometrics and statistics. However, the existing literature focuses primarily on regression tasks while neglecting classification. In this paper, we propose an adaptive approach to interval-valued time series classification. Specifically, we represent interval-valued time series as convex combinations of the upper and lower bounds of the intervals and transform these representations into images using point-valued time series imaging methods. We then classify these images with a fine-grained image classification neural network, thereby classifying the original interval-valued time series. The proposed method applies to both univariate and multivariate interval-valued time series. On the optimization front, we treat the convex combination coefficients as learnable parameters, akin to the parameters of the neural network, and provide an efficient estimation method based on the alternating direction method of multipliers (ADMM). On the theoretical front, under specific conditions, we establish a margin-based multiclass generalization bound for generic CNNs composed of basic blocks involving convolution, pooling, and fully connected layers. Through simulation studies and real-data applications, we validate the effectiveness of the proposed method and compare its performance against a wide range of point-valued time series classification methods.
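To make the fusion step concrete, here is a minimal sketch of representing an interval-valued series as a convex combination of its bounds. This is an illustration, not the paper's implementation: the function name `convex_fuse`, the scalar parameter `theta`, and the sigmoid reparameterization are assumptions; the paper learns the coefficients jointly with the CNN via ADMM, whereas this sketch only shows the forward computation.

```python
import numpy as np

def convex_fuse(lower, upper, theta):
    """Fuse interval bounds into a point-valued series via a convex
    combination. `theta` is an unconstrained scalar; a sigmoid maps it
    to a weight alpha in (0, 1), so the output is guaranteed to lie
    between the lower and upper bound at every time step.
    (Names and the sigmoid parameterization are illustrative, not
    taken from the paper.)"""
    alpha = 1.0 / (1.0 + np.exp(-theta))  # alpha in (0, 1)
    return alpha * np.asarray(upper, dtype=float) \
        + (1.0 - alpha) * np.asarray(lower, dtype=float)

# Example: one interval-valued series of length 4.
# theta = 0 gives alpha = 0.5, i.e. the midpoint (center) series.
lower = [1.0, 2.0, 3.0, 4.0]
upper = [3.0, 4.0, 5.0, 6.0]
fused = convex_fuse(lower, upper, theta=0.0)
print(fused)  # [2. 3. 4. 5.]
```

Parameterizing the weight through a sigmoid keeps it a valid convex coefficient during gradient-based training without explicit constraints; the fused point-valued series would then be passed to an imaging transform and the CNN classifier.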