🤖 AI Summary
Existing univariate time-series foundation models (Uni-TSFMs) struggle to generalize directly to multivariate forecasting tasks. To address this limitation, this work proposes DualWeaver, a framework that uses a pair of structurally symmetric, learnable proxy sequences to model inter-variable dependencies through a shared auxiliary feature-fusion module, then maps them into Uni-TSFM-compatible univariate sequences for prediction. The framework incorporates a parameter-free reconstruction mechanism and a theoretically grounded regularization term to prevent adapter collapse and ensure stable training dynamics. Extensive experiments on multiple real-world datasets demonstrate that DualWeaver significantly outperforms current state-of-the-art methods in both forecasting accuracy and stability.
📝 Abstract
Time-series foundation models (TSFMs) have achieved strong univariate forecasting through large-scale pre-training, yet effectively extending this success to multivariate forecasting remains challenging. To address this, we propose DualWeaver, a novel framework that adapts univariate TSFMs (Uni-TSFMs) for multivariate forecasting by using a pair of learnable, structurally symmetric surrogate series. Generated by a shared auxiliary feature-fusion module that captures cross-variable dependencies, these surrogates are mapped to TSFM-compatible series via the forecasting objective. The symmetric structure enables parameter-free reconstruction of final predictions directly from the surrogates, without additional parametric decoding. A theoretically grounded regularization term is further introduced to enhance robustness against adaptation collapse. Extensive experiments on diverse real-world datasets show that DualWeaver outperforms state-of-the-art multivariate forecasters in both accuracy and stability. We release the code at https://github.com/li-jinpeng/DualWeaver.
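To make the pipeline concrete, below is a minimal NumPy sketch of the idea described in the abstract: a shared fusion module mixes cross-variable context and emits two symmetric surrogate series per variable, each surrogate is forecast by a frozen univariate backbone, and the final prediction is reconstructed from the surrogate pair without extra parameters. All names, shapes, and the averaging-based reconstruction are illustrative assumptions, not the paper's actual design; the stand-in "frozen Uni-TSFM" here is just a last-value repeater.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_fusion(n_vars, hidden=32):
    # Hypothetical shared fusion module: one hidden layer that mixes all
    # variables at each time step and emits TWO surrogate values per variable.
    w1 = rng.standard_normal((n_vars, hidden)) * 0.1
    w2 = rng.standard_normal((hidden, 2 * n_vars)) * 0.1
    def fusion(x):  # x: (seq_len, n_vars)
        h = np.tanh(x @ w1)
        return (h @ w2).reshape(x.shape[0], n_vars, 2)
    return fusion

def naive_uni_forecast(series, horizon):
    # Stand-in for a frozen univariate TSFM: repeat the last observed value.
    return np.repeat(series[-1:], horizon)

def dualweaver_sketch(x, fusion, horizon):
    # x: (seq_len, n_vars) -> prediction of shape (horizon, n_vars)
    surrogates = fusion(x)                       # (seq_len, n_vars, 2)
    _, n_vars, _ = surrogates.shape
    preds = np.empty((horizon, n_vars, 2))
    for v in range(n_vars):
        for k in range(2):                       # forecast each surrogate
            preds[:, v, k] = naive_uni_forecast(surrogates[:, v, k], horizon)
    # Parameter-free reconstruction: combine the symmetric surrogate pair
    # by averaging (an illustrative stand-in for the paper's mechanism).
    return preds.mean(axis=2)

x = rng.standard_normal((96, 7))                 # 96 steps, 7 variables
y = dualweaver_sketch(x, make_fusion(7), horizon=24)
print(y.shape)  # (24, 7)
```

In this toy setting only the fusion weights would be trainable; the univariate backbone stays frozen, which matches the adapter-style use of Uni-TSFMs the abstract describes.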