🤖 AI Summary
This work addresses the limitations of conventional self-supervised methods in time series modeling, which employ fixed masking ratios and struggle to adapt to varying noise levels, thereby constraining representational capacity. To overcome this, we propose the Flow-Guided Neural Operator (FGNO), which integrates neural operators with flow matching to learn multiscale time series representations in function space. FGNO treats noise intensity as a learnable degree of freedom, enhancing model flexibility while enabling deterministic, high-precision inference from clean inputs alone. Evaluated on three biomedical benchmarks—BrainTreeBank, DREAMT, and SleepEDF—FGNO substantially outperforms existing approaches, achieving up to a 35% improvement in AUROC, a 16% reduction in RMSE, and gains exceeding 20% in both accuracy and macro-F1 score.
📝 Abstract
Self-supervised learning (SSL) is a powerful paradigm for learning from unlabeled time-series data. However, popular methods such as masked autoencoders (MAEs) rely on reconstructing inputs from a fixed, predetermined masking ratio. Instead of this static design, we propose treating the corruption level as a new degree of freedom for representation learning, enhancing flexibility and performance. To achieve this, we introduce the Flow-Guided Neural Operator (FGNO), a novel framework combining operator learning with flow matching for SSL training. FGNO learns mappings in function spaces by using the Short-Time Fourier Transform to unify different time resolutions. We extract a rich hierarchy of features by tapping into different network layers and flow times that apply varying strengths of noise to the input data. This enables the extraction of versatile representations, from low-level patterns to high-level global features, using a single model adaptable to specific tasks. Unlike prior generative SSL methods that use noisy inputs during inference, we propose using clean inputs for representation extraction while learning representations with noise; this eliminates randomness and boosts accuracy. We evaluate FGNO across three biomedical domains, where it consistently outperforms established baselines. Our method yields up to 35% AUROC gains in neural signal decoding (BrainTreeBank), 16% RMSE reductions in skin temperature prediction (DREAMT), and over 20% improvements in accuracy and macro-F1 on SleepEDF under low-data regimes. These results highlight FGNO's robustness to data scarcity and its superior capacity to learn expressive representations for diverse time series.
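The two ideas the abstract leans on, flow-time noise as a tunable corruption level during training and deterministic extraction from clean inputs at inference, can be illustrated with a minimal NumPy sketch. This is not the FGNO implementation: the linear interpolation path is the standard flow-matching construction, but the `encode` function below is a toy stand-in for the neural operator (the STFT front-end and all training machinery are omitted), and every name in it is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def flow_interpolate(x_clean, noise, t):
    """Standard flow-matching path: linearly blend clean signal and noise.
    t in [0, 1] is the corruption level; t = 0 returns the clean input."""
    return (1.0 - t) * x_clean + t * noise

# Toy "encoder": a fixed random projection standing in for the neural operator.
W = rng.normal(size=(64, 16))

def encode(x_t, t):
    # A real model would condition on the flow time t inside the network;
    # here we simply append t as an extra feature for illustration.
    h = np.tanh(x_t @ W)
    return np.concatenate([h, np.full((x_t.shape[0], 1), t)], axis=1)

# --- Training-style pass: features seen under varying corruption strengths ---
x = rng.normal(size=(8, 64))                # batch of flattened time-series windows
for t in (0.1, 0.5, 0.9):                   # different flow times = noise levels
    x_t = flow_interpolate(x, rng.normal(size=x.shape), t)
    feats = encode(x_t, t)                  # one feature set per flow time

# --- Inference-style pass: clean input only (t = 0), hence deterministic ---
rep_a = encode(flow_interpolate(x, np.zeros_like(x), 0.0), 0.0)
rep_b = encode(flow_interpolate(x, np.zeros_like(x), 0.0), 0.0)
assert np.allclose(rep_a, rep_b)            # no sampling noise at extraction time
```

The contrast between the two passes is the point: the training loop exposes the encoder to a continuum of corruption levels, while representation extraction fixes t = 0 so repeated calls on the same input agree exactly.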