LAST SToP For Modeling Asynchronous Time Series

📅 2025-02-04
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the challenge of effectively modeling asynchronous time series with large language models (LLMs). The authors propose an event-driven, cross-task analytical framework. Methodologically, they introduce (1) a natural-language-based event encoding scheme designed for asynchronous time series, which jointly maps irregular timestamps and event semantics into LLM-compatible textual inputs; and (2) Stochastic Soft Prompting, a lightweight prompt-tuning technique that learns prompt representations while keeping the LLM's weights frozen, thereby avoiding full-parameter or QLoRA-based fine-tuning. The approach achieves state-of-the-art performance across three core tasks—forecasting, anomaly detection, and imputation—on multiple real-world asynchronous time-series benchmarks, consistently outperforming existing fine-tuning methods.
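The paper's exact prompt template is not reproduced in this summary, so the following is only a minimal sketch of the encoding idea: each event is assumed to be a (timestamp, description) pair, and the irregular gap between consecutive events is spelled out in words so an LLM can reason over it. The function name `encode_events` and the example events are illustrative, not from the paper.

```python
# Hedged sketch of a natural-language event encoding for asynchronous
# time series. Assumption: events arrive as (ISO-8601 timestamp,
# free-text description) pairs; the paper's actual template may differ.
from datetime import datetime

def encode_events(events):
    """Render timestamped events as natural-language lines.

    The elapsed time between consecutive events is written out
    explicitly so the LLM sees the irregular spacing as text.
    """
    lines = []
    prev = None
    for ts, desc in events:
        t = datetime.fromisoformat(ts)
        if prev is None:
            lines.append(f"At {t:%Y-%m-%d %H:%M}, {desc}.")
        else:
            gap_hours = (t - prev).total_seconds() / 3600
            lines.append(f"{gap_hours:.1f} hours later, {desc}.")
        prev = t
    return "\n".join(lines)

prompt = encode_events([
    ("2024-01-01T08:00", "server CPU spiked to 95%"),
    ("2024-01-01T09:30", "auto-scaler added two nodes"),
    ("2024-01-02T09:30", "traffic returned to baseline"),
])
print(prompt)
```

Spelling out the gaps ("1.5 hours later") rather than printing raw timestamps is one way to let the model's world knowledge about durations do some of the reasoning.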

📝 Abstract
We present a novel prompt design for Large Language Models (LLMs) tailored to asynchronous time series. Unlike regular time series, which assume values at evenly spaced time points, asynchronous time series consist of timestamped events occurring at irregular intervals, each described in natural language. Our approach effectively utilizes the rich natural language of event descriptions, allowing LLMs to benefit from their broad world knowledge for reasoning across different domains and tasks. This allows us to extend the scope of asynchronous time series analysis beyond forecasting to include tasks like anomaly detection and data imputation. We further introduce Stochastic Soft Prompting, a novel prompt-tuning mechanism that significantly improves model performance, outperforming existing fine-tuning methods such as QLoRA. Through extensive experiments on real-world datasets, we demonstrate that our approach achieves state-of-the-art performance across different tasks and datasets.
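The abstract names Stochastic Soft Prompting but does not spell out its mechanics here, so the sketch below rests on an assumption: that a learnable soft prompt is prepended to the input embeddings while the LLM stays frozen, and that a random-length prefix of that prompt is sampled at each training step to regularize it. The names `soft_prompt` and `stochastic_prompt_batch`, and the shapes, are illustrative only.

```python
# Hedged sketch of the prompt-tuning idea behind Stochastic Soft
# Prompting (StoP). Assumption: during training, a random-length prefix
# of the learnable soft prompt is prepended to the token embeddings;
# at evaluation time the full prompt is used. The LLM weights are frozen.
import numpy as np

rng = np.random.default_rng(0)

PROMPT_LEN, EMBED_DIM = 8, 4
# Learnable soft-prompt embeddings (the only trainable parameters here).
soft_prompt = rng.normal(size=(PROMPT_LEN, EMBED_DIM))

def stochastic_prompt_batch(input_embeds, training=True):
    """Prepend a (randomly truncated, during training) soft prompt."""
    k = int(rng.integers(1, PROMPT_LEN + 1)) if training else PROMPT_LEN
    return np.concatenate([soft_prompt[:k], input_embeds], axis=0)

tokens = rng.normal(size=(5, EMBED_DIM))   # stand-in for token embeddings
train_seq = stochastic_prompt_batch(tokens)                 # 6..13 rows
eval_seq = stochastic_prompt_batch(tokens, training=False)  # 13 rows
```

Because only `soft_prompt` would receive gradients, this family of methods sidesteps full-parameter and QLoRA-style fine-tuning, which matches the paper's stated comparison.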
Problem

Research questions and friction points this paper is trying to address.

Modeling asynchronous time series with LLMs
Extending analysis beyond forecasting to anomaly detection and imputation
Improving performance with Stochastic Soft Prompting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Novel prompt design for LLMs
Stochastic Soft Prompting mechanism
Extends asynchronous time-series analysis to anomaly detection and imputation
Shubham Gupta
Université Laval
Thibaut Durand
Machine Learning Researcher @ Borealis AI
Machine Learning · Computer Vision · Deep Learning · Weakly Supervised Learning · Temporal Modeling
Graham Taylor
Vector Institute
Lilian W. Białokozowicz
RBC Borealis