Rivaling Transformers: Multi-Scale Structured State-Space Mixtures for Agentic 6G O-RAN

📅 2025-10-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of short-term, user-level KPI forecasting for near-real-time intelligent control in 6G O-RAN—under multi-timescale dynamics and resource constraints—this paper proposes MS³M, a lightweight Multi-Scale Structured State-Space Mixture model. MS³M integrates the HiPPO-LegS kernel with bilinear discretization to capture complex temporal dynamics, while incorporating squeeze-and-excitation gating and channel shuffling to enhance expressiveness with minimal overhead. Coupled with depthwise separable convolutions and a sliding-window training strategy, it enables low-latency time-series modeling. Evaluated on a custom-built O-RAN testbed, MS³M uses only 0.7M parameters and achieves an inference latency of 57 ms—3–10× faster than Transformer baselines—while maintaining competitive prediction accuracy.

📝 Abstract
In sixth-generation (6G) Open Radio Access Networks (O-RAN), proactive control is preferable. A key open challenge is delivering control-grade predictions within Near-Real-Time (Near-RT) latency and computational constraints under multi-timescale dynamics. We therefore cast RAN Intelligent Controller (RIC) analytics as an agentic perceive-predict xApp that turns noisy, multivariate RAN telemetry into short-horizon per-User Equipment (UE) key performance indicator (KPI) forecasts to drive anticipatory control. In this regard, Transformers are powerful for sequence learning and time-series forecasting, but they are memory-intensive, which limits Near-RT RIC use. Therefore, we need models that maintain accuracy while reducing latency and data movement. To this end, we propose a lightweight Multi-Scale Structured State-Space Mixtures (MS3M) forecaster that mixes HiPPO-LegS kernels to capture multi-timescale radio dynamics. We develop stable discrete state-space models (SSMs) via bilinear (Tustin) discretization and apply their causal impulse responses as per-feature depthwise convolutions. Squeeze-and-Excitation gating dynamically reweights KPI channels as conditions change, and a compact gated channel-mixing layer models cross-feature nonlinearities without Transformer-level cost. The model is KPI-agnostic -- Reference Signal Received Power (RSRP) serves as a canonical use case -- and is trained on sliding windows to predict the immediate next step. Empirical evaluations were conducted using our bespoke O-RAN testbed KPI time-series dataset (59,441 windows across 13 KPIs). Crucially for O-RAN constraints, MS3M achieves a 0.057 s per-inference latency with 0.70M parameters, yielding 3-10x lower latency than the Transformer baselines evaluated on the same hardware, while maintaining competitive accuracy.
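The kernel construction described in the abstract — a HiPPO-LegS state matrix, discretized with the bilinear (Tustin) transform, whose causal impulse response is then used as a per-feature depthwise convolution filter — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the state size, step size, and readout vector `C` below are assumed for demonstration:

```python
import numpy as np

def hippo_legs(N):
    # HiPPO-LegS state matrix A (N x N) and input vector B (N,)
    n = np.arange(N)
    A = -np.sqrt((2 * n[:, None] + 1) * (2 * n[None, :] + 1))
    A = np.tril(A, -1) + np.diag(-(n + 1.0))  # strictly lower part + diagonal
    B = np.sqrt(2 * n + 1.0)
    return A, B

def bilinear_discretize(A, B, dt):
    # Tustin transform: Ad = (I - dt/2 A)^-1 (I + dt/2 A), Bd = (I - dt/2 A)^-1 dt B.
    # Maps the stable continuous system to a stable discrete one (|eig(Ad)| < 1).
    I = np.eye(A.shape[0])
    inv = np.linalg.inv(I - (dt / 2) * A)
    return inv @ (I + (dt / 2) * A), inv @ (dt * B)

def impulse_response(Ad, Bd, C, L):
    # Causal SSM kernel h[k] = C Ad^k Bd, truncated to length L
    h = np.empty(L)
    x = Bd
    for k in range(L):
        h[k] = C @ x
        x = Ad @ x
    return h

def depthwise_causal_conv(u, h):
    # One channel of a depthwise conv: y[t] = sum_k h[k] u[t-k]
    return np.convolve(u, h)[: len(u)]
```

Varying `dt` across mixture components is what lets such kernels cover multiple timescales: small steps yield short, fast-decaying filters and large steps yield long-memory ones.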
Problem

Research questions and friction points this paper is trying to address.

Proactive control in 6G O-RAN requires low-latency multi-timescale predictions
Transformers are too memory-intensive for Near-RT RAN Intelligent Controllers
Models must maintain accuracy while reducing latency and computational costs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-scale state-space mixtures capture radio dynamics
Bilinear discretization enables stable sequence modeling
Gated mechanisms replace Transformers to reduce latency
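The gating idea in the last bullet — Squeeze-and-Excitation reweighting of KPI channels as conditions change — can be sketched in NumPy. The weight shapes and reduction ratio here are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def squeeze_excite(x, W1, W2):
    """Reweight the KPI channels of a window x of shape (T, C).

    W1: (C//r, C) reduction weights, W2: (C, C//r) expansion weights
    (hypothetical shapes; the paper does not state its ratio r here).
    """
    s = x.mean(axis=0)                     # squeeze: summarize each channel over time
    z = np.maximum(W1 @ s, 0.0)            # excitation bottleneck with ReLU
    g = 1.0 / (1.0 + np.exp(-(W2 @ z)))    # per-channel sigmoid gates in (0, 1)
    return x * g                           # dynamically reweighted KPI channels

# Example: a 32-step window over 13 KPI channels, reduction ratio 4 (assumed)
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 13))
W1 = 0.1 * rng.standard_normal((3, 13))
W2 = 0.1 * rng.standard_normal((13, 3))
y = squeeze_excite(x, W1, W2)
```

Because the gates lie in (0, 1), the block can only attenuate channels, which makes it a cheap, stable way to de-emphasize stale or noisy KPIs per window.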