Probabilistic Delay Forecasting in 5G Using Recurrent and Attention-Based Architectures

📅 2025-03-19
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
To address the challenge of modeling and predicting end-to-end latency distributions over time-varying wireless links in 5G ultra-reliable low-latency communication (URLLC) scenarios, this paper proposes a multi-step latency prediction framework that forecasts the full probability distribution of delay. Methodologically, it combines temporal deep-learning architectures (Transformer and LSTM) with a mixture density network (MDN) based on Gaussian mixture models (GMMs) to capture non-stationary latency distributions. Empirical validation is conducted on a time-synchronized OpenAirInterface 5G software-defined radio (SDR) testbed, including precise latency data acquisition and preprocessing. Experimental results demonstrate that the proposed model outperforms both standard LSTM and feed-forward baselines in terms of negative log-likelihood and mean absolute error. Notably, it significantly improves latency-guarantee accuracy at the 99.9% reliability target, enabling adaptive scheduling, intelligent resource allocation, and QoS-driven evolution toward 6G.
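The 99.9% reliability target mentioned above amounts to reading a high quantile off the predicted delay distribution. A minimal, illustrative sketch (not the paper's code) of how a latency budget could be extracted from a predicted Gaussian mixture, using bisection on the mixture CDF; the mixture parameters below are hypothetical:

```python
import math

def gaussian_cdf(x, mu, sigma):
    """Gaussian CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def mixture_quantile(weights, mus, sigmas, p, lo=-1e6, hi=1e6):
    """Invert the GMM CDF by bisection to obtain the p-quantile."""
    def cdf(x):
        return sum(w * gaussian_cdf(x, m, s)
                   for w, m, s in zip(weights, mus, sigmas))
    for _ in range(200):  # interval halving; 200 steps is far past float precision
        mid = 0.5 * (lo + hi)
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical predicted mixture for the next slot's delay (ms):
weights, mus, sigmas = [0.7, 0.3], [2.0, 5.0], [0.3, 1.0]
latency_budget = mixture_quantile(weights, mus, sigmas, 0.999)
```

A scheduler could then reserve resources so that the forecast delay stays below `latency_budget` with 99.9% probability.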

๐Ÿ“ Abstract
With the emergence of new application areas such as cyber-physical systems and human-in-the-loop applications, ensuring a specific level of end-to-end network latency with high reliability (e.g., 99.9%) is becoming increasingly critical. To align wireless links with these reliability requirements, it is essential to analyze and control network latency in terms of its full probability distribution. However, in a wireless link, the distribution may vary over time, making this task particularly challenging. We propose predicting the latency distribution using state-of-the-art data-driven techniques that leverage historical network information. Our approach tokenizes network state information and processes it using temporal deep-learning architectures, namely LSTM and Transformer models, to capture both short- and long-term delay dependencies. These models output parameters for a chosen parametric density via a mixture density network with Gaussian mixtures, yielding multi-step probabilistic forecasts of future delays. To validate our proposed approach, we implemented and tested these methods using a time-synchronized, SDR-based OpenAirInterface 5G testbed to collect and preprocess network-delay data. Our experiments show that the Transformer model achieves lower negative log-likelihood and mean absolute error than both LSTM and feed-forward baselines in challenging scenarios, while also providing insights into model complexity and training/inference overhead. This framework enables more informed decision-making for adaptive scheduling and resource allocation, paving the way toward enhanced QoS in evolving 5G and 6G networks.
Problem

Research questions and friction points this paper is trying to address.

Predicting latency distribution in 5G networks
Using LSTM and Transformer for delay forecasting
Enhancing QoS through adaptive scheduling and resource allocation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses LSTM and Transformer for delay prediction
Implements a mixture density network with Gaussian mixtures
Validates with OpenAirInterface 5G testbed