EExApp: GNN-Based Reinforcement Learning for Radio Unit Energy Optimization in 5G O-RAN

📅 2026-02-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work proposes EExApp, a deep reinforcement learning-based xApp designed for the O-RAN architecture to reduce energy consumption in 5G radio units (RUs) while ensuring quality of service (QoS). EExApp jointly optimizes RU sleep scheduling and distributed unit (DU) resource slicing through a novel dual-actor dual-critic proximal policy optimization (PPO) framework. It leverages a graph attention network (GAT) to model inter-RU coordination and integrates a Transformer encoder to handle dynamic user loads. By incorporating dual-objective feedback—balancing energy efficiency and QoS—into the actor updates, the method adaptively trades off between these competing goals. Experimental results on a real-world 5G O-RAN testbed demonstrate that EExApp significantly reduces RU energy consumption under strict QoS constraints, outperforming existing approaches.
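The summary's point about the Transformer encoder is that it turns a variable number of per-UE observations into a fixed-dimensional state, so the policy network's input size does not depend on how many users are attached. A minimal NumPy sketch of that idea (single self-attention layer plus mean pooling; the feature dimensions and random weights are illustrative assumptions, not the paper's actual architecture):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def encode_ue_observations(obs, Wq, Wk, Wv):
    """One self-attention layer + mean pooling: (n_ue, d_in) -> (d_model,).

    Accepts any number of UE rows, so the downstream actor-critic
    networks always see a fixed-size state vector.
    """
    Q, K, V = obs @ Wq, obs @ Wk, obs @ Wv
    d = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d), axis=-1)  # (n_ue, n_ue) attention
    ctx = attn @ V                                 # per-UE context vectors
    return ctx.mean(axis=0)                        # pooled fixed-dim state

rng = np.random.default_rng(0)
d_in, d_model = 6, 8  # hypothetical per-UE feature and embedding sizes
Wq, Wk, Wv = (rng.normal(size=(d_in, d_model)) for _ in range(3))

# 3 UEs and 7 UEs both map to the same fixed-size state.
state_3 = encode_ue_observations(rng.normal(size=(3, d_in)), Wq, Wk, Wv)
state_7 = encode_ue_observations(rng.normal(size=(7, d_in)), Wq, Wk, Wv)
print(state_3.shape, state_7.shape)
```

Both calls return an 8-dimensional vector regardless of UE count, which is the property the summary attributes to the encoder.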

📝 Abstract
With over 3.5 million 5G base stations deployed globally, their collective energy consumption (projected to exceed 131 TWh annually) raises significant concerns over both operational costs and environmental impacts. In this paper, we present EExApp, a deep reinforcement learning (DRL)-based xApp for 5G Open Radio Access Network (O-RAN) that jointly optimizes radio unit (RU) sleep scheduling and distributed unit (DU) resource slicing. EExApp uses a dual-actor, dual-critic Proximal Policy Optimization (PPO) architecture, with dedicated actor-critic pairs targeting energy efficiency and quality-of-service (QoS) compliance. A Transformer-based encoder enables scalable handling of variable user equipment (UE) populations by encoding all-UE observations into fixed-dimensional representations. To coordinate the two optimization objectives, a bipartite Graph Attention Network (GAT) modulates actor updates based on both critics' outputs, enabling adaptive tradeoffs between power savings and QoS. We have implemented EExApp and deployed it on a real-world 5G O-RAN testbed with live traffic, commercial RUs, and smartphones. Extensive over-the-air experiments and ablation studies confirm that EExApp significantly outperforms existing methods in reducing RU energy consumption while maintaining QoS.
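The abstract's coordination mechanism can be pictured as attention weights blending the two critics' advantage signals before each PPO actor update. The sketch below is a simplification under stated assumptions: in the paper a learned bipartite GAT produces the weights, whereas here `attn_logits` is a placeholder vector, and `ppo_clip_objective` is the standard PPO clipped surrogate for a single sample:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def blended_advantage(adv_energy, adv_qos, attn_logits):
    """Blend the energy and QoS critics' advantages with attention weights.

    In EExApp a bipartite GAT learns these weights; here attn_logits is a
    hypothetical stand-in to illustrate the adaptive tradeoff.
    """
    w = softmax(attn_logits)  # (2,) weights over the two objectives
    return w[0] * adv_energy + w[1] * adv_qos

def ppo_clip_objective(ratio, adv, eps=0.2):
    """Standard PPO clipped surrogate for one sample."""
    return min(ratio * adv, np.clip(ratio, 1 - eps, 1 + eps) * adv)

# Example: the QoS critic dominates when its attention logit is larger,
# so a QoS violation (negative adv_qos) outweighs an energy gain.
adv = blended_advantage(adv_energy=1.5, adv_qos=-0.8,
                        attn_logits=np.array([0.1, 2.0]))
obj = ppo_clip_objective(ratio=1.3, adv=adv)
print(adv, obj)
```

The clipped surrogate takes the pessimistic minimum, which is what keeps PPO updates conservative; the blending step is where the dual-objective feedback described in the abstract enters the actor update.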
Problem

Research questions and friction points this paper is trying to address.

energy optimization
5G O-RAN
radio unit
quality-of-service
base station energy consumption
Innovation

Methods, ideas, or system contributions that make the work stand out.

Graph Attention Network
Proximal Policy Optimization
Transformer Encoder
O-RAN xApp
Energy-QoS Tradeoff
Jie Lu
Department of Computer Science and Engineering, Michigan State University, USA
Peihao Yan
Department of Computer Science and Engineering, Michigan State University, USA
Huacheng Zeng
Michigan State University
5G · O-RAN · machine learning · radio sensing · wireless security