Autonomous Task Offloading of Vehicular Edge Computing with Parallel Computation Queues

📅 2025-09-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the coexisting challenges of high waiting latency, uneven resource utilization, and load congestion in vehicular edge computing (VEC) task offloading, this paper proposes a task-coordinated offloading mechanism based on parallel computation queues. Methodologically, it introduces: (i) an instantaneous edge server processing capacity prediction model integrated with discrete queue-state modeling to dynamically and accurately identify overloaded nodes; and (ii) a network-coordinated parallel queue scheduling strategy that jointly optimizes latency reduction and global load balancing. Theoretical analysis leverages queuing theory and parallel computation models, while simulations are conducted in a virtual environment driven by real-world road topology. Results demonstrate that the proposed scheme reduces average waiting latency by 21.6%–34.8% compared to state-of-the-art approaches, while maintaining stable robustness under highly dynamic vehicular traffic conditions.
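The summary's first methodological piece, identifying overloaded nodes from a prediction of each edge server's instantaneous processing capacity, can be illustrated with a minimal queue-theoretic sketch. This is not the paper's model: it assumes a simple M/M/1 approximation per server, and all names, rates, and the utilization threshold are hypothetical.

```python
# Hypothetical sketch: flag overloaded edge servers from predicted
# instantaneous service rates, using M/M/1 queue estimates.
# The threshold rho_max and all rates are illustrative, not from the paper.

def mean_wait(arrival_rate: float, service_rate: float) -> float:
    """M/M/1 mean time in queue, W_q = rho / (mu - lambda); inf if unstable."""
    if arrival_rate >= service_rate:
        return float("inf")
    rho = arrival_rate / service_rate
    return rho / (service_rate - arrival_rate)

def overloaded(servers, rho_max=0.8):
    """Return indices of servers whose predicted utilization exceeds rho_max."""
    return [i for i, (lam, mu) in enumerate(servers) if lam / mu > rho_max]

# (lambda, mu) per edge server: predicted task arrival and service rates
servers = [(4.0, 10.0), (9.5, 10.0), (12.0, 10.0)]
print(overloaded(servers))   # servers 1 and 2 exceed the threshold
```

A server with `lam >= mu` has unbounded expected waiting time, which is why detecting near-overload early (here via `rho_max`) matters for the waiting-latency objective.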

📝 Abstract
This work considers a parallel task execution strategy in vehicular edge computing (VEC) networks, where edge servers are deployed along the roadside to process computational tasks offloaded by vehicular users. To minimize the overall waiting delay among vehicular users, a novel task offloading solution is developed based on network cooperation that balances resource under-utilization against load congestion. Both theoretical analysis and numerical evaluation show that the developed solution achieves globally optimal delay reduction compared to existing methods, which is further confirmed by a feasibility test in a virtual environment built on a real road map. The in-depth analysis reveals that predicting the instantaneous processing power of edge servers facilitates the identification of overloaded servers, which is critical for determining network delay. By considering the discrete variables of the queue, the proposed technique's precise estimation effectively addresses these combinatorial challenges and achieves optimal performance.
Problem

Research questions and friction points this paper is trying to address.

Minimizes waiting delay for vehicular users through task offloading
Addresses resource under-utilization and load congestion in edge networks
Optimizes parallel computation queues to reduce network delay globally
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parallel task execution strategy in VEC
Network cooperation balancing under-utilization and congestion
Predicting instantaneous server processing power
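The coordinated parallel-queue idea above can be pictured with a common greedy baseline: route each offloaded task to the server with the smallest predicted completion time (current backlog divided by service rate). This is an illustrative stand-in, not the paper's scheduling strategy, and every name here is hypothetical.

```python
# Illustrative baseline, not the paper's algorithm: greedy assignment of
# offloaded tasks across parallel edge-server queues. Each task joins the
# queue that minimizes its expected finish time under the predicted rates.

def schedule(tasks, service_rates):
    """Assign each task (by size) to a server index; return (assignment, backlog)."""
    backlog = [0.0] * len(service_rates)   # pending work per server
    assignment = []
    for size in tasks:
        # expected finish time if this task joins queue i
        i = min(range(len(service_rates)),
                key=lambda i: (backlog[i] + size) / service_rates[i])
        backlog[i] += size
        assignment.append(i)
    return assignment, backlog

tasks = [3.0, 1.0, 2.0, 2.0]
rates = [2.0, 1.0]            # server 0 is twice as fast
assignment, backlog = schedule(tasks, rates)
print(assignment)             # -> [0, 1, 0, 1]
```

Because the rule accounts for heterogeneous service rates, the faster server absorbs proportionally more work, which is the load-balancing effect the Innovation bullets describe at a global scale.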
🔎 Similar Papers
2024-07-16 · 2024 7th International Conference on Information Communication and Signal Processing (ICICSP) · Citations: 2
Sungho Cho
Department of Electrical and Computer Engineering, University of California, Los Angeles, Los Angeles, CA, 90024, USA
Sung Il Choi
School of Electrical Engineering, Korea University, Seoul 02841, South Korea
Seung Hyun Oh
School of Electrical Engineering, Korea University, Seoul 02841, South Korea
Ian P. Roberts
Assistant Professor, UCLA
wireless communications · full-duplex · signal processing
Sang Hyun Lee
School of Electrical Engineering, Korea University, Seoul 02841, South Korea