Robust DNN Partitioning and Resource Allocation Under Uncertain Inference Time

📅 2025-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
Problem: Uncertain DNN inference latency in edge intelligence leads to deadline violations for time-critical tasks on mobile devices.

Method: This paper proposes a joint optimization framework minimizing mobile device energy consumption, uniquely relying only on the mean and variance of inference latency—without assuming latency distribution or requiring predictive models. It formulates the probabilistic latency constraint as a chance constraint, then applies convex relaxation and a penalty-based convex–concave procedure (PCCP) to transform the original stochastic mixed-integer nonlinear program (MINLP) into efficiently solvable deterministic subproblems. The framework jointly optimizes the DNN partitioning point, local CPU/GPU operating frequencies, and uplink bandwidth allocation.

Results: Experiments on real hardware with mainstream DNN models demonstrate that the method ensures ≥95% task completion within the probabilistic deadline while significantly reducing device energy consumption—achieving a 37% improvement in energy efficiency over baseline approaches.

📝 Abstract
In edge intelligence systems, deep neural network (DNN) partitioning and data offloading can provide real-time task inference for resource-constrained mobile devices. However, the inference time of DNNs is typically uncertain and cannot be precisely determined in advance, presenting significant challenges in ensuring timely task processing within deadlines. To address the uncertain inference time, we propose a robust optimization scheme to minimize the total energy consumption of mobile devices while meeting probabilistic task deadlines. The scheme requires only the mean and variance of the inference time, without any prediction methods or distribution functions. The problem is formulated as a mixed-integer nonlinear program (MINLP) that jointly optimizes the DNN model partitioning and the allocation of local CPU/GPU frequencies and uplink bandwidth. To tackle the problem, we first decompose the original problem into two subproblems: resource allocation and DNN model partitioning. Subsequently, the two subproblems with probability constraints are equivalently transformed into deterministic optimization problems using the chance-constrained programming (CCP) method. Finally, the convex optimization technique and the penalty convex-concave procedure (PCCP) technique are employed to obtain the optimal solution of the resource allocation subproblem and a stationary point of the DNN model partitioning subproblem, respectively. The proposed algorithm leverages real-world data from popular hardware platforms and is evaluated on widely used DNN models. Extensive simulations show that our proposed algorithm effectively addresses the inference time uncertainty with probabilistic deadline guarantees while minimizing the energy consumption of mobile devices.
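The distribution-free chance-constraint step described in the abstract can be illustrated with a minimal sketch. For any latency distribution with known mean μ and variance σ², the one-sided Chebyshev (Cantelli) bound gives P(T ≥ μ + kσ) ≤ 1/(1 + k²), so enforcing the deterministic condition d ≥ μ + σ·√((1 − ε)/ε) guarantees P(T ≤ d) ≥ 1 − ε regardless of the distribution. The numbers below (40 ms mean, 5 ms standard deviation, 5% violation budget) are hypothetical, not taken from the paper:

```python
import math
import random

def robust_deadline(mean, var, eps):
    """Deterministic surrogate for the chance constraint P(T <= d) >= 1 - eps.

    By Cantelli's inequality, every distribution with this mean and variance
    satisfies the constraint whenever d >= mean + sqrt(var * (1 - eps) / eps).
    """
    return mean + math.sqrt(var * (1.0 - eps) / eps)

# Hypothetical inference-time statistics: mean 40 ms, std 5 ms,
# with at most a 5% probability of missing the deadline.
mean, std, eps = 40.0, 5.0, 0.05
d = robust_deadline(mean, std ** 2, eps)

# Monte Carlo check against two distributions sharing (mean, var):
# the empirical violation rate must stay at or below eps.
random.seed(0)
half_width = std * math.sqrt(3.0)  # uniform with matching variance
for sampler in (lambda: random.gauss(mean, std),
                lambda: random.uniform(mean - half_width, mean + half_width)):
    samples = [sampler() for _ in range(100_000)]
    violation_rate = sum(t > d for t in samples) / len(samples)
    assert violation_rate <= eps
```

Because the bound holds for the worst case over all distributions with the given moments, the resulting deadline is conservative; the paper's evaluation (≥95% on-time completion) is consistent with choosing ε = 0.05 in such a surrogate.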
Problem

Research questions and friction points this paper is trying to address.

Minimize mobile device energy under uncertain DNN inference time
Optimize DNN partitioning and resource allocation without prediction
Ensure probabilistic task deadlines using robust optimization techniques
Innovation

Methods, ideas, or system contributions that make the work stand out.

Robust optimization for uncertain DNN inference time
Chance-constrained programming transforms probability constraints
Penalty convex-concave procedure solves partitioning subproblem
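The convex-concave procedure named in the last bullet can be sketched on a toy difference-of-convex problem. This is the basic CCP iteration only, without the penalty terms the paper's PCCP uses to handle relaxed integer partitioning variables: at each iterate, the concave part of the objective is replaced by its linearization and the resulting convex subproblem is solved. For min x⁴ − x² the convex subproblem min x⁴ − 2·xₖ·x has the closed-form solution x = (xₖ/2)^(1/3):

```python
def ccp(x0, iters=50):
    """Basic convex-concave procedure on the DC objective x^4 - x^2.

    The concave part -x^2 is linearized at the current iterate x_k, giving
    the convex subproblem min x^4 - 2*x_k*x, whose stationarity condition
    4*x^3 = 2*x_k yields the closed-form update below.
    """
    x = x0
    for _ in range(iters):
        x = (x / 2.0) ** (1.0 / 3.0)
    return x

x = ccp(1.0)
# Converges to a stationary point of x^4 - x^2, i.e. x = 1/sqrt(2);
# CCP guarantees stationarity, not global optimality, matching the
# abstract's claim of "a stationary point" for the partitioning subproblem.
```

In the paper's setting the subproblem is solved numerically rather than in closed form, and a growing penalty weight drives the relaxed binary partition variables toward integrality; the fixed-point behavior above is only the skeleton of that procedure.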
Zhaojun Nan
Beijing National Research Center for Information Science and Technology, Department of Electronic Engineering, Tsinghua University, Beijing 100084, China
Yunchu Han
Tsinghua University
Edge Intelligence · Wireless Communication · Green AI · Mobile Edge Computing
Sheng Zhou
Beijing National Research Center for Information Science and Technology, Department of Electronic Engineering, Tsinghua University, Beijing 100084, China
Zhisheng Niu
Professor of Electronic Engineering, Tsinghua University
Green Communication · Radio Resource Management · Queueing Theory