Efficient Computation of Marton's Error Exponent via Constraint Decoupling

📅 2025-07-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Marton's error exponent, a fundamental performance limit in lossy source coding, has long been computationally intractable due to its underlying non-convex optimization formulation and the absence of efficient numerical solvers. Existing approaches, primarily two-dimensional grid search, suffer from high computational complexity and poor scalability. This paper proposes a constraint-decoupling and composite-maximization framework that decomposes the original problem into tractable convex subproblems. The authors design an alternating maximization algorithm integrating one-dimensional line search with convex optimization routines, enabling, for the first time, joint and globally convergent computation of both Marton's error exponent and its inverse function. Experiments on simple sources and Ahlswede's counterexample demonstrate speedups of over an order of magnitude compared to conventional grid search, significantly improving computational efficiency and scalability. The method provides a practical tool for performance analysis in lossy source coding.
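For context, Marton's exponent admits a standard formulation that makes the non-convexity visible (this is textbook background, not quoted from the paper, and the constraint may be strict depending on the reference). For a memoryless source P, rate R, and distortion level Δ,

\[
F(R,\Delta,P) \;=\; \min_{Q \,:\, R(Q,\Delta) \ge R} D(Q \,\|\, P),
\]

where R(Q,Δ) is the rate-distortion function of an auxiliary source Q and D(Q‖P) is the Kullback–Leibler divergence. Operationally, the error probability at blocklength n decays roughly as e^{-n F(R,Δ,P)}. Since R(Q,Δ) is in general not concave in Q, the feasible set {Q : R(Q,Δ) ≥ R} is non-convex, which is the source of the computational difficulty the summary refers to.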

📝 Abstract
The error exponent in lossy source coding characterizes the asymptotic decay rate of the error probability with respect to the blocklength. Marton's error exponent provides the theoretically optimal bound on this rate. However, computational methods for Marton's error exponent remain underdeveloped, owing to its formulation as a non-convex optimization problem for which efficient solvers are scarce. While a recent grid-search algorithm can compute its inverse function, it incurs prohibitive computational costs from two-dimensional brute-force parameter searches. This paper proposes a composite maximization approach that handles both Marton's error exponent and its inverse function. Through a constraint decoupling technique, the resulting problem formulations admit efficient solvers driven by an alternating maximization algorithm. By fixing one parameter via a one-dimensional line search, the remaining subproblem becomes convex and can be solved efficiently by alternating variable updates, significantly reducing the search complexity; as a consequence, the global convergence of the algorithm is guaranteed. Numerical experiments on simple sources and Ahlswede's counterexample demonstrate the superior efficiency of our algorithm compared with existing methods.
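The computational pattern the abstract describes, an outer one-dimensional line search over a scalar parameter wrapped around alternating closed-form updates of a convex subproblem, can be illustrated on the classical rate-distortion computation, i.e., the quantity R(Q,Δ) that appears inside Marton's exponent. The sketch below (Blahut–Arimoto updates plus a bisection over the Lagrange multiplier s) shows only this generic pattern; it is not the paper's algorithm, and all function names and parameter choices are our own illustrative assumptions.

```python
import numpy as np

def blahut_arimoto(P, d, s, n_iter=500, tol=1e-10):
    """Inner loop for a fixed multiplier s: alternate closed-form updates of
    the test channel W and the reproduction marginal q. Each half-step solves
    a convex subproblem exactly (classical Blahut-Arimoto iteration)."""
    ny = d.shape[1]
    q = np.full(ny, 1.0 / ny)                    # reproduction marginal iterate
    for _ in range(n_iter):
        W = q[None, :] * np.exp(-s * d)          # channel update, then normalize
        W /= W.sum(axis=1, keepdims=True)
        q_new = P @ W                            # marginal update
        if np.max(np.abs(q_new - q)) < tol:
            break
        q = q_new
    dist = float(np.sum(P[:, None] * W * d))                       # E[d(X,Y)]
    rate = float(np.sum(P[:, None] * W * np.log(W / q[None, :])))  # I(X;Y), nats
    return rate, dist

def rate_distortion(P, d, delta, s_lo=1e-6, s_hi=200.0, iters=60):
    """Outer 1-D line search: bisect on s, using the fact that the achieved
    distortion is nonincreasing in s, until the target distortion is met."""
    for _ in range(iters):
        s = 0.5 * (s_lo + s_hi)
        rate, dist = blahut_arimoto(P, d, s)
        if dist > delta:
            s_lo = s    # distortion too large: push the multiplier up
        else:
            s_hi = s
    return rate

# Toy check: binary source with Hamming distortion, R(delta) = h(p) - h(delta).
P = np.array([0.4, 0.6])
d = 1.0 - np.eye(2)
print(rate_distortion(P, d, delta=0.1))  # about 0.348 nats
```

In the paper's setting the inner convex subproblem arises from the decoupled formulation of the exponent itself rather than from Blahut–Arimoto, but, per the abstract, the division of labor between the one-dimensional search and the alternating updates appears analogous.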
Problem

Research questions and friction points this paper is trying to address.

Computing Marton's error exponent efficiently
Solving non-convex optimization in source coding
Reducing computational cost in error exponent calculation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Composite maximization approach for error exponent
Constraint decoupling technique for convex subproblems (a generic schematic follows this list)
Alternating maximization algorithm reducing search complexity
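Schematically, and only as a generic illustration rather than the paper's exact decoupled objective, a coupled constraint of the form R(Q,Δ) ≥ R can be shifted into the objective with a multiplier s ≥ 0, which by weak duality gives

\[
\min_{Q \,:\, R(Q,\Delta) \ge R} D(Q \,\|\, P)
\;\ge\;
\sup_{s \ge 0} \; \min_{Q} \Big[ D(Q \,\|\, P) + s\big(R - R(Q,\Delta)\big) \Big].
\]

The scalar s is then the natural target of the one-dimensional line search, and, per the abstract, the paper's particular decoupling renders each fixed-parameter subproblem convex, so that alternating variable updates apply.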
👥 Authors

Jiachuan Ye
Department of Mathematical Sciences, Tsinghua University, Beijing 100084, China

Shitong Wu
Tsinghua University
Optimal Transport · Information Theory · Optimization

Lingyi Chen
Department of Mathematical Sciences, Tsinghua University, Beijing 100084, China

Wenyi Zhang
Department of Electronic Engineering and Information Science, University of Science and Technology of China, Hefei, Anhui 230027, China

Huihui Wu
Ningbo Institute of Digital Twin
Data Compression · Channel Coding · Semantic Communications · Deep Learning

Hao Wu
Department of Mathematical Sciences, Tsinghua University, Beijing 100084, China