RDD Function: A Tradeoff Between Rate and Distortion-in-Distortion

📅 2025-07-13
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Cross-dimensional spaces lack natural metrics, making similarity definition and rate-distortion (RD) analysis fundamentally challenging. Method: This paper introduces the Rate Distortion-in-Distortion (RDD) framework, replacing the conventional expected distortion constraint in RD theory with a Gromov-type distortion measure, thereby embedding Gromov distance into information-theoretic rate-distortion analysis for the first time. To address the high computational complexity of Gromov distance, we propose a decomposition-linearization-relaxation strategy, optimized via an alternating mirror descent algorithm. Contribution/Results: RDD is grounded in rigorous information-theoretic principles and supported by provable coding theorems; it reveals both fundamental connections to and key distinctions from classical RD theory. Experiments on classical sources and diverse grid-structured data demonstrate that RDD achieves substantial computational savings while maintaining state-of-the-art coding performance, establishing a novel paradigm for cross-space source coding.

๐Ÿ“ Abstract
In this paper, we propose a novel function named the Rate Distortion-in-Distortion (RDD) function as an extension of the classical rate-distortion (RD) function, where the expected distortion constraint is replaced by the Gromov-type distortion. This distortion, integral to the Gromov-Wasserstein (GW) distance, effectively defines the similarity in spaces of different dimensions without a direct metric between them. While our RDD function qualifies as an informational RD function, encoding theorems substantiate its status as an operational RD function, thereby underscoring its potential applicability in real-world source coding. Due to the high computational complexity associated with Gromov-type distortion, the RDD function cannot be solved analytically. Consequently, we develop an alternating mirror descent algorithm that significantly reduces computational complexity by employing decomposition, linearization, and relaxation techniques. Simulations on classical sources and different grids demonstrate the effectiveness of our algorithm. By examining the distinctions and connections between the RDD function and the RD function, we anticipate that the RDD function will play a novel role in foreseeable future scenarios.
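For context on the baseline being extended: the classical RD function that RDD generalizes has no closed form for general sources either, and is computed by the Blahut-Arimoto iteration, which can itself be read as an alternating mirror descent in KL geometry. The sketch below is this classical baseline, not the paper's RDD algorithm; the function name and the choice of a binary Hamming example are illustrative assumptions.

```python
import math

def blahut_arimoto_rd(p, d, beta, iters=200):
    """Classical Blahut-Arimoto iteration for the RD function at slope beta.

    p:    source distribution p(x), a list of floats summing to 1
    d:    distortion matrix, d[x][y] = distortion of reproducing x as y
    beta: Lagrange multiplier trading rate against distortion
    Returns (rate, distortion), with rate in nats.
    """
    nx, ny = len(p), len(d[0])
    q = [1.0 / ny] * ny  # output marginal q(y), initialized uniform
    for _ in range(iters):
        # conditional update: Q(y|x) proportional to q(y) * exp(-beta * d(x,y))
        Q = []
        for x in range(nx):
            row = [q[y] * math.exp(-beta * d[x][y]) for y in range(ny)]
            z = sum(row)
            Q.append([v / z for v in row])
        # marginal update: q(y) = sum_x p(x) Q(y|x)
        q = [sum(p[x] * Q[x][y] for x in range(nx)) for y in range(ny)]
    # rate = mutual information I(X;Y); distortion = E[d(X,Y)]
    rate = sum(p[x] * Q[x][y] * math.log(Q[x][y] / q[y])
               for x in range(nx) for y in range(ny) if Q[x][y] > 0)
    dist = sum(p[x] * Q[x][y] * d[x][y] for x in range(nx) for y in range(ny))
    return rate, dist

# Illustrative usage: uniform binary source with Hamming distortion,
# where the RD curve is known to be R(D) = ln 2 - H_b(D) in nats.
rate, dist = blahut_arimoto_rd([0.5, 0.5], [[0.0, 1.0], [1.0, 0.0]], beta=2.0)
```

The paper's contribution replaces the linear expected-distortion constraint used above with a quadratic Gromov-type distortion, which is why a decomposition-linearization-relaxation step is needed before a mirror-descent update of this flavor can be applied.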
Problem

Research questions and friction points this paper is trying to address.

How to define similarity, and hence a rate-distortion tradeoff, between spaces of different dimensions with no direct metric between them
How to tame the high computational complexity of Gromov-type distortion, which rules out analytical solutions
Whether the resulting function is operationally meaningful for real-world source coding, not merely an informational quantity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proposes RDD function extending classical RD function
Uses Gromov-Wasserstein distance for distortion metric
Develops alternating mirror descent algorithm for efficiency
Lingyi Chen
Department of Mathematical Sciences, Tsinghua University, Beijing 100084, China
Haoran Tang
Department of Mathematical Sciences, Tsinghua University, Beijing 100084, China
Shitong Wu
Tsinghua University
Optimal Transport · Information Theory · Optimization
Jiakun Liu
Harbin Institute of Technology
Empirical Software Engineering · Intelligent Software Engineering
Huihui Wu
Ningbo Institute of Digital Twin
Data Compression · Channel Coding · Semantic Communications · Deep Learning
Wenyi Zhang
Department of Electronic Engineering and Information Science, University of Science and Technology of China, Hefei, Anhui 230027, P.R. China
Hao Wu
Department of Mathematical Sciences, Tsinghua University, Beijing 100084, China