InfoBridge: Mutual Information estimation via Bridge Matching

📅 2025-02-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the high bias and poor stability of mutual information (MI) estimation in high-dimensional, nonlinear, and low-sample regimes, this paper proposes the first unbiased MI estimator based on diffusion bridge models. Methodologically, it introduces diffusion bridge theory—previously unexplored in MI estimation—by solving a stochastic differential equation under prescribed boundary conditions to construct a theoretically guaranteed unbiased estimator; it further abandons explicit density-ratio modeling and instead adopts a contrastive learning framework for implicit distribution matching. On standard benchmarks, the proposed method significantly outperforms leading neural estimators—including MINE, JS-MI, and SMILE—reducing MI estimation error by 37%–62%. It also demonstrates superior sample efficiency and robustness across diverse settings, particularly under limited data and complex dependencies.

Technology Category

Application Category

📝 Abstract
Diffusion bridge models have recently become a powerful tool in the field of generative modeling. In this work, we leverage their power to address another important problem in machine learning and information theory - the estimation of the mutual information (MI) between two random variables. We show that by using the theory of diffusion bridges, one can construct an unbiased estimator for data posing difficulties for conventional MI estimators. We showcase the performance of our estimator on a series of standard MI estimation benchmarks.
Problem

Research questions and friction points this paper is trying to address.

Information Theory
Random Processes
Correlation
Innovation

Methods, ideas, or system contributions that make the work stand out.

InfoBridge
Diffusion Bridge Model
Mutual Information Estimation
🔎 Similar Papers
No similar papers found.