Bregman geometry-aware split Gibbs sampling for Bayesian Poisson inverse problems

📅 2025-11-15
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses key challenges in Bayesian inference for Poisson inverse problems—namely, non-Lipschitz gradients, strict positivity constraints on latent variables, and preservation of posterior geometric structure. To this end, the authors propose a Monte Carlo sampling framework grounded in non-Euclidean geometry. The method constructs a naturally conjugate augmented posterior using the Bregman divergence induced by the Burg entropy and introduces splitting variables for exact data augmentation. Building upon this, a Hessian Riemannian Langevin Monte Carlo (HRLMC) step on the mirror manifold is integrated with a split Gibbs strategy to explicitly enforce positivity of the variables and preserve the intrinsic geometric properties of the posterior. Evaluated on Poisson denoising, deblurring, and PET image reconstruction tasks, the approach achieves reconstruction quality competitive with state-of-the-art optimization- and sampling-based methods while maintaining geometric fidelity.

📝 Abstract
This paper proposes a novel Bayesian framework for solving Poisson inverse problems by devising a Monte Carlo sampling algorithm which accounts for the underlying non-Euclidean geometry. To address the challenges posed by the Poisson likelihood -- such as non-Lipschitz gradients and positivity constraints -- we derive a Bayesian model which leverages exact and asymptotically exact data augmentations. In particular, the augmented model incorporates two sets of splitting variables, both derived through a Bregman divergence based on the Burg entropy. Interestingly, the resulting augmented posterior distribution is characterized by conditional distributions which benefit from natural conjugacy properties and preserve the intrinsic geometry of the latent and splitting variables. This allows for efficient sampling via Gibbs steps, which can be performed explicitly for all conditionals except the one incorporating the regularization potential. For the latter, we resort to a Hessian Riemannian Langevin Monte Carlo (HRLMC) algorithm, which is well suited to handle priors with explicit or easily computable score functions. By operating on a mirror manifold, this Langevin step ensures that the sampling satisfies the positivity constraints and more accurately reflects the underlying problem structure. Performance results obtained on denoising, deblurring, and positron emission tomography (PET) experiments demonstrate that the method achieves competitive reconstruction quality compared to optimization- and sampling-based approaches.
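As a rough illustration of the Langevin step described in the abstract, the sketch below implements a generic mirror-Langevin (Hessian Riemannian) update under the Burg-entropy mirror map ψ(x) = −Σ log xᵢ, for which ∇ψ(x) = −1/x, ∇²ψ(x) = diag(1/x²), and the inverse map is ∇ψ*(y) = −1/y. This is my own illustrative code, not the authors' implementation; the function name, step size, and the clamping guard are assumptions, and the target used in the usage example is a bare Poisson negative log-likelihood rather than the paper's augmented conditional.

```python
import numpy as np

def hrlmc_step(x, grad_U, eta, rng):
    """One Euler-Maruyama mirror-Langevin step targeting exp(-U) on (0, inf)^d.

    Burg-entropy mirror map: grad psi(x) = -1/x sends (0, inf)^d to (-inf, 0)^d,
    and [Hess psi(x)]^{1/2} = diag(1/x) preconditions the Gaussian noise.
    """
    y = -1.0 / x                                   # primal -> dual (mirror) space
    noise = rng.standard_normal(x.shape)
    y = y - eta * grad_U(x) + np.sqrt(2.0 * eta) * noise / x
    # Guard (an assumption of this sketch): keep the dual iterate inside the
    # domain of grad psi* so the inverse map returns a strictly positive point.
    y = np.minimum(y, -1e-12)
    return -1.0 / y                                # dual -> primal, positive by construction

# Illustrative usage: counts z ~ Poisson(x), so U(x) = sum(x - z*log x)
rng = np.random.default_rng(0)
z = np.array([3.0, 7.0, 1.0])                      # toy observed counts
grad_U = lambda x: 1.0 - z / x                     # gradient of the Poisson potential
x = np.ones(3)
for _ in range(200):
    x = hrlmc_step(x, grad_U, eta=1e-3, rng=rng)
```

Because the update lives in the dual space and is mapped back through ∇ψ*, every iterate is strictly positive, which is the property the abstract attributes to the mirror-manifold formulation.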
Problem

Research questions and friction points this paper is trying to address.

Develops Bayesian sampling for Poisson inverse problems with geometry awareness
Addresses non-Lipschitz gradients and positivity constraints in Poisson likelihood
Enables efficient reconstruction for denoising, deblurring, and PET imaging
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bregman geometry-aware Gibbs sampling algorithm
Exact data augmentation via Burg entropy divergence
Hessian Riemannian Langevin Monte Carlo for regularization
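The "natural conjugacy" exploited by the Gibbs steps can be illustrated, in a far simpler setting than the paper's Burg-entropy augmentation, by the classical Poisson-Gamma pair: a Gamma prior on a Poisson rate yields a Gamma posterior in closed form, which is the kind of explicit conditional that makes Gibbs sampling cheap. The function below is my own toy example, not a conditional from the paper.

```python
import numpy as np

def gamma_posterior(z, a0, b0):
    """Conjugate update for z_i ~ Poisson(u), u ~ Gamma(shape=a0, rate=b0):
    the posterior is u | z ~ Gamma(shape=a0 + sum(z), rate=b0 + len(z))."""
    z = np.asarray(z)
    return a0 + z.sum(), b0 + z.size

# Toy counts with a flat-ish Gamma(1, 1) prior -> posterior Gamma(16, rate=4)
a, b = gamma_posterior([4, 6, 5], a0=1.0, b0=1.0)
rng = np.random.default_rng(0)
samples = rng.gamma(shape=a, scale=1.0 / b, size=50_000)  # posterior mean = a/b = 4
```

In the paper, analogous closed-form conditionals (obtained through the Burg-entropy Bregman coupling rather than a plain Gamma prior) are sampled exactly, and only the prior-dependent conditional requires the HRLMC step.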
Elhadji Cisse Faye
Institut Denis Poisson, UMR CNRS University of Orléans, University of Tours, Orléans, France
Mame Diarra Fall
Univ Rouen Normandie, INSA Rouen Normandie, Université Le Havre Normandie, Normandie Univ, LITIS UR 4108, F-76000 Rouen, France
Nicolas Dobigeon
Professor, IRIT/INP-ENSEEIHT & ANITI, Univ. of Toulouse, France
signal processing, image processing, machine learning, remote sensing, astronomy
Eric Barat
CEA, University of Paris-Saclay, France