DeGAS: Gradient-Based Optimization of Probabilistic Programs without Sampling

📅 2026-01-21
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
Conditional inference and optimization for probabilistic programs that mix continuous and discrete variables typically rely on sampling-based methods, which can converge slowly and inefficiently. The authors propose a differentiable Gaussian approximation semantics that combines Gaussian-mixture representations, smooth handling of measure-zero predicates, and automatic differentiation. This enables, for the first time, end-to-end, sampling-free gradient-based optimization of probabilistic programs with discrete branching and continuous variables, while keeping path probabilities and posterior expressions differentiable. Evaluated on 13 benchmark programs, the method matches the accuracy and efficiency of variational inference and MCMC, and converges stably on continuous conditional-optimization tasks where sampling-based approaches fail.
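To make the problem class concrete, here is a small hypothetical program of the kind the summary describes: a continuous latent, a discrete branch on a continuous test, and conditioning on an exact continuous value. It is our own toy example (the names `model` and `theta` are ours), not one of the paper's benchmarks.

```python
import numpy as np

def model(theta, rng):
    x = rng.normal(theta, 1.0)   # continuous latent: x ~ Normal(theta, 1)
    if x > 0.0:                  # discrete branch on a continuous predicate
        y = rng.normal(x, 0.5)
    else:
        y = rng.normal(-x, 2.0)
    return x, y

rng = np.random.default_rng(0)
print(model(0.5, rng))
```

Conditioning such a program on an exact observation like `y == 1.3` is a measure-zero event: naive rejection sampling never accepts a run, and Monte Carlo gradient estimators become noisy or undefined, which is the failure mode a sampling-free semantics is designed to avoid.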

📝 Abstract
We present DeGAS, a differentiable Gaussian approximate semantics for loopless probabilistic programs that enables sample-free, gradient-based optimization in models with both continuous and discrete components. DeGAS evaluates programs under a Gaussian-mixture semantics and replaces measure-zero predicates and discrete branches with a vanishing smoothing, yielding closed-form expressions for posterior and path probabilities. We prove differentiability of these quantities with respect to program parameters, enabling end-to-end optimization via standard automatic differentiation, without Monte Carlo estimators. On thirteen benchmark programs, DeGAS achieves accuracy and runtime competitive with variational inference and MCMC. Importantly, it reliably tackles optimization problems where sampling-based baselines fail to converge due to conditioning involving continuous variables.
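As a rough illustration of the recipe the abstract describes, the following JAX sketch evaluates the toy model above in closed form: the Gaussian state is split at the branch with a closed-form path probability, each component is moment-matched back to a Gaussian (our simplifying assumption; the paper uses a Gaussian-mixture semantics), the measure-zero observation is smoothed into a narrow Gaussian likelihood of width `eps`, and the resulting log-evidence is differentiated with automatic differentiation. This is a minimal sketch of the general idea, not DeGAS's actual semantics or implementation.

```python
import jax
import jax.numpy as jnp
from jax.scipy.stats import norm

def trunc_lower(mu, var, a):
    # Gaussian moment-matching of N(mu, var) restricted to x > a.
    s = jnp.sqrt(var)
    alpha = (a - mu) / s
    lam = norm.pdf(alpha) / (1.0 - norm.cdf(alpha))
    return mu + s * lam, var * (1.0 + alpha * lam - lam**2)

def trunc_upper(mu, var, b):
    # Gaussian moment-matching of N(mu, var) restricted to x < b.
    s = jnp.sqrt(var)
    beta = (b - mu) / s
    lam = norm.pdf(beta) / norm.cdf(beta)
    return mu - s * lam, var * (1.0 - beta * lam - lam**2)

def log_evidence(theta, y_obs, eps=1e-2):
    # x ~ N(theta, 1); the predicate x > 0 splits the state into two
    # moment-matched components with a closed-form path probability.
    p = norm.cdf(theta)                       # P(x > 0)
    mx1, vx1 = trunc_lower(theta, 1.0, 0.0)   # then-branch: y = Normal(x, 0.5)
    mx2, vx2 = trunc_upper(theta, 1.0, 0.0)   # else-branch: y = Normal(-x, 2.0)
    m1, v1 = mx1, vx1 + 0.5**2
    m2, v2 = -mx2, vx2 + 2.0**2
    # The measure-zero condition "y == y_obs" is smoothed into a Gaussian
    # likelihood of width eps, keeping the whole expression differentiable.
    w1 = p * norm.pdf(y_obs, m1, jnp.sqrt(v1 + eps**2))
    w2 = (1.0 - p) * norm.pdf(y_obs, m2, jnp.sqrt(v2 + eps**2))
    return jnp.log(w1 + w2)

# Sample-free gradient ascent on the smoothed evidence:
grad_fn = jax.grad(log_evidence)
theta = 0.5
for _ in range(200):
    theta = theta + 0.05 * grad_fn(theta, 1.3)
```

Because every quantity here is a closed-form function of `theta`, no Monte Carlo estimator appears anywhere in the gradient, which is what makes this style of optimization stable where sampling-based baselines struggle.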
Problem

Research questions and friction points this paper is trying to address.

probabilistic programs
gradient-based optimization
sampling-free inference
continuous and discrete variables
conditioning
Innovation

Methods, ideas, or system contributions that make the work stand out.

differentiable probabilistic programming
Gaussian approximation
sample-free optimization
gradient-based inference
smoothed discrete branching
Francesca Randone
TU Wien, Wien, Austria
Romina Doz
University of Trieste, Trieste, Italy
M. Tribastone
IMT School for Advanced Studies Lucca, Lucca, Italy
Luca Bortolussi
Università di Trieste
modelling and simulation · explainable artificial intelligence · machine learning · formal methods · cyber-physical systems