Approximate Bayesian Inference via Bitstring Representations

📅 2025-08-19
📈 Citations: 0 (influential: 0)
🤖 AI Summary
This work addresses the challenge of performing efficient probabilistic inference in quantized (low-precision) discrete parameter spaces in order to learn continuous distributions, a setting where conventional Bayesian inference struggles because its continuous-parameter machinery maps poorly onto discrete hardware. The authors propose what they present as the first approximate Bayesian inference framework tailored to quantized parameter spaces: model parameters are encoded as bitstrings, structured priors are modeled via coupled probabilistic circuits, and a variational inference algorithm is designed explicitly for the discrete domain. The approach unifies low-precision computation with principled probabilistic modeling, improving inference efficiency and scalability while preserving model interpretability. Extensive evaluation on multiple quantized neural networks shows significant speedups over baseline approaches without sacrificing predictive accuracy, establishing a path toward interpretable Bayesian learning on edge devices.
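
To make the bitstring encoding concrete, here is a minimal sketch of mapping a continuous weight to and from an n-bit fixed-point code, MSB first. The uniform grid over [w_min, w_max] and the function names are assumptions made for illustration; the paper's exact encoding may differ.

```python
def encode_fixed_point(w, n_bits=8, w_min=-1.0, w_max=1.0):
    """Uniform fixed-point quantization of w over [w_min, w_max] into n_bits,
    returned MSB-first. Illustrative only; the paper's encoding may differ."""
    levels = 2 ** n_bits - 1
    clipped = min(max(w, w_min), w_max)
    idx = round((clipped - w_min) / (w_max - w_min) * levels)
    return [(idx >> b) & 1 for b in reversed(range(n_bits))]

def decode_fixed_point(bits, w_min=-1.0, w_max=1.0):
    """Invert the encoding: bitstring -> integer index -> continuous weight."""
    idx = 0
    for bit in bits:
        idx = (idx << 1) | bit
    return w_min + idx / (2 ** len(bits) - 1) * (w_max - w_min)

bits = encode_fixed_point(0.37)
print(bits, decode_fixed_point(bits))  # [1, 0, 1, 0, 1, 1, 1, 1] ~0.3725
```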

📝 Abstract
The machine learning community has recently put effort into quantized or low-precision arithmetic to scale large models. This paper proposes performing probabilistic inference in the quantized, discrete parameter space created by these representations, effectively enabling us to learn a continuous distribution using discrete parameters. We consider both 2D densities and quantized neural networks, where we introduce a tractable learning approach using probabilistic circuits. This method offers a scalable solution for managing complex distributions and provides clear insights into model behavior. We validate our approach on various models, demonstrating inference efficiency without sacrificing accuracy. This work advances scalable, interpretable machine learning by utilizing discrete approximations for probabilistic computations.
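
To illustrate what variational inference over a discrete, bitstring-valued parameter can look like, below is a toy sketch that fits a mean-field Bernoulli distribution over an 8-bit weight to a 1D Gaussian target using a score-function (REINFORCE) gradient estimator. The target density, the estimator, and the decoding grid are all assumptions made for illustration, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits = 8

# Mean-field variational distribution: one Bernoulli per bit, q(b_i = 1) = sigmoid(phi_i).
phi = np.zeros(n_bits)

def decode(bits, w_min=-1.0, w_max=1.0):
    """Map bitstrings (rows, MSB first) back to continuous weights on a uniform grid."""
    idx = bits @ (2.0 ** np.arange(n_bits - 1, -1, -1))
    return w_min + idx / (2 ** n_bits - 1) * (w_max - w_min)

def log_target(w):
    """Toy target: unnormalized log-density of N(0.3, 0.1^2) evaluated at w."""
    return -0.5 * ((w - 0.3) / 0.1) ** 2

for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-phi))
    bits = (rng.random((64, n_bits)) < p).astype(float)  # 64 sampled bitstrings
    w = decode(bits)
    log_q = (bits * np.log(p + 1e-12) + (1 - bits) * np.log(1 - p + 1e-12)).sum(axis=1)
    reward = log_target(w) - log_q  # per-sample ELBO contribution
    # Score-function (REINFORCE) gradient with a mean baseline; d log q / d phi_i = b_i - p_i.
    phi += 0.1 * ((reward - reward.mean())[:, None] * (bits - p)).mean(axis=0)

mode_bits = (1.0 / (1.0 + np.exp(-phi)) > 0.5).astype(float)
print(decode(mode_bits))  # most probable decoded weight, approximately 0.3
```

A fully factorized Bernoulli family keeps sampling and log-probabilities trivially cheap; the coupled probabilistic circuits described in the summary would instead capture dependencies between bits.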
Problem

Research questions and friction points this paper is trying to address.

Performing probabilistic inference in discrete parameter spaces
Learning continuous distributions from quantized representations
Achieving machine learning that is both scalable and interpretable
Innovation

Methods, ideas, or system contributions that make the work stand out.

Discrete parameter space probabilistic inference
Tractable learning via probabilistic circuits (see the sketch after this list)
Discrete approximations for scalable computations
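
As a hint of why probabilistic circuits make the learning tractable, below is a toy sum-product circuit over two bits: evaluating it bottom-up yields exact likelihoods, and marginalizing a bit only requires setting its leaf to 1. The structure and parameters here are invented for illustration and are not taken from the paper.

```python
import itertools

# Sum-node weights and per-component Bernoulli leaves P(bit_i = 1).
mix = [0.7, 0.3]
leaf_p = [[0.9, 0.2], [0.1, 0.8]]

def pc_prob(b0, b1):
    """Exact probability under the circuit; pass None for a bit to marginalize it."""
    total = 0.0
    for w, (p0, p1) in zip(mix, leaf_p):
        l0 = 1.0 if b0 is None else (p0 if b0 == 1 else 1 - p0)
        l1 = 1.0 if b1 is None else (p1 if b1 == 1 else 1 - p1)
        total += w * l0 * l1  # product node per component, then weighted sum node
    return total

# The circuit defines a normalized distribution over all 2-bit strings ...
assert abs(sum(pc_prob(*b) for b in itertools.product([0, 1], repeat=2)) - 1.0) < 1e-9
# ... and exact marginals come from the same single bottom-up pass.
print(pc_prob(1, None))  # exact P(bit_0 = 1) = 0.66
```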