🤖 AI Summary
This work addresses the challenge of analytically computing conditional densities in belief propagation for Markov process models, which typically necessitates approximation or sampling due to intractability. The authors propose a novel modeling framework based on sparse sum-of-squares (SoS) functions, enabling analytical estimation of conditional densities and efficient belief propagation by enforcing the non-negativity and normalization constraints exactly through the SoS parameterization. The approach jointly learns basis functions and coefficients, integrating function approximation with constrained optimization. Experimental results show accuracy comparable to state-of-the-art approaches with reduced memory consumption in low-dimensional systems and, for the first time, extend analytical belief propagation to 12-dimensional systems, well beyond the two-dimensional limit of existing techniques.
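To make the summary's key idea concrete, here is a minimal toy sketch (our own construction, not the paper's architecture) of how an SoS parameterization enforces non-negativity and normalization exactly: a density on [0, 1] is written as p(x) = φ(x)ᵀAφ(x) with A = LLᵀ (positive semidefinite, so p ≥ 0 everywhere), and the normalization ∫p = tr(AM) is enforced analytically via the basis moment matrix M. The monomial basis and random coefficients are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of a sum-of-squares (SoS) density on [0, 1]:
#   p(x) = phi(x)^T A phi(x),  A = L L^T  (PSD, so p(x) >= 0 everywhere),
# with A rescaled so that the integral of p equals tr(A M) = 1 exactly.

rng = np.random.default_rng(0)
d = 4                      # monomial basis phi(x) = [1, x, x^2, x^3]
idx = np.arange(d)

# Basis moment matrix M_ij = integral_0^1 x^i x^j dx = 1 / (i + j + 1).
M = 1.0 / (idx[:, None] + idx[None, :] + 1)

# Random PSD coefficient matrix, then enforce normalization analytically:
# integral of p = sum_ij A_ij M_ij = tr(A M), so dividing by tr(A M) works.
L = rng.standard_normal((d, d))
A = L @ L.T
A /= np.trace(A @ M)

def p(x):
    phi = np.power.outer(x, idx)              # rows are [1, x, x^2, x^3]
    return np.einsum('ni,ij,nj->n', phi, A, phi)

xs = np.linspace(0.0, 1.0, 4001)
vals = p(xs)
h = xs[1] - xs[0]
integral = h * (vals.sum() - 0.5 * (vals[0] + vals[-1]))  # trapezoid rule
assert vals.min() >= 0.0            # non-negativity holds by construction
assert abs(integral - 1.0) < 1e-3   # normalization holds analytically
print("SoS density is non-negative and normalized")
```

Note that both constraints hold by construction rather than by penalty terms, which is the property the summary attributes to the SoS parameterization.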
📝 Abstract
Harnessing the predictive capability of Markov process models requires propagating probability density functions (beliefs) through the model. For many existing models, however, belief propagation is analytically intractable, so predictions must be generated by approximation or sampling. This paper proposes a functional modeling framework leveraging sparse Sum-of-Squares (SoS) forms for valid (conditional) density estimation. We study the theoretical limitations of modeling conditional densities with the SoS form and propose a novel functional form that addresses them. The proposed architecture enables generalized simultaneous learning of basis functions and coefficients while preserving analytical belief propagation. In addition, we propose a training method that enforces the normalization and non-negativity constraints exactly. Our results show that the proposed method achieves accuracy comparable to state-of-the-art approaches while requiring significantly less memory in low-dimensional spaces, and that it scales to 12-dimensional systems where existing methods fail beyond two dimensions.
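The abstract's central claim is that the functional form preserves analytical belief propagation. A toy illustration of why polynomial (and hence SoS) beliefs admit this (our own construction, with an assumed kernel, not the paper's model): for a polynomial belief and a polynomial transition kernel, the update b'(x') = ∫ p(x'|x) b(x) dx reduces to finitely many moments of b, all computable in closed form.

```python
import numpy as np

# Toy illustration: belief on [0, 1] as polynomial coefficients (low order
# first): b(x) = 2x, a valid density on [0, 1].
c = np.array([0.0, 2.0])

# Assumed transition kernel p(x'|x) = 1 + eps*(2x - 1)*(2x' - 1); for
# |eps| <= 1 it is non-negative on [0, 1]^2 and integrates to 1 over x'.
eps = 0.5

# Moments of b: mu_k = integral_0^1 x^k b(x) dx = sum_m c[m] / (m + k + 1).
def moment(k):
    m = np.arange(len(c))
    return float(np.sum(c / (m + k + 1)))

mu0, mu1 = moment(0), moment(1)   # mass (= 1) and mean of the belief

# Analytical update: b'(x') = mu0 + eps*(2x' - 1)*(2*mu1 - mu0),
# again a polynomial with closed-form coefficients -- no sampling needed.
def b_new(xp):
    return mu0 + eps * (2 * xp - 1) * (2 * mu1 - mu0)

# Cross-check against direct numerical integration of the update integral.
xs = np.linspace(0.0, 1.0, 4001)
h = xs[1] - xs[0]
bx = np.polynomial.polynomial.polyval(xs, c)
for xp in (0.1, 0.5, 0.9):
    f = (1 + eps * (2 * xs - 1) * (2 * xp - 1)) * bx
    numeric = h * (f.sum() - 0.5 * (f[0] + f[-1]))   # trapezoid rule
    assert abs(numeric - b_new(xp)) < 1e-6
print("analytical and numerical propagation agree")
```

The paper's contribution goes further, jointly learning the basis and coefficients while keeping such closed-form updates; this sketch only shows the mechanism that makes closed-form propagation possible at all.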