The Gradient of Algebraic Model Counting

πŸ“… 2025-02-25
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This paper addresses the lack of a unified account of gradient computation, the high memory overhead of backpropagation, and limited optimization efficiency in statistical relational learning and neurosymbolic AI. We propose a differentiable logical inference framework grounded in algebraic model counting over generalized semirings: gradients and backpropagation are lifted to semiring-valued computations, and the cancellation and ordering properties of a semiring are exploited for memory-efficient backpropagation over tractable Boolean circuits such as d-DNNFs. This unifies gradient computation across logic programming, probabilistic logical models, and neurosymbolic models, and yields novel variations of first-order optimization algorithms. Experiments demonstrate considerably faster gradient computation than existing approaches; however, the paper also shows that algebraic model counting on tractable circuits does not yield a corresponding speedup for second-order (Hessian-based) optimization.
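To make the semiring view of gradients concrete, here is a minimal sketch (not the paper's code) of algebraic model counting in a gradient semiring: circuit values are dual numbers pairing a value with a derivative, and the semiring's addition and multiplication follow the sum and product rules. The formula, probabilities, and class names are illustrative assumptions.

```python
# Minimal sketch of AMC in a gradient semiring (dual numbers), assuming a
# tiny hand-built d-DNNF for (a AND b) OR (NOT a AND c). Illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Dual:
    """Gradient-semiring element: (value, derivative w.r.t. one weight)."""
    val: float
    grad: float

    def __add__(self, other):  # semiring plus (deterministic OR): sum rule
        return Dual(self.val + other.val, self.grad + other.grad)

    def __mul__(self, other):  # semiring times (decomposable AND): product rule
        return Dual(self.val * other.val,
                    self.grad * other.val + self.val * other.grad)

p_a, p_b, p_c = 0.3, 0.6, 0.9
a     = Dual(p_a, 1.0)        # differentiate w.r.t. p_a
not_a = Dual(1.0 - p_a, -1.0)
b     = Dual(p_b, 0.0)
c     = Dual(p_c, 0.0)

wmc = a * b + not_a * c       # single forward pass over the circuit
print(wmc.val)                # P(formula)  = 0.3*0.6 + 0.7*0.9 = 0.81
print(wmc.grad)               # dP/dp_a     = 0.6 - 0.9         = -0.3
```

Swapping `Dual` for another semiring leaves the circuit traversal unchanged, which is the kind of unification the summary refers to.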

πŸ“ Abstract
Algebraic model counting unifies many inference tasks on logic formulas by exploiting semirings. Rather than focusing on inference, we consider learning, especially in statistical-relational and neurosymbolic AI, which combine logical, probabilistic and neural representations. Concretely, we show that the very same semiring perspective of algebraic model counting also applies to learning. This allows us to unify various learning algorithms by generalizing gradients and backpropagation to different semirings. Furthermore, we show how cancellation and ordering properties of a semiring can be exploited for more memory-efficient backpropagation. This allows us to obtain some interesting variations of state-of-the-art gradient-based optimisation methods for probabilistic logical models. We also discuss why algebraic model counting on tractable circuits does not lead to more efficient second-order optimization. Empirically, our algebraic backpropagation exhibits considerable speed-ups as compared to existing approaches.
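As a hedged illustration of the abstract's claim that backpropagation generalizes to semirings, the sketch below evaluates a circuit bottom-up in an arbitrary commutative semiring and then accumulates every node's adjoint top-down using only the semiring's plus and times. The `Node` class, traversal, and example weights are assumptions for this sketch, not the paper's implementation.

```python
import operator

class Node:
    """Circuit node: kind is 'leaf', 'and', or 'or'; leaves carry a weight."""
    def __init__(self, kind, children=(), weight=None):
        self.kind, self.children, self.weight = kind, tuple(children), weight

def topo(root):
    """Children-before-parents ordering of the circuit DAG."""
    seen, order = set(), []
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for ch in n.children:
                visit(ch)
            order.append(n)
    visit(root)
    return order

def evaluate(order, plus, times, zero, one):
    """Forward pass: semiring value of every node."""
    val = {}
    for n in order:
        if n.kind == "leaf":
            val[id(n)] = n.weight
        else:
            op, acc = (plus, zero) if n.kind == "or" else (times, one)
            for ch in n.children:
                acc = op(acc, val[id(ch)])
            val[id(n)] = acc
    return val

def backprop(order, val, plus, times, zero, one):
    """Backward pass: adjoint of the root w.r.t. every node."""
    adj = {id(n): zero for n in order}
    adj[id(order[-1])] = one                  # root comes last in topo order
    for n in reversed(order):                 # visit parents before children
        for ch in n.children:
            if n.kind == "or":                # d(x + y)/dx = 1
                contrib = adj[id(n)]
            else:                             # d(x * y)/dx = product of siblings
                sib = one
                for s in n.children:
                    if s is not ch:
                        sib = times(sib, val[id(s)])
                contrib = times(adj[id(n)], sib)
            adj[id(ch)] = plus(adj[id(ch)], contrib)
    return adj

# Same illustrative circuit: (a AND b) OR (NOT a AND c)
a, na = Node("leaf", weight=0.3), Node("leaf", weight=0.7)
b, c = Node("leaf", weight=0.6), Node("leaf", weight=0.9)
root = Node("or", [Node("and", [a, b]), Node("and", [na, c])])
order = topo(root)
val = evaluate(order, operator.add, operator.mul, 0.0, 1.0)
adj = backprop(order, val, operator.add, operator.mul, 0.0, 1.0)
print(val[id(root)], adj[id(a)])  # 0.81, dWMC/dw_a = 0.6
```

With the probability semiring this reproduces weighted-model-counting gradients; substituting another commutative semiring changes only the four `plus`/`times`/`zero`/`one` parameters.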
Problem

Research questions and friction points this paper is trying to address.

No unified view of learning across statistical-relational and neurosymbolic AI
Gradients and backpropagation are defined only for standard arithmetic, not for general semirings
Backpropagation over logical circuits incurs high memory overhead
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unifies learning algorithms via the semiring perspective of algebraic model counting
Generalizes gradients and backpropagation to various semirings
Exploits semiring cancellation and ordering for memory-efficient backpropagation (see the sketch after this list)
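As a hedged illustration of the cancellation point above: in a semiring that admits division, such as the nonnegative reals, the sibling product at an AND node equals val[node] / val[child], so the backward pass can avoid forming per-child sibling products. The function below reuses the `Node`, `topo`, and `evaluate` helpers from the sketch under the abstract; zero-valued children would need a fallback (omitted for brevity), and the paper's general construction may differ.

```python
def backprop_cancellative(order, val):
    """Backward pass over the probability semiring, exploiting division:
    at an AND node, the product of a child's siblings is val[n] / val[c]."""
    adj = {id(n): 0.0 for n in order}
    adj[id(order[-1])] = 1.0                  # root adjoint
    for n in reversed(order):
        for ch in n.children:
            if n.kind == "or":
                adj[id(ch)] += adj[id(n)]
            elif val[id(ch)] != 0.0:          # cancellation: divide the child out
                adj[id(ch)] += adj[id(n)] * val[id(n)] / val[id(ch)]
    return adj

# Reusing `order` and `val` from the earlier sketch:
# backprop_cancellative(order, val)[id(a)]   -> 0.6, as before
```

Each AND node now does constant work per child instead of rebuilding sibling products, and no per-child intermediates need to be kept; this is the kind of saving the cancellation property enables.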
Jaron Maene
KU Leuven
neurosymbolic AI Β· probabilistic programming
L. D. Raedt
KU Leuven, Belgium; Γ–rebro University, Sweden