Probabilistic Circuits for Knowledge Graph Completion with Reduced Rule Sets

๐Ÿ“… 2025-08-08
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Existing rule-based knowledge graph completion methods offer interpretability but rely on large rule sets, compromising both interpretability and efficiency. This paper introduces the concept of *rule context*โ€”a cohesive subset of collaboratively functioning rules modeled as a probabilistic circuitโ€”enabling traceable probabilistic inference without independence assumptions while preserving logical semantics. By learning the probability distribution over rule contexts, our approach drastically compresses the rule set while supporting both exact and approximate query probability estimation. Evaluated on eight benchmark datasets, our method achieves performance comparable to state-of-the-art baselines (e.g., AnyBURL) using only 4โ€“30% of their rules; with merely 4% of the rules, it retains 91% of peak accuracy. Moreover, inference speed improves by up to 31ร—. The framework thus unifies high efficiency, faithful interpretability, and rigorous probabilistic semantics.

๐Ÿ“ Abstract
Rule-based methods for knowledge graph completion provide explainable results but often require a significantly large number of rules to achieve competitive performance. This can hinder explainability due to overwhelmingly large rule sets. We discover rule contexts (meaningful subsets of rules that work together) from training data and use a learned probability distribution (i.e., a probabilistic circuit) over these rule contexts to more rapidly achieve the performance of the full rule set. Our approach achieves a 70-96% reduction in the number of rules used while outperforming the baseline by up to 31× when using an equivalent minimal number of rules, and it preserves 91% of peak baseline performance even when comparing our minimal rule sets against the baseline's full rule sets. We show that our framework is grounded in well-known semantics of probabilistic logic, does not require independence assumptions, and that our tractable inference procedure provides both approximate lower bounds and the exact probability of a given query. The efficacy of our method is validated by empirical studies on 8 standard benchmark datasets, where we show competitive performance using only a fraction of the rules required by AnyBURL's standard inference method, the current state of the art for rule-based knowledge graph completion. This work may have further implications for general probabilistic reasoning over learned sets of rules.
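The core idea above can be illustrated with a minimal, hypothetical sketch: a query's probability is a mixture over rule contexts (a sum node over context children), so exact inference sums every context's contribution, and a lower bound follows from evaluating only the most probable contexts, since dropping nonnegative mixture terms can only underestimate. All names, the noisy-OR scoring of a query within a context, and the top-k truncation are illustrative assumptions, not the paper's actual algorithm.

```python
def query_probability(contexts, weights, fired_rules):
    """Exact inference: P(query) = sum_c P(c) * P(query | c).

    Each context is modeled (illustratively) as a dict mapping rule id to
    confidence; P(query | c) is a noisy-OR over the rules in c that fire
    for the query. This is a hypothetical sketch, not the paper's method.
    """
    total = 0.0
    for ctx, w in zip(contexts, weights):
        p_fail = 1.0  # probability that no firing rule in this context succeeds
        for rule_id, conf in ctx.items():
            if rule_id in fired_rules:
                p_fail *= 1.0 - conf
        total += w * (1.0 - p_fail)
    return total


def lower_bound(contexts, weights, fired_rules, top_k):
    """Approximate lower bound: keep only the top_k highest-weight contexts.

    Every mixture term is nonnegative, so truncating the sum can only
    underestimate the exact query probability.
    """
    ranked = sorted(zip(weights, contexts), key=lambda wc: -wc[0])[:top_k]
    return query_probability([c for _, c in ranked],
                             [w for w, _ in ranked],
                             fired_rules)
```

For example, with two contexts weighted 0.7 and 0.3, the exact probability sums both terms, while `lower_bound(..., top_k=1)` evaluates only the dominant context and returns a value no greater than the exact one.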
Problem

Research questions and friction points this paper is trying to address.

Reducing rule set size for knowledge graph completion
Maintaining performance with minimal explainable rules
Using probabilistic circuits over rule contexts efficiently
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses probabilistic circuits over rule contexts
Reduces rule sets by 70-96% while maintaining performance
Provides tractable inference without independence assumptions
๐Ÿ”Ž Similar Papers
No similar papers found.