Causal Neural Probabilistic Circuits

📅 2026-03-01
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work proposes the Causal Neural Probabilistic Circuit (CNPC), a novel approach that integrates causal probabilistic circuits into concept bottleneck models to address a key limitation: interventions that ignore causal dependencies among concepts. CNPC combines a neural attribute predictor with a causal probabilistic circuit compiled from a causal graph, and fuses the predictor's distribution with the marginal distribution under causal intervention via a product-of-experts mechanism, yielding interpretable predictions that respect the underlying causal structure. The method supports exact and tractable causal intervention inference and includes a theoretical analysis of compositional intervention error. Experiments on five benchmark datasets demonstrate that CNPC significantly outperforms five baseline models under both in-distribution and out-of-distribution settings, achieving notably higher task accuracy in multi-attribute intervention scenarios.

📝 Abstract
Concept Bottleneck Models (CBMs) enhance the interpretability of end-to-end neural networks by introducing a layer of concepts and predicting the class label from the concept predictions. A key property of CBMs is that they support interventions, i.e., domain experts can correct mispredicted concept values at test time to improve the final accuracy. However, typical CBMs apply interventions by overwriting only the corrected concept while leaving other concept predictions unchanged, which ignores causal dependencies among concepts. To address this, we propose the Causal Neural Probabilistic Circuit (CNPC), which combines a neural attribute predictor with a causal probabilistic circuit compiled from a causal graph. This circuit supports exact, tractable causal inference that inherently respects causal dependencies. Under interventions, CNPC models the class distribution based on a Product of Experts (PoE) that fuses the attribute predictor's predictive distribution with the interventional marginals computed by the circuit. We theoretically characterize the compositional interventional error of CNPC w.r.t. its modules and identify conditions under which CNPC closely matches the ground-truth interventional class distribution. Experiments on five benchmark datasets in both in-distribution and out-of-distribution settings show that, compared with five baseline models, CNPC achieves higher task accuracy across different numbers of intervened attributes.
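The Product-of-Experts fusion described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function name and the use of plain NumPy arrays for the two class distributions are assumptions for illustration only. The idea is simply an element-wise product of the attribute predictor's predictive distribution with the circuit's interventional marginals, renormalized to a valid distribution:

```python
import numpy as np

def product_of_experts(p_predictor, p_interventional, eps=1e-12):
    """Fuse two class distributions (hypothetical PoE sketch).

    p_predictor:      class distribution from the neural attribute predictor
    p_interventional: interventional marginals from the causal circuit
    Returns their element-wise product, renormalized to sum to 1.
    """
    fused = p_predictor * p_interventional
    return fused / (fused.sum() + eps)

# Example: the circuit's interventional marginals sharpen the
# predictor's uncertain distribution toward the causally consistent class.
p_pred = np.array([0.5, 0.5])          # predictor is uncertain
p_intv = np.array([0.8, 0.2])          # intervention favors class 0
print(product_of_experts(p_pred, p_intv))  # → [0.8 0.2]
```

A PoE combination like this assigns high probability only to classes that both experts support, which is why an expert encoding causal structure can override a predictor that contradicts the intervention.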
Problem

Research questions and friction points this paper is trying to address.

Concept Bottleneck Models
causal dependencies
interventions
causal inference
interpretability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Causal Neural Probabilistic Circuit
Concept Bottleneck Models
Causal Inference
Intervention
Product of Experts