Differentiable Cyclic Causal Discovery Under Unmeasured Confounders

📅 2025-08-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Real-world causal systems often involve unobserved confounders and feedback loops, posing a fundamental challenge to conventional causal discovery methods that rely on strong assumptions of full observability and acyclicity. To address this, we propose DCCD-CONF—a novel framework for end-to-end differentiable learning of nonlinear, latent-variable–augmented, and cyclic causal graphs. Our approach jointly optimizes the graph structure and the latent confounder distribution via differentiable masking, neural networks for flexible nonlinear mechanism modeling, and an alternating maximum-likelihood optimization strategy. We provide theoretical guarantees on consistency under mild regularity conditions. Extensive experiments on synthetic benchmarks and real-world gene perturbation datasets demonstrate that DCCD-CONF achieves state-of-the-art performance in both causal graph recovery and latent confounder identification.
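The alternating scheme described above can be sketched in a drastically simplified linear-Gaussian form. This is an illustrative assumption throughout, not the paper's algorithm: DCCD-CONF uses neural networks for nonlinear mechanisms and a proper log-likelihood, whereas this toy uses a linear SEM, a squared-error surrogate that omits the log-determinant term, a ridge projection for the confounder, and made-up variable names.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: cyclic linear SEM  x = B x + c z + e  with latent confounder z.
d, n = 3, 2000
B_true = np.array([[0.0, 0.5, 0.0],
                   [0.0, 0.0, 0.4],
                   [0.3, 0.0, 0.0]])          # contains a directed cycle
c_true = np.array([0.8, 0.0, 0.8])            # z confounds x1 and x3
z = rng.normal(size=(n, 1))
e = 0.1 * rng.normal(size=(n, d))
X = (z * c_true + e) @ np.linalg.inv(np.eye(d) - B_true).T

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Learnable pieces (all hypothetical): edge weights W, mask logits M
# (sigmoid(M) softly gates edges, making edge selection differentiable),
# and confounder loadings c_hat. lam penalizes dense graphs.
W = 0.01 * rng.normal(size=(d, d))
M = np.zeros((d, d))
c_hat = rng.normal(size=d)
lr, lam, eps = 0.02, 0.01, 1e-3

def masked(W, M):
    A = sigmoid(M) * W
    np.fill_diagonal(A, 0.0)                  # no self-loops
    return A

def fit_loss():
    R = X - X @ masked(W, M).T
    z_hat = (R @ c_hat) / (c_hat @ c_hat + eps)
    E = R - np.outer(z_hat, c_hat)
    return np.mean(E ** 2)

loss_init = fit_loss()
for _ in range(800):
    A = masked(W, M)
    # Step 1: infer the confounder from current residuals (ridge projection).
    R = X - X @ A.T
    z_hat = (R @ c_hat) / (c_hat @ c_hat + eps)
    # Step 2: gradient step on W, M, c_hat for a squared-error surrogate of
    # the Gaussian log-likelihood (its log-det term is dropped for brevity).
    E = R - np.outer(z_hat, c_hat)
    gA = -(E.T @ X) / n                       # d(loss)/dA, A = sigmoid(M)*W
    np.fill_diagonal(gA, 0.0)
    S = sigmoid(M)
    gW = gA * S                               # chain rule through the mask
    gM = gA * W * S * (1 - S) + lam * S * (1 - S)
    W -= lr * gW
    M -= lr * gM
    c_hat -= lr * (-(E.T @ z_hat) / n)
loss_final = fit_loss()

print(f"surrogate loss: {loss_init:.3f} -> {loss_final:.3f}")
```

The point of the sketch is the structure of the alternation: a closed-form confounder step followed by a gradient step through the soft mask, so that graph structure and confounder distribution are optimized against the same likelihood surrogate.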

📝 Abstract
Understanding causal relationships between variables is fundamental across scientific disciplines. Most causal discovery algorithms rely on two key assumptions: (i) all variables are observed, and (ii) the underlying causal graph is acyclic. While these assumptions simplify theoretical analysis, they are often violated in real-world systems, such as biological networks. Existing methods that account for confounders either assume linearity or struggle with scalability. To address these limitations, we propose DCCD-CONF, a novel framework for differentiable learning of nonlinear cyclic causal graphs in the presence of unmeasured confounders using interventional data. Our approach alternates between optimizing the graph structure and estimating the confounder distribution by maximizing the log-likelihood of the data. Through experiments on synthetic data and real-world gene perturbation datasets, we show that DCCD-CONF outperforms state-of-the-art methods in both causal graph recovery and confounder identification. Additionally, we provide consistency guarantees for our framework, reinforcing its theoretical soundness.
Problem

Research questions and friction points this paper is trying to address.

Learning cyclic causal graphs with unmeasured confounders
Handling nonlinear relationships in causal discovery
Improving scalability and accuracy in confounder identification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Differentiable learning of cyclic causal graphs
Handles unmeasured confounders with interventional data
Alternately optimizes graph structure and confounder distribution via maximum likelihood
Muralikrishnna G. Sethuraman
School of Electrical & Computer Engineering, Georgia Institute of Technology
Faramarz Fekri
Georgia Tech
Information Theory · Wireless Communication · Neuro-Symbolic AI · Graphical Models · Reinforcement Learning