🤖 AI Summary
This work addresses directed acyclic graph (DAG) structure learning from zero-inflated count data. We propose ZICO, the first framework unifying zero-inflated generalized linear models (ZIGLMs) with continuous differentiable DAG optimization. ZICO jointly optimizes a smooth score-based objective via a differentiable acyclicity constraint (e.g., TRACE), L1 sparsity regularization, and fully vectorized mini-batch training. Crucially, we design a novel differentiable zero-inflated likelihood function, substantially improving computational efficiency and numerical stability for high-dimensional settings. Experiments demonstrate that ZICO achieves higher structural accuracy and faster convergence on synthetic benchmarks; in gene regulatory network inference, it matches or surpasses state-of-the-art methods; and it scales to thousands of variables, confirming practical applicability.
📝 Abstract
We address network structure learning from zero-inflated count data by casting each node as a zero-inflated generalized linear model and optimizing a smooth, score-based objective under a directed acyclic graph constraint. Our Zero-Inflated Continuous Optimization (ZICO) approach uses node-wise likelihoods with canonical links and enforces acyclicity through a differentiable surrogate constraint combined with sparsity regularization. ZICO achieves superior performance with faster runtimes on simulated data. It also performs comparably to or better than common algorithms for reverse engineering gene regulatory networks. ZICO is fully vectorized and mini-batched, enabling learning on larger variable sets with practical runtimes in a wide range of domains.