AI Summary
Traditional information bottleneck (IB) methods rely solely on statistical correlations to construct abstractions, failing to preserve causal structure and thus limiting their applicability to causal tasks. To address this, we propose the Causal Information Bottleneck (CIB), the first IB framework explicitly extended to the causal domain: it compresses the source variable while directly optimizing for causal control over the target, rather than mere predictive accuracy. Grounded in causal graphical models, CIB jointly models mutual information, conditional mutual information, and interventional distributions, enabling differentiable optimization via variational inference. Theoretically, CIB abstractions satisfy causal sufficiency. Empirically, on synthetic and semi-real datasets, CIB accurately identifies essential causal pathways and significantly outperforms standard IB and state-of-the-art causal baselines in both intervention reasoning and causal discovery tasks.
Abstract
To effectively study complex causal systems, it is often useful to construct abstractions of parts of the system by discarding irrelevant details while preserving key features. The Information Bottleneck (IB) method is a widely used approach for constructing variable abstractions by compressing random variables while retaining predictive power over a target variable. Traditional methods like IB are purely statistical and ignore underlying causal structures, making them ill-suited for causal tasks. We propose the Causal Information Bottleneck (CIB), a causal extension of the IB, which compresses a set of chosen variables while maintaining causal control over a target variable. This method produces abstractions of (sets of) variables that are causally interpretable, provide insight into the interactions between the abstracted variables and the target variable, and can be used when reasoning about interventions. We present experimental results demonstrating that the learned abstractions accurately capture causal relations as intended.
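For context, the standard IB objective below is the well-known formulation; the second display is only an illustrative sketch of how CIB's idea of "causal control" might replace the observational predictive term. The notation $I^{\mathrm{do}}(T \to Y)$ is a placeholder of ours, not the paper's actual definition of its causal term:

```latex
% Standard Information Bottleneck: compress X into an abstraction T
% while retaining predictive information about the target Y.
\min_{q(t \mid x)} \; I(X; T) \;-\; \beta \, I(T; Y)

% CIB (illustrative sketch only): swap the observational term I(T; Y)
% for a measure of causal control, i.e. how much Y responds to
% interventions on the abstraction, do(T = t). The symbol below is a
% hypothetical placeholder for such an interventional quantity.
\min_{q(t \mid x)} \; I(X; T) \;-\; \beta \, I^{\mathrm{do}}(T \to Y)
```

The trade-off parameter $\beta$ plays the same role in both displays: larger $\beta$ favors retaining (causal) information about the target over compressing the source.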