Information Subtraction: Learning Representations for Conditional Entropy

๐Ÿ“… 2025-01-02
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses the challenge of modeling conditional entropy and conditional mutual information for continuous sensitive variables. We propose Info-Subtraction, the first conditional information disentanglement framework tailored to continuous variables. Methodologically, it leverages a generative information bottleneck to jointly optimize the target mutual information $I(X;Z)$ and the sensitive mutual information $I(S;Z)$ in the latent space, while introducing a novel continuous conditional contrastive mechanism that enables iterative extraction of representations carrying the semantics of conditional entropy. Our key contributions are: (i) the first theoretical extension of conditional information disentanglement to continuous-variable settings, yielding a principled, interpretable foundation for fair representation learning; (ii) significant performance gains on fair classification and cross-domain recognition benchmarks; and (iii) empirically verified high-fidelity encoding of conditional entropy semantics in the learned representations. The implementation is open-sourced and has been adopted by the research community.
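The core objective named above, maximizing the target information $I(X;Z)$ while minimizing the sensitive information $I(S;Z)$, can be illustrated with a minimal sketch. The snippet below is not the paper's implementation (see the linked repository for that); it is a toy surrogate that computes both mutual information terms exactly on discretized samples via a plug-in histogram estimator, with `lam` as a hypothetical trade-off weight:

```python
import numpy as np

def mutual_information(a, b, bins=8):
    """Plug-in estimate of I(a;b) in nats from a joint histogram."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    joint /= joint.sum()                      # joint distribution p(a, b)
    pa = joint.sum(axis=1, keepdims=True)     # marginal p(a), shape (bins, 1)
    pb = joint.sum(axis=0, keepdims=True)     # marginal p(b), shape (1, bins)
    nz = joint > 0                            # avoid log(0) on empty cells
    return float((joint[nz] * np.log(joint[nz] / (pa @ pb)[nz])).sum())

def info_subtraction_objective(x, s, z, lam=1.0, bins=8):
    """Toy surrogate of the Information Subtraction objective:
    reward information about the target x, penalize information
    about the sensitive variable s (lam is an assumed weight)."""
    return mutual_information(x, z, bins) - lam * mutual_information(s, z, bins)
```

As a sanity check, a representation `z` that copies the target `x` scores positively, while one that copies an independent sensitive variable `s` scores negatively, which is the direction the paper's generative architecture optimizes in.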

๐Ÿ“ Abstract
The representations of conditional entropy and conditional mutual information are significant in explaining the unique effects among variables. While previous studies based on conditional contrastive sampling have effectively removed information regarding discrete sensitive variables, they have not yet extended their scope to continuous cases. This paper introduces Information Subtraction, a framework designed to generate representations that preserve desired information while eliminating the undesired. We implement a generative-based architecture that outputs these representations by simultaneously maximizing an information term and minimizing another. With its flexibility in disentangling information, we can iteratively apply Information Subtraction to represent arbitrary information components between continuous variables, thereby explaining the various relationships that exist between them. Our results highlight the representations' ability to provide semantic features of conditional entropy. By subtracting sensitive and domain-specific information, our framework demonstrates effective performance in fair learning and domain generalization. The code for this paper is available at https://github.com/jh-liang/Information-Subtraction
Problem

Research questions and friction points this paper is trying to address.

Conditional Entropy
Conditional Mutual Information
Fair Learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Information Subtraction
Continuous Variables
Fair Learning
๐Ÿ”Ž Similar Papers
No similar papers found.