Radial-VCReg: More Informative Representation Learning Through Radial Gaussianization

📅 2026-02-15
📈 Citations: 1
Influential: 0
🤖 AI Summary
This work addresses the challenge in self-supervised learning that explicit mutual-information maximization is intractable for high-dimensional representations, so existing methods often fall short of maximum entropy. To this end, the authors propose a radial Gaussianization loss that aligns the feature norms with the Chi distribution, thereby expanding the class of feature distributions that can be transformed toward a standard normal distribution. This approach attenuates higher-order dependencies and enhances representation diversity. Integrated into the VCReg framework, the method regularizes the statistical properties of features so that they more comprehensively approximate a high-dimensional Gaussian. Experiments on both synthetic and real-world datasets demonstrate consistent improvements in the informativeness and discriminability of the learned representations.

📝 Abstract
Self-supervised learning aims to learn maximally informative representations, but explicit information maximization is hindered by the curse of dimensionality. Existing methods like VCReg address this by regularizing first and second-order feature statistics, which cannot fully achieve maximum entropy. We propose Radial-VCReg, which augments VCReg with a radial Gaussianization loss that aligns feature norms with the Chi distribution, a defining property of high-dimensional Gaussians. We prove that Radial-VCReg transforms a broader class of distributions towards normality compared to VCReg and show on synthetic and real-world datasets that it consistently improves performance by reducing higher-order dependencies and promoting more diverse and informative representations.
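The key property the abstract leans on is that if z ~ N(0, I_d), then ||z|| follows the Chi distribution with d degrees of freedom. A minimal sketch of a norm-alignment penalty in that spirit is below; this is a hypothetical moment-matching stand-in, not the paper's actual loss, and `radial_loss` / `chi_moments` are illustrative names of our own.

```python
import math
import numpy as np

def chi_moments(d):
    """Mean and variance of the Chi distribution with d degrees of freedom.

    For z ~ N(0, I_d): E[||z||] = sqrt(2) * Gamma((d+1)/2) / Gamma(d/2),
    and Var[||z||] = d - E[||z||]**2. Computed in log space for stability.
    """
    log_mean = 0.5 * math.log(2.0) + math.lgamma((d + 1) / 2) - math.lgamma(d / 2)
    mean = math.exp(log_mean)
    return mean, d - mean ** 2

def radial_loss(features):
    """Moment-matching gap between empirical feature norms and Chi(d).

    Penalizes the squared difference between the first two moments of the
    per-sample norms ||x_i|| and those implied by a standard Gaussian in d
    dimensions. A simple illustrative surrogate for a radial Gaussianization
    objective; the paper's actual loss may differ.
    """
    _, d = features.shape
    norms = np.linalg.norm(features, axis=1)
    mean, var = chi_moments(d)
    return (norms.mean() - mean) ** 2 + (norms.var() - var) ** 2

rng = np.random.default_rng(0)
gauss = rng.standard_normal((4096, 128))  # features already matching N(0, I)
scaled = gauss * 2.0                      # same shape, wrong radial scale
print(radial_loss(gauss) < radial_loss(scaled))  # Gaussian batch fits Chi(128) better
```

In a training loop, a differentiable version of this penalty (e.g. in PyTorch) would simply be added to the variance and covariance terms of the VCReg objective.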
Problem

Research questions and friction points this paper is trying to address.

self-supervised learning
representation learning
maximum entropy
curse of dimensionality
higher-order dependencies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Radial Gaussianization
Self-supervised learning
Representation learning
VCReg
Chi distribution