Equilibrium contrastive learning for imbalanced image classification

📅 2026-02-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limited generalization of existing contrastive learning methods in long-tailed image classification, which stems from misalignment between class prototypes and classifier weights as well as imbalanced contributions across categories. To overcome these issues, the authors propose Equilibrium Contrastive Learning (ECL), a framework that balances the contributions of class-averaged features and class prototypes while enforcing geometric equilibrium constraints that align classifier weights with class prototypes. This joint optimization of the representation space and the classifier encourages feature distributions to form a regular simplex structure. Extensive experiments on three long-tailed benchmarks (CIFAR-10/100-LT and ImageNet-LT) and two imbalanced medical datasets (ISIC 2019 and LCCT) show that ECL outperforms current state-of-the-art methods, with substantial accuracy gains under imbalanced scenarios.
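For context, the "regular simplex structure" mentioned above has a standard precise form in the neural-collapse literature (stated here as background, not quoted from the paper): for $K$ classes with globally centered, $\ell_2$-normalized class means $\hat{\mu}_1, \dots, \hat{\mu}_K$,

$$\hat{\mu}_i^{\top}\hat{\mu}_j = \begin{cases} 1, & i = j,\\ -\tfrac{1}{K-1}, & i \neq j, \end{cases}$$

i.e., within-class features collapse onto their class mean, and the $K$ means are maximally and uniformly separated with pairwise cosine similarity $-1/(K-1)$.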

📝 Abstract
Contrastive learning (CL) is a predominant technique in image classification, but existing CL methods show limited performance on imbalanced datasets. Recently, several supervised CL methods have been proposed to promote an ideal regular simplex geometric configuration in the representation space, characterized by intra-class feature collapse and uniform inter-class mean spacing, especially for imbalanced datasets. In particular, existing prototype-based methods include class prototypes as additional samples so that all classes are considered. However, existing CL methods suffer from two limitations. First, they do not consider the alignment between class means/prototypes and classifiers, which can lead to poor generalization. Second, existing prototype-based methods treat each prototype as only one additional sample per class, so its influence depends on the number of class instances in a batch, causing unbalanced contributions across classes. To address these limitations, we propose Equilibrium Contrastive Learning (ECL), a supervised CL framework designed to promote geometric equilibrium, in which class features, means, and classifiers are harmoniously balanced under data imbalance. The proposed ECL framework has two main components. First, ECL promotes representation geometric equilibrium (i.e., a regular simplex geometry characterized by collapsed class samples and uniformly distributed class means) while balancing the contributions of class-average features and class prototypes. Second, ECL establishes a classifier-class center geometric equilibrium by aligning classifier weights and class prototypes. We ran experiments on three long-tailed datasets (CIFAR-10-LT, CIFAR-100-LT, and ImageNet-LT) and two imbalanced medical datasets (ISIC 2019 and our constructed LCCT dataset). Results show that ECL outperforms existing state-of-the-art (SOTA) supervised CL methods designed for imbalanced classification.
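To make the two components concrete, below is a minimal PyTorch-style sketch assuming a standard supervised contrastive setup. The function names, the equal instance/prototype weighting, the cosine-alignment term, and the temperature tau are illustrative assumptions, not the paper's exact losses.

# Minimal PyTorch-style sketch of the two ECL components described above.
# Function names, the equal instance/prototype weighting, and the cosine
# alignment term are illustrative assumptions, not the paper's exact losses.
import torch
import torch.nn.functional as F


def balanced_proto_contrastive(feats, labels, prototypes, tau=0.1):
    # feats: (B, d) instance features; labels: (B,) int64; prototypes: (K, d).
    # Each sample's own class prototype is an extra positive, and the
    # prototype term is weighted equally with the instance term regardless of
    # how many samples of that class appear in the batch.
    feats = F.normalize(feats, dim=1)
    protos = F.normalize(prototypes, dim=1)
    B, K = feats.size(0), protos.size(0)

    keys = torch.cat([feats, protos], dim=0)               # (B + K, d)
    logits = feats @ keys.t() / tau                        # (B, B + K)

    same_class = labels.view(-1, 1).eq(labels.view(1, -1))            # (B, B)
    self_mask = torch.eye(B, dtype=torch.bool, device=feats.device)
    pos_inst = same_class & ~self_mask                     # same-class samples
    pos_proto = F.one_hot(labels, K).bool()                # own prototype

    # drop self-similarity from the softmax denominator
    pad = torch.zeros(B, K, dtype=torch.bool, device=feats.device)
    logits = logits.masked_fill(torch.cat([self_mask, pad], dim=1),
                                float('-inf'))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)

    # instance term: mean log-probability over same-class batch samples
    inst_cnt = pos_inst.sum(1)
    inst_term = (log_prob[:, :B].masked_fill(~pos_inst, 0.0).sum(1)
                 / inst_cnt.clamp(min=1))
    # prototype term: log-probability of the sample's own class prototype
    proto_term = (log_prob[:, B:] * pos_proto.float()).sum(1)

    loss = -0.5 * (inst_term + proto_term)
    # anchors with no same-class batch sample fall back to the prototype only
    loss = torch.where(inst_cnt > 0, loss, -proto_term)
    return loss.mean()


def classifier_prototype_alignment(classifier_weight, prototypes):
    # Cosine-alignment penalty pulling each classifier weight vector toward
    # its class prototype; a stand-in for the classifier-class-center
    # equilibrium constraint (the paper's exact form may differ).
    w = F.normalize(classifier_weight, dim=1)              # (K, d)
    p = F.normalize(prototypes, dim=1)                     # (K, d)
    return (1.0 - (w * p).sum(dim=1)).mean()

In practice, both terms would typically be added to a standard classification loss with tuning weights, and the prototypes could be maintained as learnable vectors or running class means; these training details are likewise assumptions here rather than the paper's prescription.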
Problem

Research questions and friction points this paper is trying to address.

imbalanced image classification
contrastive learning
class prototypes
geometric equilibrium
long-tailed datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Equilibrium Contrastive Learning
imbalanced classification
regular simplex geometry
prototype alignment
geometric equilibrium
Sumin Roh
Department of Electrical and Computer Engineering (ECE), Sungkyunkwan University (SKKU), Suwon 16419, South Korea
Harim Kim
Department of Radiology, Samsung Medical Center (SMC), Seoul 06351, South Korea
Ho Yun Lee
Department of Radiology, SMC, Seoul 06351, South Korea, and School of Medicine, SKKU, Suwon 16419, South Korea
Il Yong Chun
Associate Professor of EEE, AI, ECE, ADE, SCE, DCE, & CNIR, Sungkyunkwan University
Artificial intelligence, Computer vision, Computational imaging