ViRN: Variational Inference and Distribution Trilateration for Long-Tailed Continual Representation Learning

📅 2025-07-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
Continual learning under long-tailed distributions suffers from catastrophic forgetting of old knowledge and poor adaptation to novel classes, especially tail classes. To address these dual challenges, we propose ViRN, the first framework to integrate variational inference with distributional trilateration in continual learning. ViRN employs a variational autoencoder to model class-conditional distributions, then uses Wasserstein-distance-guided neighborhood retrieval and geometric fusion to reconstruct discriminative representations for tail classes. A representation alignment mechanism additionally enforces cross-task semantic consistency. By jointly optimizing representation stability and plasticity, ViRN eases the stability–plasticity trade-off. Extensive experiments on six long-tailed image and speech classification benchmarks show that ViRN improves average accuracy by 10.24% over state-of-the-art methods.

📝 Abstract
Continual learning (CL) with long-tailed data distributions remains a critical challenge for real-world AI systems, where models must sequentially adapt to new classes while retaining knowledge of old ones, despite severe class imbalance. Existing methods struggle to balance stability and plasticity, often collapsing under extreme sample scarcity. To address this, we propose ViRN, a novel CL framework that integrates variational inference (VI) with distributional trilateration for robust long-tailed learning. First, we model class-conditional distributions via a Variational Autoencoder to mitigate bias toward head classes. Second, we reconstruct tail-class distributions via Wasserstein distance-based neighborhood retrieval and geometric fusion, enabling sample-efficient alignment of tail-class representations. Evaluated on six long-tailed classification benchmarks, including speech (e.g., rare acoustic events, accents) and image tasks, ViRN achieves a 10.24% average accuracy gain over state-of-the-art methods.
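The retrieval step described above hinges on comparing class-conditional distributions under the Wasserstein metric, which has a closed form for Gaussians. A minimal sketch, assuming each class is summarized by a diagonal Gaussian (mean and standard-deviation vectors; the function names and the diagonal simplification are illustrative, not taken from the paper):

```python
import numpy as np

def w2_diag(mu1, sig1, mu2, sig2):
    """Closed-form 2-Wasserstein distance between two diagonal Gaussians.

    sig1 and sig2 are standard-deviation vectors; for diagonal covariances
    W2^2 reduces to ||mu1 - mu2||^2 + ||sig1 - sig2||^2.
    """
    return np.sqrt(np.sum((mu1 - mu2) ** 2) + np.sum((sig1 - sig2) ** 2))

def retrieve_neighbors(tail, classes, k=3):
    """Return indices of the k class distributions nearest the tail class.

    tail and each entry of classes are (mean, std) pairs.
    """
    mu_t, sig_t = tail
    dists = [w2_diag(mu_t, sig_t, mu, sig) for mu, sig in classes]
    return np.argsort(dists)[:k]
```

In this picture, a tail class with few samples retrieves the head-class distributions that lie closest to it in Wasserstein space, and those neighbors anchor the subsequent reconstruction.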
Problem

Research questions and friction points this paper is trying to address.

Addresses continual learning with long-tailed data distributions
Balances stability and plasticity under extreme sample scarcity
Improves accuracy for rare classes via variational inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Variational Autoencoder models class-conditional distributions
Wasserstein distance retrieves and fuses tail-class distributions
Geometric fusion aligns tail-class representations efficiently
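The exact fusion rule is not spelled out in this summary; one plausible reading of "geometric fusion" is an inverse-Wasserstein-distance-weighted blend of the retrieved neighbor distributions with the scarce tail estimate, trilateration-style. A hedged sketch under that assumption (the weighting scheme and the `alpha` mixing parameter are hypothetical):

```python
import numpy as np

def fuse_tail(tail_mu, tail_sig, neighbors, dists, alpha=0.5):
    """Blend a scarce tail-class Gaussian with its retrieved neighbors.

    neighbors: list of (mean, std) pairs; dists: their Wasserstein
    distances to the tail class. Closer neighbors get larger weight
    (assumed inverse-distance weighting, not the paper's exact rule).
    """
    w = 1.0 / (np.asarray(dists, dtype=float) + 1e-8)
    w = w / w.sum()
    nb_mu = sum(wi * mu for wi, (mu, _) in zip(w, neighbors))
    nb_sig = sum(wi * sig for wi, (_, sig) in zip(w, neighbors))
    # alpha keeps some mass on the (noisy) direct tail estimate
    mu = alpha * tail_mu + (1 - alpha) * nb_mu
    sig = alpha * tail_sig + (1 - alpha) * nb_sig
    return mu, sig
```

The fused mean and spread then serve as the reconstructed tail-class distribution from which balanced representations can be drawn.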