Spectral Imbalance Causes Forgetting in Low-Rank Continual Adaptation

📅 2026-01-31
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work traces catastrophic forgetting in low-rank continual adaptation to imbalanced singular value spectra, which induce severe forward and backward interference. The study is the first to identify this spectral imbalance as a root cause of forgetting and proposes a constrained optimization framework that decouples the magnitude and direction of task-specific updates on a restricted Stiefel manifold. By explicitly enforcing spectral balance through this geometric constraint, the method mitigates catastrophic forgetting while remaining compatible with mainstream deep learning optimizers via a projected first-order optimization scheme. Extensive experiments across multiple continual learning benchmarks show that the proposed approach consistently outperforms existing low-rank adaptation strategies in both stability and accuracy.
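As a rough illustration of the spectral imbalance described above, the sketch below (hypothetical, not taken from the paper's code) builds a synthetic LoRA-style low-rank update and measures how much of its adaptation energy the top singular component absorbs:

```python
import numpy as np

# Hypothetical illustration: a LoRA-style update dW = B @ A of rank r
# whose singular value spectrum is deliberately skewed, mimicking the
# imbalance the paper identifies as a cause of forgetting.
rng = np.random.default_rng(0)
d, r = 64, 8
B = rng.standard_normal((d, r))
# Geometric decay across components produces a highly imbalanced spectrum.
A = np.diag(2.0 ** -np.arange(r)) @ rng.standard_normal((r, d))
dW = B @ A

# Singular values of the update (rank <= r, so the top r suffice).
s = np.linalg.svd(dW, compute_uv=False)[:r]
energy = s**2 / np.sum(s**2)  # fraction of adaptation energy per component
print(f"top-1 energy share: {energy[0]:.2f}")  # well above the balanced 1/r
```

A perfectly balanced update would spread energy uniformly (each component holding 1/r); the paper's point is that unconstrained low-rank adaptation drifts far from this regime.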

📝 Abstract
Parameter-efficient continual learning aims to adapt pre-trained models to sequential tasks without forgetting previously acquired knowledge. Most existing approaches treat continual learning as avoiding interference with past updates, rather than considering what properties make the current task-specific update naturally preserve previously acquired knowledge. From a knowledge-decomposition perspective, we observe that low-rank adaptations exhibit highly imbalanced singular value spectra: a few dominant components absorb most of the adaptation energy, making the update (i) more likely to disrupt previously acquired knowledge and (ii) more vulnerable to interference from subsequent tasks. To enable explicit balance among components, we decouple the magnitude of the task update from its directional structure and formulate it as a constrained optimization problem on a restricted Stiefel manifold. We address this problem using a projected first-order method compatible with standard deep-learning optimizers used in vision-language models. Our method mitigates both backward and forward forgetting, consistently outperforming continual learning baselines. The implementation code is available at https://github.com/haodotgu/EBLoRA.
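The projected first-order scheme in the abstract can be sketched in spirit as follows. This is a generic polar-retraction step onto the set of matrices with orthonormal columns (a standard Stiefel-manifold projection), not the authors' EBLoRA implementation; all names and hyperparameters here are illustrative assumptions:

```python
import numpy as np

def stiefel_project(M):
    """Nearest matrix with orthonormal columns (polar projection via SVD)."""
    U, _, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ Vt

def projected_step(Q, grad, lr=0.1):
    """One projected first-order step: take a plain gradient step in
    Euclidean space, then retract back onto the Stiefel manifold, so the
    directional factor stays orthonormal while magnitude is handled
    separately."""
    return stiefel_project(Q - lr * grad)

rng = np.random.default_rng(1)
d, r = 16, 4
Q = stiefel_project(rng.standard_normal((d, r)))  # initial point on the manifold
g = rng.standard_normal((d, r))                   # a dummy Euclidean gradient
Q_next = projected_step(Q, g)
# Columns remain orthonormal after the step.
print(np.allclose(Q_next.T @ Q_next, np.eye(r)))
```

Because the retraction is just an extra SVD after each update, a step like this composes with standard optimizers (SGD, AdamW) rather than replacing them, which matches the compatibility claim in the abstract.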
Problem

Research questions and friction points this paper is trying to address.

spectral imbalance
catastrophic forgetting
low-rank adaptation
continual learning
knowledge preservation
Innovation

Methods, ideas, or system contributions that make the work stand out.

spectral imbalance
low-rank adaptation
constrained optimization
Stiefel manifold
continual learning
Hao Gu
Sun Yat-Sen University
Planetary aeronomy · Atmospheric escape · Space physics
Mao-Lin Luo
School of Computer Science and Engineering, Southeast University, Nanjing 210096, China; Key Laboratory of Computer Network and Information Integration (Southeast University), Ministry of Education, China
Zi-Hao Zhou
School of Computer Science and Engineering, Southeast University, Nanjing 210096, China; Key Laboratory of Computer Network and Information Integration (Southeast University), Ministry of Education, China
Han-Chen Zhang
School of Computer Science and Engineering, Southeast University, Nanjing 210096, China; Key Laboratory of Computer Network and Information Integration (Southeast University), Ministry of Education, China
Min-Ling Zhang
Professor, School of Computer Science and Engineering, Southeast University, China
Artificial Intelligence · Machine Learning · Data Mining
Tong Wei
Southeast University
Machine Learning