Two Is Better Than One: Rotations Scale LoRAs

📅 2025-05-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional weighted-sum gating mechanisms in multi-LoRA fusion suffer from limited expressivity, poor generalization, and underfitting, hindering the scalability of large language models (LLMs). To address this, we propose RadarGate: the first geometric gating mechanism based on learnable rotation matrices, which models collaborative relationships among LoRAs in angular space. RadarGate introduces comparative gating—adaptively aligning semantically similar representations while separating dissimilar ones—thereby overcoming the geometric expressivity bottleneck inherent in linear combinations. Our method integrates rotation-augmented LoRA representations, a geometric gating architecture, and a low-rank adaptive Mixture-of-Experts (MoE) framework. Evaluated across six public benchmarks and 21 diverse tasks, RadarGate significantly improves multi-LoRA scalability and effectively mitigates performance degradation caused by increasing expert count.

📝 Abstract
Scaling Low-Rank Adaptation (LoRA)-based Mixture-of-Experts (MoE) enables large language models (LLMs) to adapt efficiently to diverse tasks. However, traditional gating mechanisms that route inputs to the best experts may fundamentally hinder LLMs' scalability, leading to poor generalization and underfitting. We identify that the root cause lies in the restricted expressiveness of existing weighted-sum mechanisms, both within and outside the convex cone of LoRA representations. This motivates us to propose RadarGate, a novel geometrically inspired gating method that introduces rotational operations on LoRA representations to boost expressiveness and facilitate richer feature interactions among multiple LoRAs for scalable LLMs. Specifically, we first fuse each LoRA representation with the other LoRAs using a learnable component and then feed the output to a rotation matrix. This matrix involves learnable parameters that define the relative angular relationships between LoRA representations. Such a simple yet effective mechanism provides an extra degree of freedom, facilitating the learning of cross-LoRA synergies and properly tackling the challenging poor-generalization and underfitting issues as the number of LoRAs grows. Extensive experiments on 6 public benchmarks across 21 tasks show the effectiveness of our RadarGate for scaling LoRAs. We also provide valuable insights, revealing that the rotations applied to each pair of representations are contrastive: they encourage closer alignment of semantically similar representations during the geometric transformation while pushing dissimilar ones further apart. We will release our code to the community.
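To make the abstract's mechanism concrete, here is a minimal stdlib-Python sketch of rotation-based gating: each LoRA output is rotated (here, as consecutive 2-D coordinate pairs, a block-diagonal rotation) by a learnable angle before the gated weighted sum, giving the extra angular degree of freedom that a plain convex combination lacks. All function names, the per-LoRA angle parameterization, and the coordinate pairing are illustrative assumptions, not the paper's actual implementation.

```python
import math

def rotate_pair(x, y, theta):
    """Rotate the 2-D point (x, y) by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return c * x - s * y, s * x + c * y

def radar_gate_sketch(lora_outputs, gate_weights, thetas):
    """Hypothetical rotation-augmented gating (not the paper's code).

    lora_outputs: list of N LoRA output vectors (lists of floats, even dim)
    gate_weights: N scalar gate values (the usual weighted-sum part)
    thetas:       N learnable rotation angles, one per LoRA

    Each output is rotated block-diagonally (per coordinate pair) before
    the gated sum; with all thetas = 0 this reduces to plain weighted-sum
    gating, so the rotation is a strict generalization.
    """
    dim = len(lora_outputs[0])
    fused = [0.0] * dim
    for h, w, theta in zip(lora_outputs, gate_weights, thetas):
        for i in range(0, dim, 2):
            rx, ry = rotate_pair(h[i], h[i + 1], theta)
            fused[i] += w * rx
            fused[i + 1] += w * ry
    return fused
```

Because each rotation is orthogonal, it preserves the norm of every expert's contribution while reorienting it, which is one way to read the abstract's claim that rotations align similar representations and separate dissimilar ones without changing their magnitudes.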
Problem

Research questions and friction points this paper is trying to address.

Improving scalability of LoRA-based MoE in LLMs
Addressing poor generalization in traditional gating mechanisms
Enhancing expressiveness via rotational LoRA representations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces rotational operations for LoRA representations
Uses learnable rotation matrix for angular relationships
Enhances cross-LoRA synergies and generalization
🔎 Similar Papers

Hongcan Guo
BUPT, ByteDance Seed
Large Language Model, Reinforcement Learning, Mixture of Experts, Diffusion Model
Gu Nan
Beijing University of Posts and Telecommunications, China
Yuan Yang
Beijing University of Posts and Telecommunications, China
Diyang Zhang
Beijing University of Posts and Telecommunications, China
Haotian Li
Beijing University of Posts and Telecommunications, China
Zhican Chen
Beijing University of Posts and Telecommunications, China
Qinchuan Zhou
Beijing University of Posts and Telecommunications, China
Yuhan Ran
University of Bristol, UK
Xinye Cao
Beijing University of Posts and Telecommunications, China
Sicong Leng
Nanyang Technological University
Multi-modal Learning
Xiaofeng Tao
Beijing University of Posts and Telecommunications
wireless communication
Xudong Jiang
Nanyang Technological University, Singapore