TensLoRA: Tensor Alternatives for Low-Rank Adaptation

📅 2025-09-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing LoRA methods treat the low-rank updates for each layer and each attention projection (Q/K/V) as independent matrices, neglecting structural correlations across layers and projections. To address this, the authors propose TensLoRA, a systematic tensorized low-rank adaptation framework that aggregates all LoRA updates into a higher-order tensor and uses tensor decomposition to jointly model inter-layer and inter-projection dependencies. This formulation enables mode-specific compression rates, so parameter budgets can be tailored to the modality and task. On vision and language benchmarks, the choice of tensor construction directly impacts performance, in some cases exceeding standard LoRA at similar parameter counts.

📝 Abstract
Low-Rank Adaptation (LoRA) is widely used to efficiently adapt Transformers by adding trainable low-rank matrices to attention projections. While effective, these matrices are treated as independent for each attention projection (Query, Key, and Value) and each layer. Recent extensions have considered joint, tensor-based adaptations, but only in limited forms and without a systematic framework. We introduce TensLoRA, a unified framework that aggregates LoRA updates into higher-order tensors and models a broad family of tensor-based low-rank adaptations. Our formulation generalizes existing tensor-based methods and enables mode-specific compression rates, allowing parameter budgets to be tailored according to the modality and task. Experiments on vision and language benchmarks reveal that the tensor construction directly impacts performance, with some constructions outperforming standard LoRA at similar parameter counts.
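The abstract does not specify TensLoRA's exact construction, but the core idea of aggregating per-layer, per-projection LoRA updates into one higher-order tensor with mode-specific compression can be sketched with a Tucker-style factorization. Everything below (shapes, ranks, the choice of Tucker over other decompositions) is an illustrative assumption, not the paper's actual design.

```python
import numpy as np

# Illustrative assumption: stack all LoRA updates into a 4th-order tensor
# Delta of shape (d_out, d_in, layers, branches) and parameterize it with a
# Tucker-style factorization, one factor matrix per mode plus a small core.
d = 64        # projection dimension (assumed)
L = 12        # number of Transformer layers (assumed)
B = 3         # attention branches: Q, K, V
r_d, r_L, r_B = 8, 4, 2   # mode-specific ranks -> per-mode compression rates

rng = np.random.default_rng(0)
U_out = rng.standard_normal((d, r_d))   # output-dimension mode factor
U_in  = rng.standard_normal((d, r_d))   # input-dimension mode factor
U_lay = rng.standard_normal((L, r_L))   # layer mode factor
U_br  = rng.standard_normal((B, r_B))   # branch (Q/K/V) mode factor
core  = rng.standard_normal((r_d, r_d, r_L, r_B))

# Reconstruct the full update tensor from the factorization.
delta = np.einsum("abcd,ia,jb,kc,ld->ijkl", core, U_out, U_in, U_lay, U_br)

# The LoRA-style update for layer k, branch b is the slice delta[:, :, k, b];
# e.g. the Query update of layer 0:
dW_q0 = delta[:, :, 0, 0]

# Joint factorization vs. L*B independent rank-r_d LoRA pairs (2*d*r_d each):
tucker_params = U_out.size + U_in.size + U_lay.size + U_br.size + core.size
lora_params = L * B * (2 * d * r_d)
print(delta.shape, tucker_params, lora_params)
```

The point of the sketch is the parameter accounting: sharing factors across the layer and branch modes makes the joint parameterization far smaller than independent per-slice LoRA matrices at the same per-slice rank, and the per-mode ranks are exactly the knobs that mode-specific compression would tune.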
Problem

Research questions and friction points this paper is trying to address.

LoRA treats updates for each attention projection (Q/K/V) and each layer as independent matrices
Structural correlations across layers and projections are ignored by standard LoRA
Prior tensor-based extensions exist only in limited forms, without a systematic framework
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified framework that aggregates LoRA updates into higher-order tensors
Generalizes existing tensor-based methods and enables mode-specific compression rates
Shows that the tensor construction directly affects performance at similar parameter counts
Axel Marmoret
IMT Atlantique, Lab-STICC, UMR CNRS 6285, F-29238 Brest, France
Reda Bensaid
IMT Atlantique, Lab-STICC, UMR CNRS 6285, F-29238 Brest, France
Jonathan Lys
IMT Atlantique, Lab-STICC, UMR CNRS 6285, F-29238 Brest, France
Vincent Gripon
IMT Atlantique and Lab-STICC
Deep Learning · Few-Shot Learning · Artificial Intelligence
François Leduc-Primeau
Polytechnique Montréal, Canada