Frequency Switching Mechanism for Parameter-Efficient Multi-Task Learning

📅 2026-03-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge that existing parameter-efficient fine-tuning methods struggle to achieve effective task-specific modeling under shared parameters. To overcome this limitation, the authors propose the Free Sinewich framework, which introduces a novel frequency-switching mechanism that element-wise modulates low-rank adapters via sinusoidal transformations. By integrating convolutional priors to enhance representational capacity and employing a Sine-AWB layer together with a Clock Net frequency controller, the framework generates task-specific weights with near-zero additional trainable parameters. Evaluated on dense prediction tasks, Free Sinewich significantly outperforms current state-of-the-art approaches—surpassing single-task fine-tuning by up to 5.39% while using only 6.53M trainable parameters—demonstrating an exceptional balance between performance and efficiency.

📝 Abstract
Multi-task learning (MTL) aims to enable a single model to solve multiple tasks efficiently; however, current parameter-efficient fine-tuning (PEFT) methods remain largely limited to single-task adaptation. We introduce Free Sinewich, a parameter-efficient multi-task learning framework that enables near-zero-cost weight modulation via frequency switching (Free). Specifically, a Sine-AWB (Sinewich) layer combines low-rank factors and convolutional priors into a single kernel, which is then modulated elementwise by a sinusoidal transformation to produce task-specialized weights. A lightweight Clock Net is introduced to produce bounded frequencies that stabilize this modulation during training. Theoretically, sine modulation enhances the rank of low-rank adapters, while frequency separation decorrelates the weights of different tasks. On dense prediction benchmarks, Free Sinewich achieves state-of-the-art performance-efficiency trade-offs (e.g., up to +5.39% improvement over single-task fine-tuning with only 6.53M trainable parameters), offering a compact and scalable paradigm based on frequency-based parameter sharing. Project page: https://casperliuliuliu.github.io/projects/Free-Sinewich/
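The core idea in the abstract can be sketched in a few lines: a shared low-rank kernel is modulated elementwise by a sine whose frequency is task-specific and bounded. The sketch below is an illustrative reconstruction, not the authors' implementation; the `clock_net` function, the frequency range, and the omission of the convolutional prior are all assumptions made for clarity.

```python
import math
import random

random.seed(0)
d_out, d_in, rank, num_tasks = 4, 4, 2, 3

# Shared low-rank factors (as in LoRA): the kernel K = B @ A, shape (d_out, d_in).
# The paper's Sine-AWB layer also folds in a convolutional prior, omitted here.
A = [[random.gauss(0, 1) for _ in range(d_in)] for _ in range(rank)]
B = [[random.gauss(0, 1) for _ in range(rank)] for _ in range(d_out)]
K = [[sum(B[i][r] * A[r][j] for r in range(rank)) for j in range(d_in)]
     for i in range(d_out)]

def clock_net(task_id):
    # Hypothetical stand-in for the paper's Clock Net: maps a task id to a
    # bounded frequency. Bounded output keeps the modulation stable in training.
    return 1.0 + task_id / num_tasks  # frequencies in [1, 2)

def task_weight(task_id):
    # Frequency switching: elementwise sine modulation of the shared kernel
    # produces a task-specialized weight with no extra trainable parameters
    # beyond the shared factors and the tiny frequency controller.
    f = clock_net(task_id)
    return [[math.sin(f * k) for k in row] for row in K]

W0, W1 = task_weight(0), task_weight(1)
```

Because each task only differs by a scalar frequency, the per-task cost is near zero, while distinct frequencies push the resulting weight matrices apart (the decorrelation argument in the abstract).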
Problem

Research questions and friction points this paper is trying to address.

Multi-task learning
Parameter-efficient fine-tuning
Weight modulation
Task specialization
Frequency switching
Innovation

Methods, ideas, or system contributions that make the work stand out.

Frequency Switching
Parameter-Efficient Multi-Task Learning
Sine Modulation
Low-Rank Adaptation
Weight Modulation