Cascaded Transfer: Learning Many Tasks under Budget Constraints

📅 2026-01-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of efficiently learning a large number of related tasks under resource constraints when task relationships are unknown. The authors propose a minimum spanning tree–based cascaded transfer learning framework that first constructs a rooted tree structure by measuring pairwise task distances and building a minimum spanning tree. Model parameters are then transferred sequentially along tree edges in a hierarchical order, with training budgets dynamically allocated during the process. This approach introduces, for the first time, a tree-guided mechanism for ordered knowledge transfer, enabling collaborative multi-task optimization while respecting resource limitations. Experimental results demonstrate that the proposed framework significantly outperforms existing methods on both synthetic and real-world multi-task datasets, achieving notable improvements in both predictive accuracy and resource efficiency.

📝 Abstract
Many-Task Learning refers to the setting where a large number of related tasks need to be learned, but the exact relationships between the tasks are not known. We introduce Cascaded Transfer Learning, a novel many-task transfer learning paradigm in which information (e.g. model parameters) cascades hierarchically through tasks that are learned by individual models of the same class, while respecting given budget constraints. The cascade is organized as a rooted tree that specifies the order in which tasks are learned and refined. We design a cascaded transfer mechanism deployed over a minimum spanning tree that connects the tasks according to a suitable distance measure, and allocates the available training budget along its branches. Experiments on synthetic and real many-task settings show that the resulting method enables more accurate and cost-effective adaptation across large task collections compared to alternative approaches.
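The pipeline the abstract outlines can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a symmetric pairwise task-distance matrix, a fixed per-task budget vector, and a user-supplied `train` routine; the names `mst_transfer_order` and `cascaded_transfer` are hypothetical, and the paper's actual distance measure and dynamic budget-allocation rule are not reproduced here.

```python
from collections import defaultdict, deque

def mst_transfer_order(dist, root=0):
    """Build a minimum spanning tree over tasks with Prim's algorithm,
    then return a parent map and a breadth-first traversal order from
    the root. `dist[i][j]` is a symmetric task-distance matrix."""
    n = len(dist)
    in_tree = {root}
    parent = {root: None}
    while len(in_tree) < n:
        # Pick the cheapest edge leaving the current tree.
        u, v = min(((i, j) for i in in_tree
                    for j in range(n) if j not in in_tree),
                   key=lambda e: dist[e[0]][e[1]])
        parent[v] = u
        in_tree.add(v)
    children = defaultdict(list)
    for v, u in parent.items():
        if u is not None:
            children[u].append(v)
    order, queue = [], deque([root])
    while queue:
        u = queue.popleft()
        order.append(u)
        queue.extend(children[u])
    return parent, order

def cascaded_transfer(dist, budgets, train, root=0):
    """Train tasks in tree order: each child model is initialised from
    its parent's learned parameters and refined with its own budget."""
    parent, order = mst_transfer_order(dist, root)
    params = {}
    for t in order:
        init = params[parent[t]] if parent[t] is not None else None
        params[t] = train(t, init, budgets[t])
    return params
```

The tree order guarantees that every task (except the root) is initialised from an already-trained neighbour that is close under the chosen distance, which is the core idea behind the cascaded transfer.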
Problem

Research questions and friction points this paper is trying to address.

Many-Task Learning
Budget Constraints
Transfer Learning
Task Relationships
Innovation

Methods, ideas, or system contributions that make the work stand out.

Cascaded Transfer Learning
Many-Task Learning
Budget Constraints
Minimum Spanning Tree
Hierarchical Task Adaptation
Eloi Campagne
Centre Borelli, Université Paris-Saclay, CNRS, École Normale Supérieure Paris-Saclay, France; EDF R&D, Palaiseau, France
Yvenn Amara-Ouali
EDF R&D, Palaiseau, France; Laboratoire de Mathématiques d'Orsay (LMO), Université Paris-Saclay, Faculté des Sciences d'Orsay, France
Yannig Goude
EDF R&D, LMO Université Paris-Saclay
Machine learning
Mathilde Mougeot
Full Professor at ENSIIE & Researcher at Borelli Center, ENS Paris-Saclay
Data science, Machine learning
Argyris Kalogeratos
Senior Research Scientist, Centre Borelli, ENS Paris-Saclay, Université Paris-Saclay
Machine Learning, Artificial Intelligence, Complex Networks