🤖 AI Summary
This work addresses the challenge of efficiently learning a large number of related tasks under resource constraints when the task relationships are unknown. The authors propose a cascaded transfer learning framework that first measures pairwise task distances, builds a minimum spanning tree over the tasks, and roots it to obtain a tree structure. Model parameters are then transferred sequentially along tree edges in hierarchical order, with training budgets allocated dynamically during the process. This approach introduces, for the first time, a tree-guided mechanism for ordered knowledge transfer, enabling collaborative multi-task optimization while respecting resource limitations. Experimental results show that the proposed framework significantly outperforms existing methods on both synthetic and real-world multi-task datasets, with notable improvements in both predictive accuracy and resource efficiency.
📝 Abstract
Many-Task Learning refers to the setting where a large number of related tasks need to be learned, but the exact relationships between the tasks are not known. We introduce Cascaded Transfer Learning, a novel many-task transfer learning paradigm in which information (e.g., model parameters) cascades hierarchically through tasks that are learned by individual models of the same class, while respecting given budget constraints. The cascade is organized as a rooted tree that specifies the order in which tasks are learned and refined. We design a cascaded transfer mechanism deployed over a minimum spanning tree that connects the tasks according to a suitable distance measure, and allocates the available training budget along its branches. Experiments on synthetic and real many-task settings show that the resulting method enables more accurate and cost-effective adaptation across large task collections compared to alternative approaches.
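The tree-construction and ordering steps described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes tasks are given as a pairwise distance matrix, builds a minimum spanning tree with Prim's algorithm, roots it at a chosen task, and returns the breadth-first order in which each task would inherit parameters from its MST parent. The distance measure, the models themselves, and the dynamic budget allocation are all omitted; the function name and interface are hypothetical.

```python
from collections import defaultdict, deque

def cascaded_transfer_order(dist, root=0):
    """Given a symmetric pairwise task-distance matrix, build an MST
    (Prim's algorithm) rooted at `root` and return (parent, order):
    `parent[v]` is the task v inherits parameters from, and `order`
    is a breadth-first cascade order starting at the root.
    Hypothetical helper for illustration only."""
    n = len(dist)
    in_tree = {root}
    parent = {root: None}
    # Prim's algorithm: repeatedly attach the closest out-of-tree task.
    while len(in_tree) < n:
        best = None  # (distance, tree node, new node)
        for u in in_tree:
            for v in range(n):
                if v not in in_tree and (best is None or dist[u][v] < best[0]):
                    best = (dist[u][v], u, v)
        _, u, v = best
        parent[v] = u
        in_tree.add(v)
    # Child adjacency, then BFS from the root gives the cascade order.
    children = defaultdict(list)
    for v, p in parent.items():
        if p is not None:
            children[p].append(v)
    order, queue = [], deque([root])
    while queue:
        u = queue.popleft()
        order.append(u)
        queue.extend(sorted(children[u]))
    return parent, order
```

In a full system, each task in `order` would be trained by fine-tuning a copy of its parent's model under the budget assigned to that branch, so that knowledge flows from the root outward along the tree.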