Bridging Lifelong and Multi-Task Representation Learning via Algorithm and Complexity Measure

📅 2025-11-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses lifelong learning under dynamic task arrival and online data acquisition, focusing on leveraging latent shared structure to accelerate learning of new tasks. The authors propose a general online algorithmic framework that uses multi-task empirical risk minimization as a subroutine and requires no prior knowledge of the full task sequence. A key contribution is the task-eluder dimension, a unified complexity measure that characterizes representation sharing in both lifelong and multi-task learning and enables statistical analysis for general function classes. The paper establishes a sample complexity upper bound that depends explicitly on this dimension, improving over naive baselines that learn each task in isolation, and instantiates the result on classification and regression tasks under noise.

📝 Abstract
In lifelong learning, a learner faces a sequence of tasks with shared structure and aims to identify and leverage it to accelerate learning. We study the setting where such structure is captured by a common representation of data. Unlike multi-task learning or learning-to-learn, where tasks are available upfront to learn the representation, lifelong learning requires the learner to make use of its existing knowledge while continually gathering partial information in an online fashion. In this paper, we consider a generalized framework of lifelong representation learning. We propose a simple algorithm that uses multi-task empirical risk minimization as a subroutine and establish a sample complexity bound based on a new notion we introduce--the task-eluder dimension. Our result applies to a wide range of learning problems involving general function classes. As concrete examples, we instantiate our result on classification and regression tasks under noise.
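The abstract's algorithmic idea, maintaining the data gathered so far and rerunning multi-task empirical risk minimization as a subroutine whenever a new task arrives, can be sketched as follows. This is a minimal illustration under assumptions not stated in the paper: a linear shared representation, linear per-task heads, and squared loss, with the ERM solved by alternating least squares and gradient descent. All function names are hypothetical.

```python
# Sketch (not the paper's algorithm) of lifelong representation learning
# with multi-task ERM as a subroutine. Assumes: linear representation B
# (d -> k), linear per-task heads w_t, squared loss, noiseless toy data.
import numpy as np

def multitask_erm(datasets, k, iters=300, lr=0.02):
    """Approximate multi-task ERM over all tasks seen so far.
    Alternates exact least-squares head fits with gradient steps on the
    shared representation B. Returns B (d x k) and one head per task."""
    d = datasets[0][0].shape[1]
    rng = np.random.default_rng(0)
    B = rng.standard_normal((d, k)) / np.sqrt(d)
    heads = [np.zeros(k) for _ in datasets]
    for _ in range(iters):
        # Refit each task head optimally on the current representation.
        for t, (X, y) in enumerate(datasets):
            heads[t] = np.linalg.lstsq(X @ B, y, rcond=None)[0]
        # One gradient step on B for the summed empirical risk.
        grad = np.zeros_like(B)
        n_total = 0
        for t, (X, y) in enumerate(datasets):
            resid = X @ B @ heads[t] - y        # prediction residuals
            grad += X.T @ np.outer(resid, heads[t])
            n_total += len(y)
        B -= lr * grad / n_total
    return B, heads

def lifelong_learner(task_stream, k):
    """Online loop: on each task arrival, rerun multi-task ERM on all
    data gathered so far, reusing the refreshed shared representation."""
    seen = []
    for X, y in task_stream:
        seen.append((X, y))
        B, heads = multitask_erm(seen, k)
        yield B, heads[-1]
```

The point of the sketch is the control flow, not the optimizer: the learner never needs the task sequence upfront, and each new task is fit through the representation estimated from all previously gathered data.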
Problem

Research questions and friction points this paper is trying to address.

Bridging lifelong and multi-task learning through a shared data representation
Designing an online algorithm that leverages existing knowledge without access to the full task sequence
Establishing sample complexity bounds via the task-eluder dimension
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses multi-task empirical risk minimization as a subroutine for lifelong learning
Introduces the task-eluder dimension as a unified complexity measure
Instantiates the sample complexity bound for classification and regression under noise
🔎 Similar Papers