Group-realizable multi-group learning by minimizing empirical risk

📅 2026-01-23
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the high sample complexity of multi-group learning in the agnostic setting, which typically undermines generalization guarantees when the family of groups is infinite. Focusing instead on the group-realizable setting, the paper gives the first analysis showing that empirical risk minimization (ERM) achieves substantially lower sample complexity, even for an infinite group family, provided the family has finite VC dimension. It further shows that standard (proper) ERM is computationally intractable in this setting and proposes an efficient alternative based on improper learning. By establishing tighter upper bounds on sample complexity while maintaining theoretical rigor, the work offers both improved theoretical understanding and a practical algorithmic pathway for multi-group learning.

📝 Abstract
The sample complexity of multi-group learning is shown to improve in the group-realizable setting over the agnostic setting, even when the family of groups is infinite, so long as it has finite VC dimension. The improved sample complexity is obtained by empirical risk minimization over the class of group-realizable concepts, which itself could have infinite VC dimension. Implementing this approach is also shown to be computationally intractable, and an alternative approach is suggested based on improper learning.
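To make the ERM objective concrete, here is a hedged toy sketch of multi-group ERM over a finite hypothesis pool: it selects the hypothesis whose worst group-conditional empirical error is smallest. The function name `multigroup_erm`, the threshold hypotheses, and the groups are all hypothetical illustrations, not the paper's construction; the brute-force search shown here is exactly the kind of proper ERM the paper argues is intractable for rich classes, which motivates its improper-learning alternative.

```python
import numpy as np

def multigroup_erm(hypotheses, groups, X, y):
    """Toy multi-group ERM (illustrative only): return the hypothesis
    minimizing the maximum group-conditional empirical 0-1 error.

    hypotheses: iterable of callables x -> {0, 1}
    groups:     iterable of callables x -> bool (group membership)
    X, y:       examples and binary labels
    """
    best_h, best_worst = None, float("inf")
    for h in hypotheses:
        preds = np.array([h(x) for x in X])
        worst = 0.0
        for g in groups:
            mask = np.array([g(x) for x in X])
            if mask.sum() == 0:  # group not represented in the sample
                continue
            err = np.mean(preds[mask] != y[mask])
            worst = max(worst, err)
        if worst < best_worst:
            best_h, best_worst = h, worst
    return best_h, best_worst

# Hypothetical usage: threshold classifiers on integers 0..9,
# label 1 iff x >= 5, with groups = {all, evens, odds}.
X = list(range(10))
y = np.array([int(x >= 5) for x in X])
hypotheses = [lambda x, t=t: int(x >= t) for t in range(11)]
groups = [lambda x: True, lambda x: x % 2 == 0, lambda x: x % 2 == 1]
h, worst = multigroup_erm(hypotheses, groups, X, y)
# The threshold at 5 fits every group exactly, so worst == 0.0
```

Note the loop over all hypotheses: its cost scales with the size of the class, and for the (possibly infinite-VC) class of group-realizable concepts no efficient enumeration is available, matching the abstract's intractability claim.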
Problem

Research questions and friction points this paper is trying to address.

multi-group learning
group-realizable
sample complexity
VC dimension
empirical risk minimization
Innovation

Methods, ideas, or system contributions that make the work stand out.

group-realizable learning
multi-group learning
sample complexity
empirical risk minimization
improper learning