Minimax optimal adaptive structured transfer learning through semi-parametric domain-varying coefficient model

๐Ÿ“… 2026-02-20
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses the challenge of negative transfer in multi-source, single-target transfer learning when the conditional relationship between covariates and response varies across domains. To tackle this, the authors propose a semi-parametric Domain-Varying Coefficient Model (DVCM), which introduces varying-coefficient modeling into structured transfer learning for the first time. By incorporating domain indicators to capture heterogeneity in the underlying mechanisms, DVCM enables a continuous spectrum of modeling, from fully invariant to completely heterogeneous relationships. The resulting adaptive weighted estimator integrates beneficial source information while avoiding negative transfer, achieving minimax-optimal convergence rates. Moreover, it supports asymptotic inference and uncertainty quantification. Both theoretical analysis and empirical experiments demonstrate the superiority of the proposed approach.

๐Ÿ“ Abstract
Transfer learning aims to improve inference in a target domain by leveraging information from related source domains, but its effectiveness critically depends on how cross-domain heterogeneity is modeled and controlled. When the conditional mechanism linking covariates and responses varies across domains, indiscriminate information pooling can lead to negative transfer, degrading performance relative to target-only estimation. We study a multi-source, single-target transfer learning problem under conditional distributional drift and propose a semiparametric domain-varying coefficient model (DVCM), in which domain-relatedness is encoded through an observable domain identifier. This framework generalizes classical varying-coefficient models to structured transfer learning and interpolates between invariant and fully heterogeneous regimes. Building on this model, we develop an adaptive transfer learning estimator that selectively borrows strength from informative source domains while provably safeguarding against negative transfer. Our estimator is computationally efficient and easy to implement; we also show that it is minimax rate-optimal and derive its asymptotic distribution, enabling valid uncertainty quantification and hypothesis testing despite data-adaptive pooling and shrinkage. Our results precisely characterize the interplay among domain heterogeneity, the smoothness of the underlying mean function, and the number of source domains and are corroborated by comprehensive numerical experiments and two real-data applications.
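The abstract's central idea, selectively borrowing strength from source domains whose conditional mechanism resembles the target's, can be illustrated with a toy sketch. This is not the paper's DVCM estimator: the Gaussian similarity-kernel weighting, the simulated slopes, and all variable names below are illustrative assumptions, shown only to make the notion of down-weighting discordant sources concrete.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: each domain d follows y = beta[d] * x + noise.
# Sources s1, s2 are informative (slopes near the target's); s3 is adversarial.
betas = {"target": 1.0, "s1": 1.05, "s2": 0.95, "s3": 3.0}
data = {}
for name, b in betas.items():
    n = 50 if name == "target" else 200  # small target sample, larger sources
    x = rng.normal(size=n)
    data[name] = (x, b * x + 0.5 * rng.normal(size=n))

def ols_slope(x, y):
    """No-intercept least-squares slope."""
    return float(np.dot(x, y) / np.dot(x, x))

# Target-only estimate and a rough standard error (noise sd assumed known = 0.5).
xt, yt = data["target"]
bt = ols_slope(xt, yt)
se = 0.5 / np.sqrt(np.dot(xt, xt))

# Adaptive weights: a source whose slope estimate sits many target standard
# errors away from the target's estimate receives a near-zero weight.
weights, estimates = [1.0], [bt]
for name in ("s1", "s2", "s3"):
    bs = ols_slope(*data[name])
    w = np.exp(-(((bs - bt) / se) ** 2))  # assumed Gaussian similarity kernel
    weights.append(w)
    estimates.append(bs)

b_adaptive = float(np.average(estimates, weights=weights))
```

Under this toy setup, the adversarial source `s3` is effectively excluded, so the weighted estimate stays close to the true target slope while a naive pooled average would be pulled toward 3.0; this mirrors, in miniature, the "safeguard against negative transfer" the abstract describes.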
Problem

Research questions and friction points this paper is trying to address.

transfer learning
domain heterogeneity
conditional distributional drift
negative transfer
multi-source adaptation
Innovation

Methods, ideas, or system contributions that make the work stand out.

transfer learning
domain-varying coefficient model
negative transfer
minimax optimality
semi-parametric modeling
๐Ÿ”Ž Similar Papers
No similar papers found.
H
Hanxiao Chen
Department of Mathematics and Statistics, Boston University
Debarghya Mukherjee
Assistant Professor, Boston University
Theoretical Statistics and Machine Learning